Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


Exile in Dinoville


Earlier this month, a writer named Nick White released Sweet & Low, his debut collection of short fiction. Most of the stories are set in the present day, but one of them, “Break,” includes a paragraph that evokes the early nineties so vividly that I feel obliged to transcribe it here:

For the next few weeks, the three of us spent much of our free time together. We would ride around town listening to Regan’s CDs—she forbid us to play country music in her presence—and we usually ended the night with Forney and me sitting on the hood of his car watching her dance to Liz Phair’s “Never Said”: “All I know is that I’m clean as a whistle, baby,” she sang to us, her voice husky. We went to a lot of movies, and most of the time, I sat between them in a dark theater, our breathing taking the same pattern after a while. We saw Jurassic Park twice at the dollar theater, and I can still remember Forney’s astonishment when the computer-generated brachiosaur filled up the giant screen. “Amazing,” he whispered. “Just amazing.”

And while it may seem like the obvious move to conjure up a period by referring to the popular culture of the time, the juxtaposition of Liz Phair and a brachiosaur sets off its own chain of associations, at least in my own head. Jurassic Park was released on June 11, 1993, and Exile in Guyville came out just eleven days later, and for many young Americans, in the back half of that year, they might as well have been playing simultaneously.

At first, they might not seem to have much to do with each other, apart from their chronological proximity—which can be meaningful in itself. Once enough time has passed, two works of art released back to back can start to seem like siblings, close in age, from the same family. In certain important ways, they’ll have more in common with each other than they ever will with anyone else, and the passage of more than two decades can level even blatant differences in surprising ways. Jurassic Park was a major event long before its release, a big movie from the most successful director of his generation, based on a novel that had already altered the culture. What still feels most vivid about Exile in Guyville, by contrast, is the sense that it was recorded on cassette in total solitude, and that Phair had willed it into existence out of nothing. She was just twenty-six years old, or about the same age as Spielberg when he directed Duel, and the way that their potential was perceived and channeled along divergent lines is illuminating in itself. But now that both the album and the movie feel like our common property, it’s easy to see that both were set apart by a degree of technical facility that was obscured by their extremes of scale. Jurassic Park was so huge that it was hard to appreciate how expertly crafted it was in its details, while Phair’s apparent rawness and the unfinished quality of her tracks distracted from the fact that she was writing pop songs so memorable that I still know all the lyrics after a quarter of a century.

Both also feel like artifacts of a culture that is still coming to terms with its feelings about sex—one by placing it front and center, the other by pushing it so far into the background that a significant plot twist hinges on dinosaurs secretly having babies. But their most meaningful similarity may be that they were followed by a string of what are regarded as underwhelming sequels, although neither one made it easy on its successors. In the case of Jurassic Park, it can be hard to remember the creative breakthrough that it represented. Before its release, I studied the advance images in Entertainment Weekly and reminded myself that the effects couldn’t be that good. When they turned out to be better than I could have imagined, my reaction was much the same as Forney’s in Nick White’s story: “Amazing. Just amazing.” When the movie became a franchise, however, something was lost, including the sense that it was possible for the technology of storytelling to take us by surprise ever again. It wasn’t a story any longer, but a brand. A recent profile by Tyler Coates in Esquire captures much the same moment in Phair’s life:

Looking back, at least for Phair, means recognizing a young woman before she earned indie rock notoriety. “I think what’s most evocative is that lack of self-consciousness,” she said. “It’s the first and last time that I have on record before I had a public awareness of what I represented to other people. There’s me, and then there’s Liz Phair.”

And her subsequent career testifies to the impossible position in which she found herself. A review on All Music describes her sophomore effort, Whip-Smart, as “good enough to retain her critical stature, not good enough to enhance it,” which in itself captures something of the inhuman expectations that critics collectively impose on the artists they claim to love. (The same review observes that “a full five years” separated Exile in Guyville from Whitechocolatespaceegg, as if that were an eternity, even though this seems like a perfectly reasonable span in which to release three ambitious albums.) After two decades, it seems impossible to see Whip-Smart as anything but a really good album that was doomed to be undervalued, and it was about to get even worse. I saw Phair perform live just once, and it wasn’t in the best of surroundings—it was at Field Day in 2003, an event that had been changed at the last minute from an outdoor music festival to a series of opening acts for Radiohead at Giants Stadium. Alone on a huge stage with a guitar, her face projected on a JumboTron, Phair seemed lost, but as game as usual. The tenth anniversary of Exile in Guyville was just around the corner, and a few weeks later, Phair released a self-titled album that was excoriated almost everywhere. Pitchfork gave it zero stars, and it was perceived as a blatant bid for commercial success that called all of her previous work into question. Fifteen years later, it’s very hard to care, and time has done exactly what Phair’s critics never managed to pull off. It confirmed what we should have known all along. Phair broke free, expanded to new territories and crashed through barriers, painfully, maybe even dangerously, and, well, there it is. She found a way.

Written by nevalalee

June 21, 2018 at 8:41 am

The dawn of man


Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

Almost from the moment that critics began to write about 2001, it became fashionable to observe that the best performance in the movie was by an actor playing a computer. In his review in Analog, for example, P. Schuyler Miller wrote:

The actors, except for the gentle voice of HAL, are thoroughly wooden and uninteresting, and I can’t help wondering whether this isn’t Kubrick’s subtle way of suggesting that the computer is really more “human” than they and fully justified in trying to get rid of them before they louse up an important mission. Someday we may know whether the theme of this part is a Clarke or a Kubrick contribution. I suspect it was the latter…perhaps just because Stanley Kubrick is said to like gadgets.

This criticism is often used to denigrate the other performances or the film’s supposed lack of humanity, but I prefer to take it as a tribute to the work of actor Douglas Rain, Kubrick and Clarke’s script, and the brilliant design of HAL himself. The fact that a computer is the character we remember best isn’t a flaw in the movie, but a testament to its skill and imagination. And as I’ve noted elsewhere, the acting is excellent—it’s just so understated and naturalistic that it seems vaguely incongruous in such spectacular settings. (Compare it to the performances in Destination Moon, for instance, and you see how good Keir Dullea and William Sylvester really are here.)

But I also think that the best performance in 2001 isn’t by Douglas Rain at all, but by Vivian Kubrick, in her short appearance on the phone as Heywood Floyd’s daughter. It’s a curious scene that breaks many of the rules of good storytelling—it doesn’t lead anywhere, it’s evidently designed to do nothing but show off a piece of hardware, and it peters out even as we watch it. The funniest line in the movie may be Floyd’s important message:

Listen, sweetheart, I want you to tell mommy something for me. Will you remember? Well, tell mommy that I telephoned. Okay? And that I’ll try to telephone tomorrow. Now will you tell her that?

But that’s oddly true to life as well. And when I watch the scene today, with a five-year-old daughter of my own, it seems to me that there’s no more realistic little girl in all of movies. (Kubrick shot the scene himself, asking the questions from offscreen, and there’s a revealing moment when the camera rises to stay with Vivian as she stands. This is sometimes singled out as a goof, although there’s no reason why a sufficiently sophisticated video phone wouldn’t be able to track her automatically.) It’s a scene that few other films would have even thought to include, and now that video chat is something that we all take for granted, we can see through the screen to the touchingly sweet girl on the other side. On some level, Kubrick simply wanted his daughter to be in the movie, and you can’t blame him.

At the time, 2001 was criticized as a soulless hunk of technology, but now it seems deeply human, at least compared to many of its imitators. Yesterday in the New York Times, Bruce Handy shared a story from Keir Dullea, who explained why he breaks the glass in the hotel room at the end, just before he comes face to face with himself as an old man:

Originally, Stanley’s concept for the scene was that I’d just be eating and hear something and get up. But I said, “Stanley, let me find some slightly different way that’s kind of an action where I’m reaching—let me knock the glass off, and then in mid-gesture, when I’m bending over to pick it up, let me hear the breathing from that bent-over position.” That’s all. And he says, “Oh, fine. That sounds good.” I just wanted to find a different way to play the scene than blankly hearing something. I just thought it was more interesting.

I love this anecdote, not just because it’s an example of an evocative moment that arose from an actor’s pragmatic considerations, but because it feels like an emblem of the production of the movie as a whole. 2001 remains the most technically ambitious movie of all time, but it was also a project in which countless issues were being figured out on the fly. Every solution was a response to a specific problem, and it covered a dizzying range of challenges—from the makeup for the apes to the air hostess walking upside down—that might have come from different movies entirely.

2001, in short, was made by hand—and it’s revealing that many viewers assume that computers had to be involved, when they didn’t figure in the process at all. (All of the “digital” readouts on the spacecraft, for instance, were individually animated, shot on separate reels of film, and projected onto those tiny screens on set, which staggers me even to think about it. And even after all these years, I still can’t get my head around the techniques behind the Star Gate sequence.) It reminds me, in fact, of another movie that happens to be celebrating an anniversary this year. As a recent video essay pointed out, if the visual effects in Jurassic Park have held up so well, it’s because most of them aren’t digital at all. The majority consist of a combination of practical effects, stop motion, animatronics, raptor costumes, and a healthy amount of misdirection, with computers used only when absolutely necessary. Each solution is targeted at the specific problems presented by a snippet of film that might last just for a few seconds, and it moves so freely from one trick to another that we rarely have a chance to see through it. It’s here, not in A.I., that Spielberg got closest to Kubrick, and it hints at something important about the movies that push the technical aspects of the medium. They’re often criticized for an absence of humanity, but in retrospect, they seem achingly human, if only because of the total engagement and attention that was required for every frame. Most of their successors lack the same imaginative intensity, which is a greater culprit than the use of digital tools themselves. Today, computers are used to create effects that are perfect, but immediately forgettable. And one of the wonderful ironies of 2001 is that it used nothing but practical effects to create a computer that no viewer can ever forget.

My life as a paleontologist


Concept art for Jurassic World

As I’ve mentioned here before, I’ve only ever wanted two jobs in my life: paleontologist and novelist. And the fact that I gave up the former goal around the time I turned ten years old doesn’t mean that I don’t look back on it with nostalgia. Reading the paleontologist Stephen Brusatte’s affectionate piece in The Conversation on the appeal of Jurassic World, I felt an odd twinge of regret for a life never led. Brusatte is actually a bit younger than I am—he was nine when Jurassic Park came out, while I was thirteen—and his article is a reminder that the world is still turning out freshly minted paleontologists, most of whom are distinguished by the fact that they held onto that initial spark of curiosity after the rest of us moved on. Jurassic Park, both as a book and as a movie, was responsible for countless careers in the field, just as Star Trek was for the hard sciences and Indiana Jones was for archaeology, but such works can more accurately be seen as igniting something that was already there, or providing an avenue for a certain kind of personality. Everyone knows how it feels to be inspired by a book or movie to dream of an exotic career; the difference between real paleontologists and the rest of us is that the urge never faded. If there’s one thing you know when you meet a novelist or a paleontologist, it’s that you’re looking at the systematic working out in adulthood of a childhood dream.

Yet the two fields also have a surprising amount in common. In the beginning, both are fundamentally choices about what to spend your time thinking about: when you’re in grade school, you can’t think of anything better to occupy your time than dinosaurs. Later, as your understanding of the subject expands, it becomes slightly more subtle, if no less vast. At its heart, paleontology is the methodical reconstruction of facts that used to be obvious. Few things would have felt less equivocal at the time than a living triceratops—Jack Horner has called them “the cows of the Cretaceous”—but understanding and rediscovering such animals now requires the assemblage of countless tiny, almost invisible details, both in the world and in the mind. Fiction, in turn, is the creation of the obvious, or inevitable, from the small and easily missed. The poet Vladimir Mayakovsky once compared the act of writing to mining for radium: “The output an ounce, the labor a year.” Both fiction and paleontology require that we sift through a huge amount of material in search of a few useful fragments. The difference is that the writer generates his own dirt and then sorts through it. But in both cases, the trick lies in identifying a promising tract of ground in the first place.

Concept art for Jurassic World

Both are also about developing a way of seeing. The evolutionary anthropologist Elwyn Simons has compared the hunt for fossils in the jumbled rock of the Egyptian desert to the ability to find a single rare word in a mass of text, and both fields depend on refining the observer’s eye. Even more important, perhaps, is the ability to see facts in their larger context, while valuing the significance that each detail carries in itself. One of the first things any writer learns is how crucial a glance, a gesture, or a single image can be: each element deserves as loving a consideration as we can give it. But it also needs to be subordinated to the overall effect. This kind of double vision, in which a stone or bone fragment is granted intense meaning in itself while occupying a place in the larger pattern, is central to all of science. Both fields are characterized by a constant oscillation between the concrete and the abstract, with the most ingenious theoretical constructs grounded in an engagement with the tangible and particular. Every insight is built on backbreaking labor, and the process itself becomes part of the point. Genuine discoveries are infrequent, so in the meantime, you have the field and the lab, and the workers who survive are the ones who come to love the search for its own sake.

So I’d like to think that if I’d become a paleontologist instead of a writer, my inner life would be more or less the same, even if its externals were very different. (If nothing else, I’d have gone outdoors occasionally.) But even then, I suspect that I’d spend about the same amount of time in my own head. As Stephen Jay Gould writes:

No geologist worth anything is permanently bound to a desk or laboratory, but the charming notion that true science can only be based on unbiased observation of nature in the raw is mythology. Creative work, in geology and anywhere else, is interaction and synthesis: half-baked ideas from a barroom, rocks in the field, chains of thought from lonely walks, numbers squeezed from rocks in a laboratory, numbers from a calculator riveted to a desk, fancy equipment usually malfunctioning on expensive ships, cheap equipment in the human cranium, arguments before a roadcut.

Which all circles back to the point with which I started: that life is ultimately a choice about what to think about. Last year, I finally realized my destiny by writing about dinosaurs for the first time, in my short story “Cryptids.” Even if it isn’t my strongest work—and I still think the ending could be better—it felt like a homecoming of sorts. I got to think about dinosaurs again. And I don’t know what more I could ever want.

Written by nevalalee

June 17, 2015 at 9:50 am

The Ian Malcolm rule


Jeff Goldblum in Jurassic Park

A man is rich in proportion to the number of things he can afford to leave alone.

—Henry David Thoreau, Walden

Last week, at the inaugural town hall meeting at Facebook headquarters, one brave questioner managed to cut through the noise and press Mark Zuckerberg on the one issue that really matters: what’s the deal with that gray shirt he always wears? Zuckerberg replied:

I really want to clear my life to make it so I have to make as few decisions as possible about anything except best how to serve this community…I’m in this really lucky position where I get to wake up every day and help serve more than a billion people. And I feel like I’m not doing my job if I spend any of my energy on things that are silly or frivolous about my life…So even though it kind of sounds silly—that that’s my reason for wearing a gray t-shirt every day—it also is true.

There’s a surprising amount to unpack here, starting with the fact, as Allison P. Davis of New York Magazine points out, that it’s considerably easier for a young white male to always wear the same clothes than for a woman in the same situation. It’s also worth noting that wearing the exact same shirt each day turns simplicity into a kind of ostentation: there are ways of minimizing the amount of time you spend thinking about your wardrobe without calling attention to it so insistently.

Of course, Zuckerberg is only the latest in a long line of high-achieving nerds who insist, rightly or wrongly, that they have more important things to think about than what they’re going to wear. There’s more than an echo here of the dozens of black Issey Miyake turtlenecks that were stacked in Steve Jobs’s closet, and in the article linked above, Vanessa Friedman of The New York Times also notes that Zuckerberg sounds a little like Obama, who told Michael Lewis in Vanity Fair: “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.” Even Christopher Nolan gets into the act, as we learn in the recent New York Times Magazine profile by Gideon Lewis-Kraus:

Nolan’s own look accords with his strict regimen of optimal resource allocation and flexibility: He long ago decided it was a waste of energy to choose anew what to wear each day, and the clubbable but muted uniform on which he settled splits the difference between the demands of an executive suite and a tundra. The ensemble is smart with a hint of frowzy, a dark, narrow-lapeled jacket over a blue dress shirt with a lightly fraying collar, plus durable black trousers over scuffed, sensible shoes.

Mark Zuckerberg

If you were to draw a family tree between all these monochromatic Vulcans, you’d find that, consciously or not, they’re all echoing their common patron saint, Ian Malcolm in Jurassic Park, who says:

In any case, I wear only two colors, black and gray…These colors are appropriate for any occasion…and they go well together, should I mistakenly put on a pair of gray socks with my black trousers…I find it liberating. I believe my life has value, and I don’t want to waste it thinking about clothing.

As Malcolm speaks, Crichton writes, “Ellie was staring at him, her mouth open”—apparently stunned into silence, as all women would be, at this display of superhuman rationality. And while it’s easy to make fun of it, I’m basically one of those guys. I eat the same breakfast and lunch every day; my daily uniform of polo shirt, jeans, and New Balance sneakers rarely, if ever, changes; and I’ve had the same haircut for the last eighteen years. If pressed, I’d probably offer a rationale more or less identical to the ones given above. As a writer, I’m called upon to solve a series of agonizingly specific problems each time I sit down at my desk, so the less headspace I devote to everything else, the better.

Which is all well and good. But it’s also easy to confuse the externals with their underlying intention. The world, or at least the Bay Area, is full of young guys with the Zuckerberg look, but it doesn’t matter how little time you spend getting dressed if you aren’t mindfully reallocating the time you save, or extending the principle beyond the closet. The most eloquent defense of minimizing extraneous thinking was mounted by the philosopher Alfred North Whitehead, who writes:

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

Whitehead isn’t talking about his shirts here; he’s talking about the Arabic number system, a form of “good notation” that frees the mind to think about more complicated problems. Which only reminds us that the shirts you wear won’t make you more effective if you aren’t being equally thoughtful about the decisions that really count. Otherwise, they’re only an excuse for laziness or indifference, which is just as contagious as efficiency. And it often comes to us as a wolf in nerd’s clothing.

The holy grail of props


Grail diary from Indiana Jones and the Last Crusade

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “What movie prop would you love to own?”

Twenty years ago, when I first saw Jurassic Park, the moment that stuck with me the most wasn’t the raptor attack or even Jeff Goldblum’s creepy laugh: it was the park brochure that appears briefly onscreen before Laura Dern tramples it into the mud. We see it for little more than a second, but the brevity of its role is exactly what struck me. A prop artist—or, more likely, a whole team of them—had painstakingly written, typeset, and printed a tangible piece of ephemera for the sake of that fleeting gag. In a way, it seemed to stand in for the unseen efforts that lie behind every frame of film, those invisible touches of craft and meticulous labor that add up to make the story a little more real. Looking back, I recognize how showy that shot really is: it wasn’t captured by accident, even if it’s staged like a throwaway, and it calls attention to itself to a degree that most good props probably shouldn’t. And my reaction makes me feel uncomfortably like the hypothetical moviegoers that Pauline Kael imagined being impressed by Doctor Zhivago: “The same sort of people who are delighted when a stage set has running water or a painted horse looks real enough to ride.”

But it’s still delightful. I’ve always been fascinated by movie props, perhaps because they feel like the purest expression of the glorious waste of filmmaking: an object is lovingly crafted and aged by hand simply to be photographed, or to sit out of focus in the background of a single shot. My appreciation of the Lord of the Rings trilogy went up another notch after I watched hours of production featurettes last winter, many of which focused on the prop department. I learned, for instance, that the artisans who made the hundreds of sets of chain mail wore down their own fingerprints in the process, and that Theoden’s armor included a golden sun stamped on the inside of the breastplate, where no one but Bernard Hill would ever see it. Each touch is imperceptible, but in the aggregate, they add up to a vision of a world that remains totally convincing: even if we quibble over Peter Jackson’s narrative choices, it’s impossible not to be impressed by his determination to build up so much detail before an audience even existed to see it—if they ever noticed it at all. Props are designed to serve the story, not to dominate it, and I’d be inclined to call it a thankless task if I weren’t so profoundly grateful for the result.

Brochure from Jurassic Park

Maybe because I’m an author, I’ve always been especially taken by props that involve written text, whether they’re John Doe’s notebooks from Seven or the obsessively detailed newspapers of the future that we glimpse in Children of Men. I think I find such props so fascinating because they feel like a reversal of the way words and filmed images naturally relate: if a screenplay serves as the engine or blueprint of the movie as a whole, these words exist only for their visual properties, which can only be convincing if someone has taken the time to treat them as if they were meant to be read in their own right. When a movie falls short here, it can pull you out of the story even more drastically than most comparable mistakes: my favorite example is from The Godfather Part III, which prominently displays a headline from The Wall Street Journal with text that seems to have been copied and pasted from a computer instruction manual. (These days, movies seem aware of how much every shot is likely to be scrutinized, so they’re more likely to take the time to write something up for the sake of viewers and their pause buttons, like Captain America’s to-do list.)

As far as I’m concerned, the greatest prop of them all has to be the grail diary in Indiana Jones and the Last Crusade. We see it clearly for maybe a total of thirty seconds, but those few glimpses were enough to fuel a lifetime’s worth of daydreams: I sometimes think I owe half of my inner life to Henry Jones’s battered little notebook. As it happens, you can read the whole thing online, or some simulacrum of it, thanks to the efforts of such prop replica masters as Indy Magnoli, whose work goes on eBay for nine hundred dollars or more—and I can’t say that I wasn’t tempted, years ago, to pick up one for myself. Recently, the original prop went up for auction at Christie’s, and while I’d love to be able to tell you that I was the one who shelled out $30,000 for it, sadly, it wasn’t me. Still, I’m probably better off. Up close, a prop rarely has the same magic that it had in the scant seconds you saw it onscreen; an object that seemed unbearably precious can turn out to be made of pasteboard and hot glue. If we believed in it for the brief interval of time in which it appeared on camera, it succeeded. Which is true of everything about the movies. And if we dreamed about it afterward, well, then it belongs to us all the more.

Learning from the masters: Jim Henson


Many careers in movies have been cut short too soon, but the death of Jim Henson sometimes feels like the greatest loss of all. It’s especially tragic because Henson died in 1990, just as advances in digital effects—in The Abyss, in Terminator 2, and above all in Jurassic Park—were threatening to make his life’s work seem obsolete, when in fact he was more urgently needed than ever. Despite the occasional gesture in the direction of practical effects by the likes of Guillermo Del Toro, Henson still feels like the last of the great handmade magicians. As David Thomson points out:

Jim Henson’s early death was all the harder to take in that he worked with the odd, the personal, the wild, and the homemade, and flourished in the last age before the computer. It’s therefore very important that Henson was not just the entrepreneur and the visionary, but often the hand in the glove, the voice, and the tall man bent double, putting on a show.

As you can tell from the cake topper at my wedding, I’ve always been a Henson fan (although I wouldn’t go so far as to say that I appreciate the Muppets on a much deeper level than you), but his achievement was recently underlined for me by the museum exhibition Jim Henson’s Fantastic World, which I’ve seen twice. The first time was at the Smithsonian in the fall of 2008. It was a stressful time for me—I’d just parted ways with my first agent, had to scrap an entire novel, and was working on a second without a lot to show for it—but the Henson exhibition was a vivid reminder of why I’d taken these risks in the first place. Seeing it again at Chicago’s Museum of Science and Industry a few months ago, when I was in a much better place professionally, only served to reassure me that I’m still on the right track.

Aside from Henson’s commitment to character and storytelling, which I already knew, I was left with two big takeaways from the exhibition. The first was the breadth of Henson’s talent and experience. He wasn’t just a puppeteer, but a gifted graphic artist, animator, cartoonist, experimental filmmaker, and jack of all arts and crafts, which is exactly what a good puppeteer needs to be. Looking at his sketches, drawings, and scripts leaves you stunned by his curiosity and enthusiasm regarding every element of the creative process. Long before his death, he was already exploring computer animation, and if he had lived, it’s likely that he would have brought about the fusion of CGI with practical effects promised by Jurassic Park and sadly neglected ever since.

The second remarkable thing about Henson was his perseverance. It’s startling to realize that by the time The Muppet Show premiered in 1976, Henson had already been working hard as a puppeteer for more than twenty years. Even the ephemera of his early career—like the series of short commercials he did for Wilkins Coffee, or his turn as the La Choy Dragon—have incredible humor and charm. And it was that extended apprenticeship, the years of dedication to building characters and figuring out how to make them live, that made Sesame Street possible when the time came. Jim Henson did what few artists in any medium have ever done: he willed an entire art form into existence, or at least into the mainstream. And of his example, as David Thomson concludes, “we are in urgent need of young artists taking it up all over the world.”

Sherrinford Holmes and the trouble with names

with 8 comments

So work on my second novel is coming along pretty well. Research is winding down; location work is finished. I’ve got a fairly good outline for Part I, a sense of the personalities and backgrounds of a dozen important—though still nameless—characters, and…

Hold on. I have a dozen important characters, but aside from a few holdovers from my first book, I haven’t named them yet. And I need to come up with some names soon. I have just over two weeks before I start writing, but even in the meantime, there’s only so much work I can do with signifiers like “best friend” and “ruthless assassin.” (Note: not the same person.) Characters need names before they can really come to life. And it’s often this step, even before the real imaginative work begins, that feels the most frustrating, if only because it seems so important.

Naming characters is so fundamental a part of the writing process that I’m surprised it hasn’t been discussed more often. John Gardner speaks briefly about it to The Paris Review:

Sometimes I use characters from real life, and sometimes I use their real names—when I do, it’s always in celebration of people that I like. Once or twice, as in October Light, I’ve borrowed other people’s fictional characters. Naming is only a problem, of course, when you make the character up. It seems to me that every character—every person—is an embodiment of a very complicated, philosophical way of looking at the world, whether conscious or not. Names can be strong clues to the character’s system. Names are magic. If you name a kid John, he’ll grow up a different kid than if you named him Rudolph.

I can’t speak to the experience of other writers, but for me, coming up with names for characters becomes more of a nightmare with every story. Unless you’re Thomas Pynchon, who can get away with names like Osbie Feel and Tyrone Slothrop, names need to be distinctive, but not so unusual that they distract the reader; evocative, but natural; easily differentiated from one another; not already possessed by a celebrity or more famous fictional character; and fairly invisible in their origins. (I still haven’t forgiven Michael Crichton for the “Lewis Dodgson” of Jurassic Park.) As a result, it takes me the better part of a day to come up with even ten passable names. And it isn’t going to get any easier: the more stories I write, the more names I use, which means that the pool of possibilities is growing ever smaller.

So what do I do? Whatever works. Sometimes a character will have a particular ethnic or national background, like the seemingly endless parade of Russians in Kamera and its sequel, which provides one possible starting point. (Wikipedia’s lists are very useful, especially now that I no longer have a phone book.) I’ll consult baby name sites, scan my bookshelves, and occasionally name characters after friends or people I admire. And the names are always nudging and jostling one another: I try to avoid giving important characters names that sound similar or begin with the same first letter, for example, which means that a single alteration may require numerous other adjustments.

Is it worth it? Yes and no. It certainly isn’t for the sake of the reader, who isn’t supposed to notice any of this—the best character names, I’m convinced, are invisible. And with few exceptions, I’d guess that even the names that feel inevitable now were, in fact, no better or worse than many alternatives: if Conan Doyle had gone with his first inclination, it’s quite possible that we’d all be fans of Ormond Sacker and Sherrinford Holmes. But for the writer, it’s an excuse to brood and meditate on the essence of each character, even if the result barely attracts the reader’s attention. So I feel well within my rights to overthink it. (Although I’m a little worried about what might happen if I ever have to name a baby.)

Written by nevalalee

February 21, 2011 at 9:13 am
