Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Michael Crichton’

Into the West

A few months ago, I was on the phone with a trusted adviser to discuss some revisions to Astounding. We were focusing on the prologue, which I had recently rewritten from scratch to make it more accessible to readers who weren’t already fans of science fiction. Among other things, I’d been asked to come up with ways in which the impact of my book’s four subjects was visible in modern pop culture, and after throwing some ideas back and forth, my adviser asked me plaintively: “Couldn’t you just say that without John W. Campbell, we wouldn’t have Game of Thrones?” I was tempted to give in, but I ultimately didn’t—it just felt like too much of a stretch. (Which isn’t to say that the influence isn’t there. When a commenter on his blog asked whether his work had been inspired by the mythographer Joseph Campbell, George R.R. Martin replied: “The Campbell that influenced me was John W., not Joseph.” And that offhand comment was enough of a selling point that I put it in the very first sentence of my book proposal.) Still, I understood the need to frame the story in ways that would resonate with a mainstream readership, and I thought hard about what other reference points I could honestly provide. Star Trek was an easy one, along with such recent movies as Interstellar and The Martian, but the uncomfortable reality is that much of what we call science fiction in film and television has more to do with Star Wars. But I wanted to squeeze in one last example, and I finally settled on this line about Campbell: “For more than three decades, an unparalleled series of visions of the future passed through his tiny office in New York, where he inaugurated the main sequence of science fiction that runs through works from 2001 to Westworld.”

As the book is being set in type, I’m still comfortable with this sentence as it stands, although there are a few obvious qualifications that ought to be made. Westworld, of course, is based on a movie written and directed by Michael Crichton, whose position in the history of the genre is a curious one. As I’ve written elsewhere, Crichton was an unusually enterprising author of paperback thrillers who found himself with an unexpected blockbuster in the form of The Andromeda Strain. It was his sixth novel, and his first in hardcover, and it seems to have benefited enormously from the input of editor Robert Gottlieb, who wrote in his memoir Avid Reader:

The Andromeda Strain was a terrific concept, but it was a mess—sloppily plotted, underwritten, and worst of all, with no characterization whatsoever. [Crichton’s] scientists were beyond generic—they lacked all human specificity; the only thing that distinguished some of them from the others was that some died and some didn’t. I realized right away that with his quick mind, swift embrace of editorial input, and extraordinary work habits he could patch the plot, sharpen the suspense, clarify the science—in fact, do everything necessary except create convincing human beings. (He never did manage to; eventually I concluded that he couldn’t write about people because they just didn’t interest him.) It occurred to me that instead of trying to help him strengthen the human element, we could make a virtue of necessity by stripping it away entirely; by turning The Andromeda Strain from a documentary novel into a fictionalized documentary. Michael was all for it—I think he felt relieved.

The result, to put it mildly, did quite well, and Crichton quickly put its lessons to work. But it’s revealing that the flaws that Gottlieb cites—indifferent plotting, flat writing, and a lack of real characterization—are also typical of even some of the best works of science fiction that came out of Campbell’s circle. Crichton’s great achievement was to focus relentlessly on everything else, especially readability, and it’s fair to say that he did a better job of it than most of the writers who came up through Astounding and Analog. He was left with the reputation of a carpetbagger, and his works may have been too square and fixated on technology to ever be truly fashionable. Yet much of that reputation can be traced back to his name on the cover. In his story “Pierre Menard, Author of the Quixote,” Jorge Luis Borges speaks of enriching “the slow and rudimentary act of reading by means of a new technique—the technique of deliberate anachronism and fallacious attribution.” In Crichton’s case, that technique turns out to be pretty useful. I have a hunch that if The Terminal Man, Congo, and Sphere had been attributed on their first release to Robert A. Heinlein, they would be regarded as minor classics. They’re certainly better than many of the books that Heinlein was actually writing around the same time. And if I’m being honest, I should probably confess that I’d rather read Jurassic Park again than any of Asimov’s novels. (As part of my research for this book, I dutifully made my way through Asimov’s novelization of Fantastic Voyage, which came out just three years before The Andromeda Strain, and his fumbling of that very Crichtonesque premise only reminded me of how good at this sort of thing Crichton really was.) If Crichton had been born thirty years earlier, John W. Campbell would have embraced him like a lost son, and he might well have written a better movie than Destination Moon.

At its best, the television version of Westworld represents an attempt to reconcile Crichton’s gifts for striking premises and suspense with the more introspective mode of the genre to which he secretly belongs. (It’s no accident that Jonathan Nolan had been developing it in parallel with Foundation.) This balance hasn’t always been easy to manage, and last night’s premiere suggests that it can only become more difficult going forward. Westworld has always seemed defined by the pattern of forces that were acting on it—its source material, its speculative and philosophical ambitions, and the pressure of being a flagship drama on HBO. It also has to deal now with the legacy of its own first season, which set a precedent for playing with time, as well as the scrutiny of viewers who figured it out prematurely. The stakes here are established early on, with Bernard awakening on a beach in a sequence that seems like a nod to the best film by Nolan’s brother, and this time around, the parallel timelines are put front and center. Yet the strain occasionally shows. The series is still finding itself, with characters like Dolores who seem to be thinking through their story arcs out loud. It’s overly insistent on its violence and nudity, but it’s also cerebral and detached, with little possibility of the real emotional pain that the third season of Twin Peaks was able to inflict. I don’t know if the center will hold. Yet it’s also possible that these challenges were there from the beginning, as the series tried to reconcile Crichton’s tricks with the tradition of science fiction that it clearly honors. I still believe that this show is in the main line of the genre’s development. Its efforts to weave together its disparate influences strike me as worthwhile and important. And I hope that it finds its way home.

Exploring “The Proving Ground,” Part 2

The Seasteading Institute

Note: My novella “The Proving Ground,” which was first published in the January/February 2017 issue of Analog, is being reprinted this month in Lightspeed Magazine. It will also be appearing in the upcoming edition of The Year’s Best Science Fiction, edited by Gardner Dozois, and is a finalist for the Analytical Laboratory award for Best Novella. This post on the story’s conception and writing originally appeared, in a slightly different form, on January 10, 2017. 

The editor John W. Campbell once pointed out that an industrial safety manual is really the perfect handbook for a saboteur—if you just do the opposite of whatever it says. You see the same mindset in a lot of science fiction, which is often founded on constructing an elaborate futuristic scenario and then figuring out all the things that could possibly go wrong. This is central to most forms of storytelling, of course, but it takes on an added resonance in a genre that purports to tell us how the future will look, and at times, it can be hard to distinguish between the author’s own feelings on the subject and the conflict required for a good story. If dystopias seem more common than utopias, this may be less a prediction than a shrewd narrative choice, and it frequently leads to a streak of what looks like technophobia even in writers who seem otherwise inclined to celebrate all that technology can accomplish. (This is especially true when you start out with the intention of writing a thriller. In the case of someone like Michael Crichton, it can be difficult to tell where his instincts as a novelist leave off and his genuine pessimism begins. Nothing goes right in Jurassic Park, but this has less to do with chaos theory than with the conventions of suspense.) When I started work on “The Proving Ground,” I had a wealth of information at my disposal from the seasteading movement, much of which was devoted to arguing that an ocean colony would be viable and safe. But it also provided me with a list of story ideas, as soon as I began to read it with an eye to the worst that could happen.

For instance, in an online book about seasteading by Patri Friedman, the former executive director of Peter Thiel’s Seasteading Institute, we read: “The ocean is a dangerous environment. There are massive waves, hurricanes, and even pirates.” Taken out of context, this is either an argument for risk mitigation or a line from a pitch to Jerry Bruckheimer. And while I didn’t think much about the possibility of pirates—although for the life of me I can’t remember why—I spent a long time looking into waves and hurricanes. A hurricane or typhoon seemed like a better prospect, mostly because it provided more of a natural buildup than a wave, and it would be easier to structure a story around it. I even read The Perfect Storm from cover to cover to see if it would spark any ideas. What I ultimately concluded was that there was probably a good story to be told about a seastead that was hit by a hurricane, and that if I could work out the logistics, it would be pretty exciting. But it felt more like a disaster movie, and so did most of the other possibilities that I explored for damaging or destroying my seastead. (Looking back at my notes, it seems that I also briefly considered building a plot around a sabotage attempt, which seems a little lazy.) The trouble was that all of these crises were imposed from the outside, and none seemed to emerge naturally from the premise of climate change in the Marshall Islands. So after almost a week of pursuing the hurricane angle, I gave it up, which is a long time to devote to a wrong turn.

Tippi Hedren in The Birds

I was saved by an idea that came from an altogether different direction. One of the first things I had to decide was when the story would be set, both in the chronology of the seastead itself and in the world as a whole. Was the seastead under construction, or had it been occupied for years or decades? Were we talking about a scenario in which the threat of rising sea levels was still a distant one, or had it already happened? And what was taking place elsewhere? I spent a while looking into the various proposals that have been floated for the technological mitigation of global warming, such as the idea of releasing sulfur dioxide into the atmosphere to reflect sunlight back into space. (Even if it wasn’t central to the story, it seemed like it might make a good ironic counterpoint to the plot. The Marshall Islands probably won’t survive, no matter what else we do in the meantime.) I was especially interested in iron fertilization, in which tiny pellets of iron are released into the oceans to encourage the growth of plankton that can suck up carbon dioxide. It’s unclear how well this works, however, and there are other potential issues, as I found in a paper with the unpromising title “Iron enrichment stimulates toxic diatom production in high-nitrate, low-chlorophyll areas.” In particular, it can lead to high levels of Pseudo-nitzschia, a plankton species that produces the poison domoic acid, which accumulates in fish and squid. And it turned out that the Marshall Islands leased its offshore waters in the nineties to a private company to conduct iron fertilization on a limited scale, before it was outlawed as a form of illegal dumping.

At this point, I presumably had a vague idea that it might be possible to build a story around iron fertilization in the Marshall Islands and an ensuing outbreak of domoic acid poisoning, which can cause seizures and death. But then I came across a paper that proposed that a similar outbreak might have been responsible for the unexplained incident on August 18, 1961, in which the towns of Capitola and Santa Cruz in California were attacked by mobs of seabirds—an event that also caught the eye of Alfred Hitchcock. Which meant that I knew the following facts:

  1. The Marshall Islands once contracted with a company to perform a series of iron fertilization experiments.
  2. Iron fertilization has been linked to increased levels of Pseudo-nitzschia, which produces domoic acid.
  3. Domoic acid can cause brain damage in seabirds that eat contaminated fish and squid, and it may have been responsible for the attack that inspired The Birds.

Needless to say, I immediately forgot all about my hurricane. If there’s one thing I love about being a writer, it’s when a long process of shapeless research and daydreaming suddenly crystallizes into a form that seems inevitable, and this felt about as inevitable as it gets. Somebody was going to write this story eventually, and I figured that it might as well be me. Tomorrow, I’ll describe how I brought The Birds to the Marshall Islands, and why I ended up combining it with the ghosts of Bikini Atoll.

Written by nevalalee

March 6, 2018 at 8:32 am

Present tense, future perfect

Michael Crichton

Note: I’m taking a few days off for the holidays, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on August 11, 2016.

Science fiction is set in the future so frequently that it’s hard for many readers, or writers, to envision it in any other way. Yet there are times when a futuristic setting actively interferes with the story. If you think that the genre’s primary function is a predictive one, a futuristic setting is hard to avoid, although I’ve made it pretty clear that I believe that it’s the other way around—the idea that science fiction is a literature of prediction emerged only after most of its elements were already in place. But if you see it as a vehicle for telling compelling stories in which science plays an important role, or as a sandbox for exploring extreme social or ethical situations, you realize that it can be even more effective when set in the present. This is especially true of science fiction that trades heavily on suspense and paranoia. My favorite science fiction novel ever, Eric Frank Russell’s Sinister Barrier, is set in the near future for no particular reason: its premise of invisible alien beings who manipulate human civilization would work even better in ordinary surroundings, and nothing fundamental about the story itself would have to change. You could say much the same about Heinlein’s The Puppet Masters, which is indebted to Russell’s story in more ways than one. And there’s a sense in which The X-Files actually plays better today, as a period piece, than it did when it initially aired: in hindsight, the early nineties have become the definition of mundanity, and a perfect setting for horror. (At a time when we seem to be actually living in an alternate history novel, its assumption that a sinister government conspiracy had to be kept secret can seem downright comforting.)

When you push science fiction into the present, however, something curious happens: people start to think of it as something else. In particular, it tends to be labeled as a technothriller. This is ultimately just a marketing category, and slipperier than most, but it can be defined as science fiction that limits itself to a single line of extrapolation, usually in the form of a new technology, while grounding the rest in the period in which the book was written. And you’d think that this approach would be seen as worthwhile. Plausibly incorporating a hypothetical technology or scientific advance into the modern world can be just as hard as inventing an entire future society, and it allows the writer to tackle themes that lie close to the heart of the genre. If we’re looking to science fiction to help us work out the implications of contemporary problems, to simulate outcomes of current trends, or to force us to look at our own lives and assumptions a little differently, a story that takes place against a recognizable backdrop can confront us with all of these issues more vividly. A futuristic or interplanetary setting has a way of shading into fantasy, which isn’t necessarily bad, but risks turning the genre into exactly what John W. Campbell always insisted it wasn’t—a literature of escapism. In theory, then, any effort to coax science fiction back into the present is enormously important, and we should welcome the technothriller as a matrix in which the tools of the genre can be brought to bear on the reality around us.

Gillian Anderson in War of the Coprophages

In practice, that isn’t how it turns out. The technothriller is often dismissed as a disreputable subgenre or a diluted version of the real thing, and not always without reason. There are a few possible explanations for this. One is that because of the technothriller’s natural affinity for suspense, it attracts literary carpetbaggers—writers who seem to opportunistically come from outside the genre, rather than emerging from within it. Michael Crichton, for instance, started out by writing relatively straight thrillers under pen names like Jeffrey Hudson and John Lange, and it’s interesting to wonder how we’d regard The Andromeda Strain, or even Sphere or Congo, if he had worked his way up in the pages of Analog. Other reasons might be the genre’s pervasive strain of militarism, which reflects the association of certain kinds of technological development with the armed forces; its emphasis on action; or even the sort of writer that it attracts. Finally, there’s the inescapable point that most technothrillers are providing escapism of another kind, with hardware taking the place of original characters or ideas. That’s true of a lot of science fiction, too, but a technothriller doesn’t even ask readers to make the modicum of effort necessary to transport themselves mentally into another time or place. It’s just like the world we know, except with better weapons. As a result, it appeals more to the mundanes, or readers who don’t think of themselves as science fiction fans, which from the point of view of fandom is probably the greatest sin of all.

Yet it’s worth preserving the ideal of the technothriller, both because it can be a worthwhile genre in itself and because of the light that it sheds on science fiction as a whole. When we think of the didactic, lecturing tone that dominated Crichton’s late novels, starting with Rising Sun, it’s easy to connect it to the psychological role that hardware plays within a certain kind of thriller. As I’ve discussed elsewhere, because the writer gets certain technical details right, we’re more inclined to believe what he says when it comes to other issues, at least while we’re still reading the book. But it takes another level of insight to realize that this is also true of Heinlein. (The story of Campbellian science fiction is one of writers who were so good at teaching us about engineering that we barely noticed when they moved on to sociology.) And the strain of technophobia that runs through the genre—which is more a side effect of the need to generate suspense than a philosophical stance—can serve as a corrective to the unthinking embrace of technology that has characterized so much science fiction throughout its history. Finally, on the level of simple reading pleasure, I’d argue that any attempt to bring suspense into science fiction deserves to be encouraged: it’s a tool that has often been neglected, and the genre as a whole is invigorated when we bring in writers, even mercenary ones, who know how to keep the pages turning. If they also have great, original ideas, they’re unstoppable. This combination doesn’t often appear in the same writer. But the next best thing is to ensure that they can push against each other as part of the same healthy genre.

Written by nevalalee

December 22, 2017 at 9:00 am

The western tradition

Ed Harris on Westworld

As we were watching the premiere of Westworld last week, my wife turned to me and said: “Why would they make it a western park?” Or maybe I asked her—I can’t quite remember. But it’s a more interesting question than it sounds. When Michael Crichton’s original movie was released in the early seventies, the western was still a viable genre. It had clearly fallen from its peak, but major stars were doing important work in cowboy boots: Eastwood, of course, but also Newman, Redford, and Hoffman. John Wayne was still alive, which may have been the single most meaningful factor of all. As a result, it wasn’t hard to imagine a theme park with androids designed to fulfill that particular fantasy. These days, the situation has changed. The western is so beleaguered an art form that whenever one succeeds, it’s treated as newsworthy, and that’s been true for the last twenty years. Given the staggering expense and investment involved in a park like this, it’s hard to see why the western would be anybody’s first choice. (Even with the movie, I suspect that Crichton’s awareness of his relatively low budget was part of the decision: it was his first film as a director, with all of the limitations that implies, and a western could be shot cheaply on standing sets in the studio backlot.) Our daydreams simply run along different lines, and it’s easier to imagine a park being, say, set in a medieval fantasy era, or in the future, or with dinosaurs. In fact, there was even a sequel, Futureworld, that explored some of these possibilities, although it’s fair to say that nobody remembers it.

The television series Westworld, which is arriving in a markedly different pop cultural landscape, can’t exactly ditch the premise—it’s right there in the title. But the nice thing about the second episode, “Chestnut,” is that it goes a long way toward explaining why you’d still want to structure an experience like this around those conventions. It does this mostly by focusing on a new character, William, who arrives at the park knowing implausibly little about it, but who allows us to see it through the eyes of someone encountering it for the first time. What he’s told, basically, is that the appeal of Westworld is that it allows you to find out who you really are: you’re limited only by your inhibitions, your abilities, and your sense of right and wrong. That’s true of the real world, to some extent, but we’re also more conscious of the rules. And if the western refuses to go away as a genre, it’s because it’s the purest distillation of that seductive sense of lawlessness. The trouble with telling certain stories in the present day is that there isn’t room for the protagonist that thrillers have taught us to expect: a self-driven hero who solves his problems for himself in matters of life and death. That isn’t how most of us respond to a crisis, and in order to address the issue of why the main character doesn’t just go to the police, writers are forced to fall back on various makeshift solutions. You can focus on liminal figures, like cops or criminals, who can take justice into their own hands; you can establish an elaborate reason why the authorities are helpless, indifferent, or hostile; or you can set your story in a time or place where the rules are different or nonexistent.

Thandie Newton on Westworld

The western, in theory, is an ideal setting for a story in which the hero has to rely on himself. It’s a genre made up of limitless open spaces, nonexistent government, unreliable law enforcement, and a hostile native population. If there’s too much civilization for your story to work, your characters can just keep riding. To move west, or to leave the center of the theme park, is to move back in time, increasing the extent to which you’re defined by your own agency. (A western, revealingly, is a celebration of the qualities that we tend to ignore or dismiss in our contemporary immigrant population: the desire for a new life, the ability to overcome insurmountable obstacles, and the plain observation that those who uproot themselves and start from scratch are likely to be more competent and imaginative, on average, than those who remain behind.) The western is the best narrative sandbox ever invented, and if it ultimately exhausted itself, it was for reasons that were inseparable from its initial success. Its basic components were limited: there were only so many ways that you could combine those pieces. Telling escapist stories involved overlooking inconvenient truths about Native Americans, women, and minorities, and the tension between the myth and its reality eventually became too strong to sustain. Most of all, its core parts were taken over by other genres, and in particular by science fiction and fantasy. This began as an accidental discovery by pulp western writers who switched genres and realized that their tricks worked equally well in Astounding, and it was only confirmed by Star Trek—which Gene Roddenberry famously pitched as Wagon Train in space—and Star Wars, which absorbed those clichés so completely that they became new again.

What I like about Westworld, the series, is that it reminds us of how artificial this narrative always was, even in its original form. The Old West symbolizes freedom, but only if you envision yourself in the role of the stock protagonist, who is usually a white male antihero making the journey of his own volition. It falls apart when you try to imagine the lives of the people in the background, who exist in such stories solely to enable the protagonist’s fragile range of options. In reality, the frontier brutally circumscribed the lives of most of those who tried to carve out an existence there, and the whole western genre is enabled by a narrative illusion, or a conspiracy, that keeps its solitary and brutish aspects safely in the hands of the characters at the edges of the frame. Westworld takes that notion to its limit, by casting all the supporting roles with literal automatons. They aren’t meant to have inner lives, any more than the peripheral figures in any conventional western, and the gradual emergence of their consciousness implies that the park will eventually come to deconstruct itself. (The premiere quoted cleverly from The Searchers and Unforgiven, but I almost wish that it had saved those references until later, so that the series could unfold as a miniature history of the genre as it slowly attained self-awareness.) If you want to talk about how we picture ourselves in the heroes of our own stories, while minimizing or erasing the lives of those at the margins, it’s hard to imagine a better place to do it than the western, which depended on a process of historical amnesia and dehumanization from the very beginning. I’m not sure I’d want to visit a park like Westworld. But there will always be those who would.

Written by nevalalee

October 10, 2016 at 8:36 am

The Westworld expansion


Anthony Hopkins on Westworld

Note: Spoilers follow for the series premiere of Westworld.

Producing a television series, as I’ve often said here before, is perhaps the greatest test imaginable of the amount of control that a storyteller can impose on any work of art. You may have a narrative arc in mind that works beautifully over five seasons, but before you even begin, you know that you’ll have to change the plan to deal with the unexpected: the departure of a star, budgetary limitations, negotiations with the network. Hanging overhead at all times is the specter of cancellation, which means that you don’t know if your story will be told over an hour, one season, or many years. You may not even be sure what your audience really wants. Maybe you’ve devoted a lot of thought to creating nuanced, complicated characters, only to realize that most viewers are tuning in for sex, violence, and sudden death scenes. It might even be to your advantage to make the story less realistic, keeping it all safely escapist to avoid raising uncomfortable questions. If you’re going to be a four-quadrant hit, you can’t appeal to just one demographic, so you’ve got to target some combination of teenagers and adults of both sexes. This doesn’t even include the critics, who are likely to nitpick the outcome no matter what. All you can really do, in the end, is set the machine going, adjust it as necessary on the fly, try to keep the big picture in mind, and remain open to the possibility that your creation will surprise you—which are conditions that the best shows create on purpose. But it doesn’t always go as it should, and successes and failures alike tend to wreak havoc with the plans of their creators. Television, you might say, finds a way.

The wonderful thing about Westworld, which might have the best pilot of any show since Mad Men, is that it delivers exceptional entertainment while also functioning as an allegory that you can read in any number of ways. Michael Crichton’s original movie, which I haven’t seen, was pitched as a commentary on the artificially cultivated experience offered to us by parks like Disney World, an idea that he later revisited with far more lucrative results. Four decades later, the immersive, open world experience that Westworld evokes is more likely to remind us of certain video games, which serve as a sandbox in which we can indulge in our best or worst impulses with maximum freedom of movement. (The character played by Ed Harris is like a player who has explored the game so thoroughly that he’s more interested now in looking for exploits or glitches in the code.) Its central premise—a theme park full of androids that are gradually attaining sentience—suggests plenty of other parallels, and I’m sure the series will investigate most of them eventually. But I’m frankly most inclined to see it as a show about the act of making television itself. Series creators Jonathan Nolan and Lisa Joy have evidently mapped out a narrative for something like the next five or six seasons, which feels like an attempt to reassure viewers frustrated by the way in which serialized, mythology-driven shows tend to peter out toward the end, or to endlessly tease mysteries without ever delivering satisfying answers. But I wonder if Nolan and Joy also see themselves in Dr. Ford, played here with unusual restraint and cleverness by Anthony Hopkins, who looks at his own creations and muses about how little control he really has over the result.

Evan Rachel Wood on Westworld

It’s always dangerous to predict a show’s future from the pilot alone, and I haven’t seen the other episodes that were sent to critics for review. Westworld’s premise is also designed to make you even more wary than usual about trying to forecast a system as complicated as an ambitious cable series, especially one produced by J.J. Abrams. (There are references to the vagaries of television production in the pilot itself, much of which revolves around a technical problem that forces the park’s head writer to rewrite scenes overnight, cranking up the body count in hopes that guests won’t notice the gaps in the narrative. And one of its most chilling moments comes down to the decision to recast a key supporting role with a more cooperative performer.) After the premiere, which we both loved, my wife worried that we’ll just get disillusioned by the show over time, as we did with Game of Thrones. It’s always possible, and the number of shows over the last decade that have sustained a high level of excellence from first episode to last basically starts and ends with Mad Men—which, interestingly, was also a show about writing, and the way in which difficult concepts have to be sold and marketed to a large popular audience. But I have high hopes. The underlying trouble with Game of Thrones was a structural one: one season after another felt like it was marking time in its middle stretches, cutting aimlessly between subplots and relying on showy moments of violence to keep the audience awake, and many of its issues arose from a perceived need to keep from getting ahead of the books. It became a show that only knew how to stall and shock, and I would have been a lot more forgiving of its sexual politics if I had enjoyed the rest of it, or if I believed that the showrunners were building to something worthwhile.

I have more confidence in Westworld, in part because the pilot is such a confident piece of storytelling, but also because the writers aren’t as shackled by the source. And I feel almost grateful for the prospect of fully exploring this world over multiple seasons with this cast and these writers. Jonathan Nolan, in particular, has been overshadowed at times by his brother Christopher, who would overshadow anyone, but his résumé as a writer is just as impressive: the story for Memento, the scripts for The Dark Knight and The Dark Knight Rises, and that’s just on the movie side. (I haven’t seen Person of Interest, but I’ve heard it described as the best science fiction show on television, camouflaged in plain sight as a procedural.) Nolan has always tended to cram more ideas into one screenplay than a movie can comfortably hold, which is a big part of his appeal: The Dark Knight is so overflowing with invention that it only underlines the limpness of the storytelling in most of the Marvel movies. What excites me about Westworld is the opportunity it presents for Nolan to allow the story to breathe, going down interesting byways and exploring its implications at length. And the signs so far are very promising. The plot is a model of story construction, to the point where I’d use it as an example in a writing class: it introduces its world, springs a few big surprises, tells us something about a dozen characters, and ends on an image that is both inevitable and deliciously unexpected. Even its references to other movies are more interesting than most. A visual tribute to The Searchers seems predictable at first, but when the show repeats it, it becomes a wry commentary on how an homage can take the place of real understanding. And a recurring bit with a pesky fly feels like a nod to Psycho, which implicated the audience in similar ways. As Mrs. Bates says to us in one of her last lines: “I hope they are watching. They’ll see.”

Written by nevalalee

October 3, 2016 at 9:45 am

Present tense, future perfect


Michael Crichton

Science fiction is set in the future so frequently that it’s hard for many readers—or writers—to envision it as anything else. Yet it doesn’t have to be that way, and there are times when a futuristic setting actively gets in the way of the story. If you think that science fiction’s primary function is a predictive one, it’s hard to avoid, although I’ve made it pretty clear that I believe that it’s the other way around: the idea that this is a literature of prediction emerged only after most of its elements were already in place. But if you see it more as a vehicle for telling exciting stories in which science plays a crucial role, or as a sandbox for exploring extreme social or ethical situations, you quickly come to realize that it can be even more effective when set in the present. This is especially true of science fiction that trades heavily on suspense and paranoia. My favorite science fiction novel ever, Eric Frank Russell’s Sinister Barrier, is set in the future for no particular reason: its premise of invisible alien beings who control our lives and manipulate human civilization would work even better in ordinary surroundings, and nothing fundamental about the story itself would have to change. You could say much the same about Heinlein’s The Puppet Masters, which is indebted to Russell’s story in more ways than one. And there’s a sense in which The X-Files actually plays better today, as a period piece, than it did when it initially aired: the early nineties have become even more mundane with time, and a perfect setting for horror.

When you push science fiction into the present, however, something curious happens: people start to think of it as something else. To be more specific, it ends up being labeled as a technothriller. This is, above all, a marketing category, and an even slipperier one than most, but it’s worth defining it simply as science fiction that limits itself to a single line of extrapolation, usually in the form of a new technology, while grounding the rest in the period in which the book was written. And you’d think that this would be seen as a worthwhile approach. Plausibly incorporating a hypothetical technology or scientific advance into the modern world can be just as hard, if done right, as inventing an entire future society, and it allows the writer to tackle themes that lie near the heart of the genre. If we’re looking to science fiction to help us work out the implications of contemporary problems, to simulate outcomes of current trends, or to force us to look at our own lives and assumptions a little differently, a story that takes place against a recognizable backdrop can confront us with these issues more vividly. A future or interplanetary setting has a way of shading into fantasy, which isn’t necessarily bad, but tends to turn the genre into exactly what Campbell always insisted it wasn’t—a literature of escapism. In theory, then, any effort to coax science fiction back into the present is enormously important, and we should welcome the technothriller as a place in which the tools of the genre can be brought to bear on the reality around us.

Gillian Anderson in War of the Coprophages

In practice, of course, that isn’t how it turns out. The technothriller is often dismissed as a disreputable subgenre, or a diluted version of the real thing—and not always without reason. There are a few possible explanations. One is that because of the technothriller’s natural affinity for suspense, it attracts literary carpetbaggers: writers who seem to opportunistically come from outside the genre, rather than emerging from within it. Michael Crichton, for instance, started out by writing relatively straight thrillers under pen names like Jeffrey Hudson and John Lange, and it’s interesting to wonder how we’d regard The Andromeda Strain, or even Sphere or Congo, if he had worked his way up in the pages of Analog. Another is the genre’s pervasive strain of militarism, which may reflect the fact that we associate certain kinds of technological development with the armed forces; the convenient excuse for action that it provides; or even the sort of writer that the genre attracts. Finally, there’s the inescapable point that most technothrillers are just providing escapism of another kind, with hardware taking the place of original characters or ideas. That’s true of a lot of science fiction, too, but a technothriller doesn’t even ask readers to make the modicum of effort necessary to transport themselves mentally into another time or place: it’s just like the world we know, except with better weapons. As a result, it appeals more to the mundanes, or readers who don’t think of themselves as science fiction fans, which from the point of view of the fandom is probably the greatest sin of all.

Yet it’s worth preserving the ideal of the technothriller, both because it can be a worthwhile genre in itself and because of the light it sheds on science fiction as a whole. When we think of the hectoring didacticism that dominated Crichton’s late novels, it’s easy to see it as an instance of the role that hardware plays within a certain kind of thriller: as I’ve discussed elsewhere, because the writer gets certain technical details right, we’re more inclined to believe what he says when it comes to the larger issues, at least while we’re reading the book. But it takes another level of insight to realize that this is also true of Heinlein. (The evolution of Campbellian science fiction is largely one of writers who were so good at lecturing us about engineering that we barely noticed when they moved on to sociology.) And the strain of technophobia that runs through the genre—which is more a side effect of suspense than a fully developed intellectual stance—can serve as a corrective to the unthinking embrace of technology that has characterized science fiction for so much of its history. Finally, on the level of simple reading pleasure, I’d argue that any attempt to bring suspense into science fiction deserves to be encouraged: it’s a tool that has often been neglected or employed in rote ways, and the genre as a whole is invigorated when we bring in writers, even mercenary ones, who know how to structure a story to keep the pages turning. Combine this with great, original ideas, and you’re unstoppable, although that combination rarely appears in a single writer. The next best thing is to ensure that the two can push against each other as part of the same healthy genre.

Written by nevalalee

August 11, 2016 at 8:27 am

The Ian Malcolm rule


Jeff Goldblum in Jurassic Park

A man is rich in proportion to the number of things he can afford to leave alone.

—Henry David Thoreau, Walden

Last week, at the inaugural town hall meeting at Facebook headquarters, one brave questioner managed to cut through the noise and press Mark Zuckerberg on the one issue that really matters: what’s the deal with that gray shirt he always wears? Zuckerberg replied:

I really want to clear my life to make it so I have to make as few decisions as possible about anything except best how to serve this community…I’m in this really lucky position where I get to wake up every day and help serve more than a billion people. And I feel like I’m not doing my job if I spend any of my energy on things that are silly or frivolous about my life…So even though it kind of sounds silly—that that’s my reason for wearing a gray t-shirt every day—it also is true.

There’s a surprising amount to unpack here, starting with the fact, as Allison P. Davis of New York Magazine points out, that it’s considerably easier for a young white male to always wear the same clothes than for a woman in the same situation. It’s also worth noting that wearing the exact same shirt each day turns simplicity into a kind of ostentation: there are ways of minimizing the amount of time you spend thinking about your wardrobe without calling attention to it so insistently.

Of course, Zuckerberg is only the latest in a long line of high-achieving nerds who insist, rightly or wrongly, that they have more important things to think about than what they’re going to wear. There’s more than an echo here of the dozens of black Issey Miyake turtlenecks that were stacked in Steve Jobs’s closet, and in the article linked above, Vanessa Friedman of The New York Times also notes that Zuckerberg sounds a little like Obama, who told Michael Lewis in Vanity Fair: “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.” Even Christopher Nolan gets into the act, as we learn in the recent New York Times Magazine profile by Gideon Lewis-Kraus:

Nolan’s own look accords with his strict regimen of optimal resource allocation and flexibility: He long ago decided it was a waste of energy to choose anew what to wear each day, and the clubbable but muted uniform on which he settled splits the difference between the demands of an executive suite and a tundra. The ensemble is smart with a hint of frowzy, a dark, narrow-lapeled jacket over a blue dress shirt with a lightly fraying collar, plus durable black trousers over scuffed, sensible shoes.

Mark Zuckerberg

If you were to draw a family tree between all these monochromatic Vulcans, you’d find that, consciously or not, they’re all echoing their common patron saint, Ian Malcolm in Jurassic Park, who says:

In any case, I wear only two colors, black and gray…These colors are appropriate for any occasion…and they go well together, should I mistakenly put on a pair of gray socks with my black trousers…I find it liberating. I believe my life has value, and I don’t want to waste it thinking about clothing.

As Malcolm speaks, Crichton writes, “Ellie was staring at him, her mouth open”—apparently stunned into silence, as all women would be, at this display of superhuman rationality. And while it’s easy to make fun of it, I’m basically one of those guys. I eat the same breakfast and lunch every day; my daily uniform of polo shirt, jeans, and New Balance sneakers rarely, if ever, changes; and I’ve had the same haircut for the last eighteen years. If pressed, I’d probably offer a rationale more or less identical to the ones given above. As a writer, I’m called upon to solve a series of agonizingly specific problems each time I sit down at my desk, so the less headspace I devote to everything else, the better.

Which is all well and good. But it’s also easy to confuse the externals with their underlying intention. The world, or at least the Bay Area, is full of young guys with the Zuckerberg look, but it doesn’t matter how little time you spend getting dressed if you aren’t mindfully reallocating the time you save, or extending the principle beyond the closet. The most eloquent defense of minimizing extraneous thinking was mounted by the philosopher Alfred North Whitehead, who writes:

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

Whitehead isn’t talking about his shirts here; he’s talking about the Arabic number system, a form of “good notation” that frees the mind to think about more complicated problems. Which only reminds us that the shirts you wear won’t make you more effective if you aren’t being equally thoughtful about the decisions that really count. Otherwise, they’re only an excuse for laziness or indifference, which is just as contagious as efficiency. And it often comes to us as a wolf in nerd’s clothing.

Hard science fiction, harder reading


Contact

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “What story concept or premise do you wish wasn’t explored by the person that did something with it?”

If there’s one barrier lying between most readers and an appreciation of hard science fiction, it’s that its great ideas and visionary conceptions are so often channeled through mediocre writing. I’ve tried multiple times to read Robert L. Forward’s Dragon’s Egg, for instance, which has a sensational premise—the first contact between humans and a race of intelligent microorganisms living on a neutron star with billions of times Earth’s gravity—but maddeningly pedestrian prose. Here’s a representative paragraph from early in the novel:

Jacqueline Carnot strode over to a long table in the data processing lab in the CCCP-NASA-ESA Deep Space Research Center at CalTech. A frown clouded her pretty face. The cut of her shoulder-length brown hair and her careful choice of tailored clothing stamped her at once as “European.”

I don’t mean to pick on Forward in particular, and I have huge affection for hard science fiction in general. Yet in many cases, whenever I pick up a new story, I get the sense that it would be just as satisfying to read a five-paragraph summary that dropped any pretense of drama and focused on its central big idea. (To be fair, I often feel the same way with mystery fiction, especially of the locked-room variety, which I also love.)

It isn’t hard to see why the narrative element is often lacking. Many of the masters of science fiction were scientists first and writers afterward, and the idea frequently takes precedence over the plot and characters—which might serve as a definition for hard science fiction as a whole. This may be why I’ve always felt a bit out of place in the pages of Analog, which has been kind enough to publish several of my own stories. I think of myself as a writer first, and the ideas in most of my stories are good but not especially great. They’re really there mostly to make the story possible, rather than the other way around. This isn’t an aesthetic judgment; it’s more a reflection of my own background, ability, and tastes, and while it results in the kinds of stories I personally like to read, it also limits me to a particularly narrow range. I don’t necessarily have the temperament to write a story that encompasses the entire universe, and I take comfort in the fact that there are other writers more able and inclined to do so. But I imagine that even devoted fans of the genre have to admit that it’s rare to find a writer who can marry ambitious conceptions on the grandest scale with a style that carries you along for its own sake.

Michael Crichton

That’s even true of authors who have proven themselves to be capable writers in other contexts. I’ve always found Asimov’s nonfiction more engaging than his stories—although at his best, as in “The Last Question,” he can be stunning. And I don’t think I’ve ever been as let down by a novel as by Carl Sagan’s Contact. Sagan was a peerless essayist and popularizer, and the scope of the story is as big as it gets, but Gregory Benford’s original review in the New York Times accurately sums up its faults:

Unfortunately, the reader will reach the novel’s enjoyable last third only if drawn by strong curiosity and buffered by tolerance for many first-novelist vices. Characterization proceeds by the dossier method often used by C.P. Snow, with similar results—told much but shown little, we get career profiles, some odd habits, earnest details. The narrative comes to a stop while an expository lump cajoles us into finding this person interesting.

For what it’s worth, the movie version solves a lot of these problems, mostly by focusing on Jodie Foster’s Ellie at the expense of the others, and at its best, it offers the sense of awe that the novel only sporadically delivers—and which I’m hoping to see again in Chris Nolan’s Interstellar.

In fact, while it might sound strange to say it, I often find myself wishing that many of the great ideas in science fiction had been tackled by the likes of Michael Crichton. No one will ever hold Crichton up as a paragon of style, and it’s true that many of his most famous novels repurpose ideas that had been developed earlier by other writers, but at his peak, he was a superb craftsman who knew how to keep the pages turning. (Crichton was also a writer first: he published many paperback thrillers while still in medical school, and if he stuck largely to science fiction after The Andromeda Strain, it was mostly because he was so good at it.) Near the end, as I’ve said before, he was seduced by his own tools, like many of the characters in his cautionary tales, and began to put dubious messages before story, or even his own spectacular ability with facts. Even at his worst, though, he retained a relentless focus on capturing and holding a wide popular audience, and that kind of professional, even mercenary approach is one that more writers in the genre could stand to imitate. Science fiction has countless visionaries, but what we really need are more brilliant hacks.

Facts with a side of violence


Frederick Forsyth

Over the last few weeks, I’ve been rereading The Dogs of War by Frederick Forsyth, my favorite suspense novelist. I’ve mentioned before that Forsyth is basically as good as it gets, and that he’s the writer I turn to the most these days in terms of pure enjoyment: he operates within a very narrow range of material and tone, but on those terms, he always delivers. Reading The Dogs of War again was a fascinating experience, because although it takes place in the world of mercenaries and other guns for hire, it contains surprisingly little action—maybe thirty pages’ worth over the course of four hundred dense pages. The rest of the novel is taken up by an obsessively detailed account of how, precisely, a privately funded war might be financed and equipped, from obtaining weapons to hiring a ship to acquiring the necessary amount of shirts and underwear. And although the amount of information is sometimes overwhelming, it’s always a superlatively readable book, if only because Forsyth is a master of organization and clarity.

Of course, it also works because it’s fun to learn about these things. The Dogs of War is perhaps the ultimate example of the kind of fiction that Anthony Lane, speaking of Allan Folsom’s The Day After Tomorrow, has dismissed as “not so much a novel as a six-hundred-page fact sheet with occasional breaks for violence.” Yet the pleasure we take in absorbing a few facts while reading a diverting thriller is perfectly understandable. Recently, I saw a posting on a social news site from a commenter who said that he didn’t read much, but was looking for novels that would teach him some things while telling an interesting story. I pointed him toward Michael Crichton, who is one of those novelists, like Forsyth, whose work has inspired countless imitators, but who remains the best of his breed. This kind of fiction is easy to dismiss, but conveying factual information to a reader is like any other aspect of writing: when done right, it can be a source of considerable satisfaction. In my own novels, I’ve indulged in such tidbits as how to build a handheld laser, how to open a Soviet weapons cache, and what exactly happened at the Dyatlov Pass.

Michael Crichton

That said, like all good things, the desire to satisfy a reader’s craving for information can also be taken too far. I’ve spoken elsewhere about the fiction of Irving Wallace, who crams his books with travelogues, dubious factoids, and masses of undigested research—along with a few clinical sex scenes—until whatever narrative interest the story once held is lost. And my feelings about Dan Brown are a matter of record. Here, as in most things, the key is balance: information can be a delight, but only in the context of a story that the reader finds engaging for the usual reasons. Its effectiveness can also vary within the work of a single author. Forsyth is great, but the weight of information in some of his later novels can be a little deadening; conversely, I’m not a fan of Tom Clancy, and gave up on The Cardinal of the Kremlin after struggling through a few hundred pages, but I found Without Remorse to be a really fine revenge story, hardware and all. The misuse of factual information by popular novelists has given it a bad reputation, but really, like any writing tool, it just needs to be properly deployed.

And it’s especially fascinating to see how this obsession with information—in a somewhat ambivalent form—has migrated into literary fiction. It’s hard to read Thomas Pynchon, for instance, without getting a kick from his mastery of everything from Tarot cards to aeronautical engineering, and James Wood points out that we see much the same urge in Jonathan Franzen:

The contemporary novel has such a desire to be clever about so many elements of life that it sometimes resembles a man who takes so many classes that he has no time to read: auditing abolishes composure. Of course, there are readers who will enjoy the fact that Franzen fills us in on campus politics, Lithuanian gangsters, biotech patents, the chemistry of depression, and so on…

Yet Franzen, like Pynchon, uses voluminous research to underline his point about how unknowable the world really is: if an author with the capacity to write limericks about the vane servomotor feels despair at the violent, impersonal systems of which we’re all a part, the rest of us don’t stand a chance. Popular novelists, by contrast, use information for the opposite reason, to flatter us that perhaps we, too, would make good mercenaries, if only we knew how to forge an end user certificate for a shipment of gun parts in Spain. In both cases, the underlying research gives the narrative a credibility it wouldn’t otherwise have. And the ability to use it correctly, according to one’s intentions, is one that every writer could stand to develop.

The lure of trashy fiction


Yesterday’s posting on the lure of bad movies, like Birdemic, raises the obvious question of whether the same allure clings to certain trashy books. At first glance, it might seem that the answer is no, at least not the same way: while a bad movie can be polished off in ninety minutes, even the junkiest novel usually requires a somewhat greater commitment, which raises the question of whether this is really the best use of one’s time. Life, it seems, is too short to knowingly waste on bad books, especially when so much good stuff remains unread. (Whenever I read a bad book, I feel as if I need to apologize personally to William Faulkner.) And yet I’ve learned a lot from bad fiction as well. As a writer, it’s useful to know something about every kind of literature, especially when you’re trying to make your mark in a genre that has generated its share of junk. And if you don’t read some trash, as well as better books, you’ll have no way of knowing if you can tell the difference.

The trouble, of course, is that one man’s trashy novel is another man’s masterpiece. The early novels of Thomas Harris, for instance, are hugely important to me, but diminishing returns set in about halfway through Hannibal, and by Hannibal Rising, there’s barely a single interesting page. But this is a judgment call, and some might draw the line much earlier or later. The same is true of Frederick Forsyth, Stephen King, Michael Crichton, or any other prolific popular novelist. Discriminating between the good (The Day of the Jackal) and the bad (The Negotiator) in a single writer’s body of work is an important part of developing one’s own taste. And sometimes a novelist will surprise you. I’ve repeatedly tried and failed to get into Tom Clancy—The Cardinal of the Kremlin nearly put me to sleep on a recent long bus trip—but I was delighted to discover that Without Remorse is a real novel, vicious, compelling, and with bravura set pieces that recall Forsyth, or even James Ellroy.

And sometimes even literary fiction can benefit from a touch of trash. I love John Updike, and believe that the Rabbit novels are among the essential cultural documents of the last century, but if I could own only one Updike novel, it would be Couples, which even his greatest fans seem to think he wrote at least partly for the money. And yet there’s something weirdly exhilarating about seeing Updike’s extraordinary prose and observational skills applied to blatantly commercial material. Updike can’t help being an artist, even when he’s writing a big sexy novel, and I’d argue that Couples, which isn’t that far removed from Peyton Place, was the novel he was born to write. (His later attempt at a “thriller,” in the form of Terrorist, is much less satisfying, and only comes to life whenever Updike revisits his old adulterous territory.)

But have I ever deliberately set out to read a novel that I knew was bad? Sure. While I haven’t managed to make it through Still Missing, for one, I love reading the bestsellers of yesteryear, embodied in the rows of yellowing paperbacks that line the shelves of thrift stores. The 1970s was a particularly rich era for trash. During my move from New York last year, the only book I kept in my empty apartment was a battered copy of Arthur Hailey’s Hotel, which I enjoyed immensely, especially when I mentally recast all the characters with actors from Mad Men. And I’m a little embarrassed to admit how quickly I plowed through Irving Wallace’s The Fan Club—a terrible book, and much less interesting than Wallace himself, but remarkably evocative of its era in popular fiction. Such books may not be great, but they’re an undeniable part of a writer’s education. (As long as they aren’t all you read.)

Sherrinford Holmes and the trouble with names

with 8 comments

So work on my second novel is coming along pretty well. Research is winding down; location work is finished. I’ve got a fairly good outline for Part I, a sense of the personalities and backgrounds of a dozen important—though still nameless—characters, and…

Hold on. I have a dozen important characters, but aside from a few holdovers from my first book, I haven’t named them yet. And I need to come up with some names soon. I have just over two weeks before I start writing, but even in the meantime, there’s only so much work I can do with signifiers like “best friend” and “ruthless assassin.” (Note: not the same person.) Characters need names before they can really come to life. And it’s often this step, even before the real imaginative work begins, that feels the most frustrating, if only because it seems so important.

Naming characters is so fundamental a part of the writing process that I’m surprised it hasn’t been discussed more often. John Gardner speaks briefly about it to The Paris Review:

Sometimes I use characters from real life, and sometimes I use their real names—when I do, it’s always in celebration of people that I like. Once or twice, as in October Light, I’ve borrowed other people’s fictional characters. Naming is only a problem, of course, when you make the character up. It seems to me that every character—every person—is an embodiment of a very complicated, philosophical way of looking at the world, whether conscious or not. Names can be strong clues to the character’s system. Names are magic. If you name a kid John, he’ll grow up a different kid than if you named him Rudolph.

I can’t speak to the experience of other writers, but for me, coming up with names for characters becomes more of a nightmare with every story. Unless you’re Thomas Pynchon, who can get away with names like Osbie Feel and Tyrone Slothrop, names need to be distinctive, but not so unusual that they distract the reader; evocative, but natural; easily differentiated from one another; not already possessed by a celebrity or more famous fictional character; and fairly invisible in their origins. (I still haven’t forgiven Michael Crichton for the “Lewis Dodgson” of Jurassic Park.) As a result, it takes me the better part of a day to come up with even ten passable names. And it isn’t going to get any easier: the more stories I write, the more names I use, which means that the pool of possibilities is growing ever smaller.

So what do I do? Whatever works. Sometimes a character will have a particular ethnic or national background, like the seemingly endless parade of Russians in Kamera and its sequel, which provides one possible starting point. (Wikipedia’s lists are very useful, especially now that I no longer have a phone book.) I’ll consult baby name sites, scan my bookshelves, and occasionally name characters after friends or people I admire. And the names are always nudging and jostling one another: I try to avoid giving important characters names that sound similar or begin with the same first letter, for example, which means that a single alteration may require numerous other adjustments.

Is it worth it? Yes and no. It certainly isn’t for the sake of the reader, who isn’t supposed to notice any of this—the best character names, I’m convinced, are invisible. And with few exceptions, I’d guess that even the names that feel inevitable now were, in fact, no better or worse than many alternatives: if Conan Doyle had gone with his first inclination, it’s quite possible that we’d all be fans of Ormond Sacker and Sherrinford Holmes. But for the writer, it’s an excuse to brood and meditate on the essence of each character, even if the result barely attracts the reader’s attention. So I feel well within my rights to overthink it. (Although I’m a little worried about what might happen if I ever have to name a baby.)

Written by nevalalee

February 21, 2011 at 9:13 am

The curious case of Michael Crichton

with 3 comments

As long as we’re on the subject of research in fiction, we may as well consider the singular example of Michael Crichton, who, more than any other popular novelist in recent history, appeared to spin the straw of factual information into fictional gold. In many ways, it’s one of the most extraordinary careers in twentieth century culture: in addition to his bestselling novels, Crichton was a screenwriter, director, and creator of ER. He was perhaps the last popular novelist, aside from John Grisham, whose books all seemed destined to be made into movies, and without the benefit of a series character. He was also a graduate of Harvard Medical School, extremely tall, and exceptionally good-looking.

Given all these accomplishments, it would give me great malicious pleasure to inform you that, alas, Crichton was a bad writer. Except he wasn’t. Within the constraints of the genre that he invented, or at least perfected, he was the best there was. When you’ve read as many bad thrillers as I have, a novel like Jurassic Park comes to seem like a model of the craft: it’s smart, expertly paced, with characters who are just distinct enough not to be interchangeable, but not so memorable that they get in the way of the story. In short, Crichton’s books are the Cadillacs of technothrillers, and their quality is impressive enough to make Jonathan Franzen’s insistence that “the work I’m doing is simply better than Crichton’s” seem more than a little childish.

Which isn’t to say that Crichton was always great, or even good. Near the end of his career, as is the case for many popular novelists, the titles begin to blend together, and I can’t say I’ve managed to get more than halfway through anything since Airframe. Even some of the books I read and enjoyed when I was younger seem a little thin these days. Rising Sun, especially, was a huge disappointment when I read it again a few years ago. It wasn’t the alleged xenophobia, which might have worked in a better thriller—and there’s certainly room in this world for a great suspense novel about two American cops up against an implacable Asian adversary. It’s more Crichton’s determination, in Rising Sun and elsewhere, to subjugate his facts to his message, when in his early novels, the facts were the message, and a very compelling one indeed.

And there’s an important lesson here. In the best Crichton novels—Jurassic Park, Sphere, and my own favorite, Congo—the facts are a filigree, a treat, an additional reward layered onto an exciting story. I won’t go as far as to say that Crichton’s use of factual information is as artful as, say, Thomas Pynchon’s, but there’s a joy in science for its own sake that seems to be missing in subsequent books. And the growing inclination to use information to convince or convert the reader, which all but destroys the later novels, is as deadly as in the most sentimental religious fiction. In the end, Crichton’s religion was science, or politics, when it should have been story. Like the writers who become ever more seduced by the possibilities of voice or style, or even the scientists in his own cautionary tales, he was destroyed by his own tools.

Written by nevalalee

February 3, 2011 at 10:17 am
