Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Dune’

The analytical laboratory


The Martian

Over the last few months, there’s been a surprising flurry of film and television activity involving the writers featured in my upcoming book Astounding. Syfy has announced plans to adapt Robert A. Heinlein’s Stranger in a Strange Land as a miniseries, with an imposing creative team that includes Hollywood power broker Scott Rudin and Zodiac screenwriter James Vanderbilt. Columbia is aiming to reboot Starship Troopers with producer Neal H. Moritz of The Fast and the Furious, prompting Paul Verhoeven, the director of the original, to comment: “Going back to the novel would fit very much in a Trump presidency.” The production company Legendary has bought the film and television rights to Dune, which first appeared as a serial edited by John W. Campbell in Analog. Meanwhile, Jonathan Nolan is apparently still attached to an adaptation of Isaac Asimov’s Foundation, although he seems rather busy at the moment. (L. Ron Hubbard remains relatively neglected, unless you want to count Leah Remini’s new show, which the Church of Scientology would probably prefer that you didn’t.) The fact that rights have been purchased and press releases issued doesn’t necessarily mean that anything will happen, of course, although the prospects for Stranger in a Strange Land seem strong. And while it’s possible that I’m simply paying more attention to these announcements now that I’m thinking about these writers all the time, I suspect that there’s something real going on.

So why the sudden surge of interest? The most likely, and also the most heartening, explanation is that we’re experiencing a revival of hard science fiction. Movies like Gravity, Interstellar, The Martian, and Arrival—which I haven’t seen yet—have demonstrated that there’s an audience for films that draw more inspiration from Clarke and Kubrick than from Star Wars. Westworld, whatever else you might think of it, has done much the same on television. And there’s no question that the environment for this kind of story is far more attractive now than it was even ten years ago. For my money, the most encouraging development is the movie Life, a horror thriller set on the International Space Station, which is scheduled to come out next summer. I’m tickled by it because, frankly, it doesn’t look like anything special: the trailer starts promisingly enough, but it ends by feeling very familiar. It might turn out to be better than it looks, but I almost hope that it doesn’t. The best sign that a genre is reaching maturity isn’t a series of singular achievements, but the appearance of works that are content to color inside the lines, consciously evoking the trappings of more visionary movies while remaining squarely focused on the mainstream. A film like Interstellar is always going to be an outlier. What we need are movies like what Life promises to be: a science fiction film of minimal ambition, but a certain amount of skill, and a willingness to copy the most obvious features of its predecessors. That’s when you’ve got a trend.

Jake Gyllenhaal in Life

The other key development is the growing market for prestige dramas on television, which is the logical home for Stranger in a Strange Land and, I think, Dune. It may be the case, as we’ve been told in connection with Star Trek: Discovery, that there isn’t a place for science fiction on a broadcast network, but there’s certainly room for it on cable. Combine this with the increased appetite for hard science fiction on film, and you’ve got precisely the conditions in which smart production companies should be snatching up the rights to Asimov, Heinlein, and the rest. Given the historically rapid rise and fall of such trends, they shouldn’t expect this window to remain open for long. (In a letter to Asimov on February 3, 1939, Frederik Pohl noted the flood of new science fiction magazines on newsstands, and he concluded: “Time is indeed of the essence…Such a condition can’t possibly last forever, and the time to capitalize on it is now; next month may be too late.”) What they’re likely to find, in the end, is that many of these stories are resistant to adaptation, and that they’re better off seeking out original material. There’s a reason that there have been so few movies derived from Heinlein and Asimov, despite the temptation that they’ve always presented. Heinlein, in particular, seems superficially amenable to the movies: he certainly knew how to write action in a way that Asimov couldn’t. But after sucking in the reader with an exciting beginning, he also liked to spend the second half of a story picking apart the assumptions of the first, and if you aren’t going to include the deconstruction, you might as well write something from scratch.

As it happens, the recent spike of action on the adaptation front has coincided with another announcement. Analog, the laboratory in which all these authors were born, is cutting back its production schedule to six double issues every year. This is obviously intended to manage costs, and it’s a reminder of how close to the edge the science fiction digests have always been. (To be fair, the change also coincides with a long overdue update of the magazine’s website, which is very encouraging. If this reflects a true shift from print to online, it’s less a retreat than a necessary recalibration.) It’s easy to contrast the game of pennies being played at the bottom with the expenditure of millions of dollars at the top, but that’s arguably how it has to be. Analog, like Astounding before it, was a machine for generating variations, which needs to be done on the cheap. Most stories are forgotten almost at once, and the few that survive the test of time are the ones that get the lion’s share of resources. All the while, the magazine persists as an indispensable form of research and development—a sort of skunk works that keeps the entire enterprise going. That’s been true since the beginning, and you can see this clearly in the lives of the writers involved. Asimov, Heinlein, Herbert, and their estates became wealthy from their work. Campbell, who more than any other individual was responsible for the rise of modern science fiction, did not. Instead, he remained in his little office, lugging manuscripts in a heavy briefcase twice a week on the train. He was reasonably well off, but not in a way that creates an empire of valuable intellectual property. Instead, he ran the lab. And we can see the results all around us.

The great scene theory


The Coronation of Napoleon by Jacques-Louis David

“The history of the world is but the biography of great men,” Thomas Carlyle once wrote, and although this statement was criticized almost at once, it accurately captures the way many of us continue to think about historical events, both large and small. There’s something inherently appealing about the idea that certain exceptional personalities—Alexander the Great, Julius Caesar, Napoleon—can seize and turn the temper of their time, and we see it today in attempts to explain, say, the personal computing revolution through the life of someone like Steve Jobs. The alternate view, which was expressed forcefully by Herbert Spencer, is that history is the outcome of impersonal social and economic forces, in which a single man or woman can do little more than catalyze trends that are already there. If Napoleon had never lived, the theory goes, someone very much like him would have taken his place. It’s safe to say that any reasonable view of history has to take both theories into account: Napoleon was extraordinary in ways that can’t be fully explained by his environment, even if he was inseparably a part of it. But it’s also worth remembering that much of our fascination with such individuals arises from our craving for narrative structures, which demand a clear hero or villain. (The major exception, interestingly, is science fiction, in which the “protagonist” is often humanity as a whole. And the transition from the hard science fiction of the golden age to messianic stories like Dune, in which the great man reasserts himself with a vengeance, is a critical turning point in the genre’s development.)

You can see a similar divide in storytelling, too. One school of thought implicitly assumes that a story is a delivery system for great scenes, with the rest of the plot serving as a scaffold to enable a handful of awesome moments. Another approach sees a narrative as a series of small, carefully chosen details designed to create an emotional effect greater than the sum of its parts. When it comes to the former strategy, it’s hard to think of a better example than Game of Thrones, a television series that often seems to be marking time between high points: it can test a viewer’s patience, but to the extent that it works, it’s because it constantly promises a big payoff around the corner, and we can expect two or three transcendent set pieces per season. Mad Men took the opposite tack: it was made up of countless tiny but riveting choices that gained power from their cumulative impact. Like the theories of history I mentioned above, neither type of storytelling is necessarily correct or complete in itself, and you’ll find plenty of exceptions, even in works that seem to fall clearly into one category or the other. It certainly doesn’t mean that one kind of story is “better” than the other. But it provides a useful way to structure our thinking, especially when we consider how subtly one theory shades into the other in practice. The director Howard Hawks famously said that a good movie consisted of three great scenes and no bad scenes, which seems like a vote for the Game of Thrones model. Yet a great scene doesn’t exist in isolation, and the closer we look at stories that work, the more important those nonexistent “bad scenes” start to become.

Leo Tolstoy

I got to thinking about this last week, shortly after I completed the series about my alternative movie canon. Looking back at those posts, I noticed that I singled out three of these movies—The Night of the Hunter, The Limey, and Down with Love—for the sake of one memorable scene. But these scenes also depend in tangible ways on their surrounding material. The river sequence in The Night of the Hunter comes out of nowhere, but it’s also the culmination of a language of dreams that the rest of the movie has established. Terence Stamp’s unseen revenge in The Limey works only because we’ve been prepared for it by a slow buildup that lasts for more than twenty minutes. And Renée Zellweger’s confessional speech in Down with Love is striking largely because of how different it is from the movie around it: the rest of the film is relentlessly active, colorful, and noisy, and her long, unbroken take stands out for how emphatically it presses the pause button. None of the scenes would play as well out of context, and it’s easy to imagine a version of each movie in which they didn’t work at all. We remember them, but only because of the less showy creative decisions that have already been made. And at a time when movies seem more obsessed than ever with “trailer moments” that can be spliced into a highlight reel, it’s important to honor the kind of unobtrusive craft required to make a movie with no bad scenes. (A plot that consists of nothing but high points can be exhausting, and a good story both delivers on the obvious payoffs and maintains our interest in the scenes when nothing much seems to be happening.)

Not surprisingly, writers have spent a lot of time thinking about these issues, and it’s noteworthy that one of the most instructive examples comes from Leo Tolstoy. War and Peace is nothing less than an extended criticism of the great man theory of history: Tolstoy brings Napoleon onto the scene expressly to emphasize how insignificant he actually is, and the novel concludes with a lengthy epilogue in which the author lays out his objections to how history is normally understood. History, he argues, is a pattern that emerges from countless unobservable human actions, like the sum of infinitesimals in calculus, and because we can’t see the components in isolation, we have to content ourselves with figuring out the laws of their behavior in the aggregate. But of course, this also describes Tolstoy’s strategy as a writer: we remember the big set pieces in War and Peace and Anna Karenina, but they emerge from the diligent, seemingly impersonal collation of thousands of tiny details, recorded with what seems like a minimum of authorial interference. (As Victor Shklovsky writes: “[Tolstoy] describes the object as if he were seeing it for the first time, an event as if it were happening for the first time.”) And the awesome moments in his novels gain their power from the fact that they arise, as if by historical inevitability, from the details that came before them. Anna Karenina was still alive at the end of the first draft, and it took her author a long time to reconcile himself to the tragic climax toward which his story was driving him. Tolstoy had good reason to believe that great scenes, like great men, are the product of invisible forces. But it took a great writer to see this.

Quote of the Day


Frank Herbert

For Dune, I also used what I call a “camera position” method—playing back and forth (and in varied orders depending on the required pace) between long shot, medium, closeup, and so on…The implications of color, position, word root, and prosodic suggestion—all are taken into account when a scene has to have maximum impact. And what scene doesn’t if a book is tightly written?

Frank Herbert

Written by nevalalee

August 27, 2015 at 7:20 am

Posted in Quote of the Day, Writing


“Make it recognizable!”


David Mamet

I’ve mentioned before how David Mamet’s little book On Directing Film rocked my world at a time when I thought I’d already figured out storytelling to my own satisfaction. It provides the best set of tools for constructing a plot I’ve ever seen, and to the extent that I can call any book a writer’s secret weapon, this is it. But I don’t think I’ve ever talked about the moment when I realized how powerful Mamet’s advice really is. The first section of the book is largely given over to a transcript of one of the author’s seminars at Columbia, in which the class breaks down the beats of a simple short film: a student approaches a teacher to request a revised grade. The crucial prop in the scene, which is told entirely without dialogue, is the student’s notebook, its contents unknown—and, as Mamet points out repeatedly, unimportant. Then he asks:

Mamet: What answer do we give to the prop person who says “what’s the notebook look like?” What are you going to say?

The students respond with a number of suggestions: put a label on it, make it look like a book report, make it look “prepared.” Mamet shoots them down one by one, saying that when these ideas aren’t intrinsically impossible, they’re things that the audience can’t be expected to notice or care about:

Mamet: No, you can’t make the book look prepared. You can make it look neat. That might be nice, but that’s not the most important thing for your answer to the prop person…To make it prepared, to make it neat, to make it convincing, the audience ain’t going to notice. What are they going to notice?
Student: That it’s the same book they’ve seen already.
Mamet: So what’s your answer to the prop person?
Student: Make it recognizable.
Mamet: Exactly so! Good. You’ve got to be able to recognize it. That is the most important thing about this report. This is how you use the principle of throughline to answer questions about the set and to answer questions about the costumes.

A recognizable notebook

Now, this might seem like a small thing, but to me, this was an unforgettable moment: it was a powerful illustration of how close attention to the spine of the plot—the actions and images you use to convey the protagonist’s sequence of objectives—can result in immediate, practical answers to seemingly minor story problems, as long as you’re willing to rigorously apply the rules. “Make it recognizable,” in particular, is a rule whose true value I’ve only recently begun to understand. In writing a story, regardless of the medium, you only have a finite number of details that you can emphasize, so it doesn’t hurt to focus on ones that will help the reader recognize and remember important elements—a character, a prop, an idea—when they recur over the course of the narrative. Mamet notes that you can’t expect a viewer to read signs or labels designed to explain what isn’t clear in the action, and it took me a long time to see that this is equally true of the building blocks of fiction: if the reader needs to pause to remember who a character is or where a certain object has appeared before, you haven’t done your job as well as you could.

And like the instructions a director gives to the prop department, this rule translates into specific, concrete actions that a writer can take to keep the reader oriented. It’s why I try to give my characters names that can be readily distinguished from one another, to the point where I’ll often try to give each major player a name that begins with a different letter. This isn’t true to life, where, as James Wood points out, we’re likely to know three people named John and three more named Elizabeth, but it’s a useful courtesy to the reader. The same applies to other entities within the story: it can be difficult to keep track of the alliances in a novel like Dune, but Frank Herbert helps us tremendously by giving the families distinctive names like House Atreides and House Harkonnen. (Try to guess which house contains all the bad guys.) This is also why it’s useful to give minor characters some small characteristic to lock them in the reader’s mind: we may not remember that we’ve met Robert in Chapter 3 when he returns in Chapter 50, but we’ll recall his bristling eyebrows. Nearly every choice a writer makes should be geared toward making these moments of recognition as painless as possible, without the need for labels. As Mamet says: “The audience doesn’t want to read a sign; they want to watch a motion picture.” And to be told a story.

Written by nevalalee

June 19, 2013 at 9:02 am

Why hobbits need to be short


Ian McKellen in The Hobbit: An Unexpected Journey

It’s never easy to adapt a beloved novel for the screen. On the one hand, you have a book that has been widely acclaimed as one of the greatest works of speculative fiction of all time, with a devoted fanbase and an enormous invented backstory spread across many novels and appendices. On the other, you have a genius director who moved on from his early, bizarre, low-budget features to a triumphant mainstream success with multiple Oscar nominations, but whose skills as a storyteller have sometimes been less reliable than his unquestioned visual talents. The result, after a protracted development process clouded by rights issues, financial difficulties, and the departure of the previous director, is an overlong movie with too many characters that fails to capture the qualities that drew people to this story in the first place. By trying to appease fans of the book while also drawing in new audiences, it ends up neither here nor there. While it’s cinematically striking, and has its defenders, it leaves critics mostly cold, with few of the awards or accolades that greeted its director’s earlier work. And that’s why David Lynch had so much trouble with Dune.

But it’s what Lynch did next that is especially instructive. After Dune’s financial failure, he found himself working on his fourth movie under far greater constraints, with a tiny budget and a contractual runtime of no more than 120 minutes. The initial cut ran close to three hours, but eventually, with the help of editor Duwayne Dunham, he got it down to the necessary length, although it meant losing a lot of wonderful material along the way. And what we got was Blue Velvet, which isn’t just Lynch’s best film, but my favorite American movie of all time. I recently had the chance to watch all of the deleted scenes as part of the movie’s release on Blu-ray, and it’s clear that if Lynch had been allowed to retain whatever footage he wanted—as he clearly does these days—the result would have been a movie like Inland Empire: fascinating, important, but ultimately a film that I wouldn’t need to see more than once. The moral, surprisingly enough, is that even a director like Lynch, a genuine artist who has earned the right to pursue his visions wherever they happen to take him, can benefit from the need, imposed by a studio, to cut his work far beyond the level where he might have been comfortable.

Kyle MacLachlan in Blue Velvet

Obviously, the case of Peter Jackson is rather different. The Lord of the Rings trilogy was an enormous international success, and did as much as anything to prove that audiences will still sit happily through a movie of more than three hours if the storytelling is compelling enough. As a result, Jackson was able to make The Hobbit: An Unexpected Journey as long as he liked, which is precisely the problem. The Hobbit isn’t a bad movie, exactly; after an interminable first hour, it picks up considerably in the second half, and there are still moments I’m grateful to have experienced on the big screen. Yet I can’t help feeling that if Jackson had felt obliged, either contractually or artistically, to bring it in at under two hours, it would have been vastly improved. This would have required some hard choices, but even at a glance, there are entire sequences here that never should have made it past a rough cut. As it stands, we’re left with a meandering movie that trades largely on our affection for the previous trilogy—its actors, its locations, its music. And if this had been the first installment of a series, it’s hard to imagine it making much of an impression on anyone. Indeed, it might have justified all our worst fears about a cinematic adaptation of Tolkien.

And the really strange thing is that Jackson has no excuse. For one thing, it isn’t the first time he’s done this: I loved King Kong, but I still feel that it would have been rightly seen as a game changer on the level of Avatar if he’d cut it by even twenty minutes. And unlike David Lynch and Blue Velvet, whose deleted scenes remained unseen for decades before being miraculously rediscovered, Jackson knows that even if he has to cut a sequence he loves, he has an audience of millions that will gladly purchase the full extended edition within a year of the movie’s release. But it takes a strong artistic will to accept such constraints if they aren’t being imposed from the outside, and to acknowledge that sometimes an arbitrary limit is exactly what you need to force yourself to make those difficult choices. (My own novels are contractually required to come in somewhere around 100,000 words, and although I’ve had to cut them to the bone to get there, they’ve been tremendously improved by the process, to the point where I intend to impose the same limit on everything I ever write.) The Hobbit has two more installments to go, and I hope Jackson takes the somewhat underwhelming critical and commercial response to the first chapter to heart. Because an unwillingness to edit your work is a hard hobbit to break.

So what happened to John Carter?


In recent years, the fawning New Yorker profile has become the Hollywood equivalent of the Sports Illustrated cover—a harbinger of bad times to come. It isn’t hard to figure out why: both are awarded to subjects who have just reached the top of their game, which often foreshadows a humbling crash. Tony Gilroy was awarded a profile after the success of Michael Clayton, only to follow it up with the underwhelming Duplicity. For Steve Carell, it was Dinner for Schmucks. For Anna Faris, it was What’s Your Number? And for John Lasseter, revealingly, it was Cars 2. The latest casualty is Andrew Stanton, whose profile, which I discussed in detail last year, now seems laden with irony, as well as an optimism that reads in retrospect as whistling in the dark. “Among all the top talent here,” a Pixar executive is quoted as saying, “Andrew is the one who has a genius for story structure.” And whatever redeeming qualities John Carter may have, story structure isn’t one of them. (The fact that Stanton claims to have closely studied the truly awful screenplay for Ryan’s Daughter now feels like an early warning sign.)

If nothing else, the making of John Carter will provide ample material for a great case study, hopefully along the lines of Julie Salamon’s classic The Devil’s Candy. There are really two failures here, one of marketing, another of storytelling, and even the story behind the film’s teaser trailer is fascinating. According to Vulture’s Claude Brodesser-Akner, a series of lost battles and miscommunications led to the release of a few enigmatic images devoid of action and scored, in the manner of an Internet fan video, with Peter Gabriel’s dark cover of “My Body Is a Cage.” And while there’s more to the story than this—I actually found the trailer quite evocative, and negative responses to early marketing materials certainly didn’t hurt Avatar—it’s clear that this was one of the most poorly marketed tentpole movies in a long time. It began with the inexplicable decision to change the title from John Carter of Mars, on the assumption that women are turned off by science fiction, while making no attempt to lure in female viewers with the movie’s love story or central heroine, or even to explain who John Carter is. This is what happens when a four-quadrant marketing campaign goes wrong: when you try to please everybody, you please no one.

And the same holds true of the movie itself. While the story is fairly clear, and Stanton and his writers keep us reasonably grounded in the planet’s complex mythology, we’re never given any reason to care. Attempts to engage us with the central characters fall curiously flat: to convey that Princess Dejah is smart and resourceful, for example, the film shows her inventing the Barsoomian equivalent of nuclear power, evidently in her spare time. John Carter himself is a cipher. And while some of these problems might have been solved by miraculous casting, the blame lands squarely on Stanton’s shoulders. Stanton clearly loves John Carter, but he forgets to persuade us to love him as well. What John Carter needed, more than anything else, was a dose of the rather stark detachment that I saw in Mission: Impossible—Ghost Protocol, as directed by Stanton’s former Pixar colleague Brad Bird. Bird clearly had no personal investment in that franchise, except to make the best movie he possibly could. John Carter, by contrast, is undone by its director’s passion and good intentions, as well as by a creative philosophy that evidently works in animation, but not in live action. As Stanton says of Pixar:

We’re in this weird, hermetically sealed freakazoid place where everybody’s trying their best to do their best—and the films still suck for three out of the four years it takes to make them.

Which only makes us wonder what might have happened if John Carter had been granted a fourth year.

Stanton should take heart, however. If there’s one movie that John Carter calls to mind, it’s Dune, another financial and critical catastrophe that was doomed—as much as I love it—by fidelity to its source material. (In fact, if you take Roger Ebert’s original review of Dune, which came out in 1984, and replace the relevant proper names, you end up with something remarkably close to a review of John Carter: “Actors stand around in ridiculous costumes, mouthing dialogue with little or no context.”) Yet its director not only recovered, but followed it up with my favorite movie ever made in America. Failure, if it results in another chance, can be the opposite of the New Yorker curse. And while Stanton may not be David Lynch, he’s not without talent: the movie’s design is often impressive, especially its alien effects, and it displays occasional flashes of wit and humor that remind us of what Stanton can do. John Carter may go on record as the most expensive learning experience in history, and while this may be cold comfort to Disney shareholders, it’s not bad for the rest of us, as long as Stanton gets his second chance. Hopefully far away from the New Yorker.

Written by nevalalee

March 15, 2012 at 10:31 am

So what is science fiction?


Like most authors, although I don’t always like to admit it, I’m very interested in other people’s reactions to my work. One of the singular things about being a writer these days is that one has access to a huge range of opinions about one’s writing: on review sites, blogs, discussion boards, and all the other venues for talking about fiction that didn’t exist even twenty years ago. As a result, every few days I’ll snoop around the web to see what people are saying. (One of my few disappointments following the publication of “Kawataro” was that it coincided with the demise of the Analog readers’ forum, where I had once been able to count on a spirited discussion—or at least a ruthless nitpicking—of my stories.)

For the most part, readers seem to enjoy my stuff well enough, and it’s always gratifying to find a positive review online. Over time, though, I’ve noticed a particular theme being struck repeatedly even by people who like my work: they don’t think it’s science fiction at all. Now, I’m pretty sure that my novelettes and short stories are science fiction—if they weren’t, they wouldn’t be published in Analog, which doesn’t have much interest in anything else—but I can understand the source of the confusion. Thanks mostly to my X-Files roots, my stories are set in the present day. They all take place on this planet. I don’t do aliens or robots. And while the plots do turn on science, they’re more often structured as contemporary mysteries where the solution depends on scientific information, which I gather is fairly uncommon.

It’s worth asking, then, whether we can come up with a definition of science fiction broad enough to include both my work and, say, Kim Stanley Robinson’s. (Or even L. Ron Hubbard’s.) TV Tropes, usually a good starting point for this sort of thing, despite its sometimes breathless fangirl tone, argues that science fiction hinges on technology:

The one defining(-ish, definitions differ) trait of Science Fiction is that there is technology that doesn’t exist in the time period the story is written in.

Which automatically disqualifies most of my stories, since I don’t have much interest in technology for its own sake, at least not as a narrative device. I’m also not especially interested in world-building, another hallmark of conventional science fiction, if only because so many other writers are better at it than I am.

So if my stories don’t include technology or alien worlds, where does that leave me? Wikipedia comes to the rescue, defining science fiction as dealing with “imagined innovations in science or technology,” including one particular subcategory:

Stories that involve discovery or application of new scientific principles, such as time travel or psionics, or new technology, such as nanotechnology, faster-than-light travel or robots.

Which is basically where I fit in, as long as you stretch the definition to include connections between previously unrelated scientific principles. “Inversus,” my first published novelette, is basically about psionics, but links it to a number of existing phenomena, like situs inversus. “The Last Resort” takes a known phenomenon—limnic eruptions—and transfers it to a novel part of the world, with a speculative explanation of how it might be caused by human activity. “Kawataro” fictionalizes the case of the Al-Sayyid Bedouin, moves it to Japan, and connects it to another medical mystery. And my upcoming “The Boneless One” begins with a real scientific project, the effort to sample genetic diversity in the world’s oceans, and speculates as to how it might lead to unexpected—and murderous—consequences.

Much of my favorite fiction is about such connections, whether it’s the paranoid synthetic vision of Foucault’s Pendulum, Illuminatus!, or Gravity’s Rainbow, or the constructive impulse of the great science fiction novels. (Dune, for instance, gains much of its fascination from the variety of Frank Herbert’s interests—ecology, energy policy, the Bedouin, the story of T.E. Lawrence—and from how he juxtaposes them in astonishing ways.) My love of connections is what led me to focus on my two genres of choice, science fiction and suspense, both of which reward the ability to see connections that haven’t been noticed in print. And the ultimate playground for ideas is science. The science is real; the connections are plausible, but fictional. Put them together, and you get science fiction. Or something like it, anyway.
