Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Umberto Eco’

The private eyes of culture


Yesterday, in my post on the late magician Ricky Jay, I neglected to mention one of the most fascinating aspects of his long career. Toward the end of his classic profile in The New Yorker, Mark Singer drops an offhand reference to an intriguing project:

Most afternoons, Jay spends a couple of hours in his office, on Sunset Boulevard, in a building owned by Andrew Solt, a television producer…He decided now to drop by the office, where he had to attend to some business involving a new venture that he has begun with Michael Weber—a consulting company called Deceptive Practices, Ltd., and offering “Arcane Knowledge on a Need to Know Basis.” They are currently working on the new Mike Nichols film, Wolf, starring Jack Nicholson.

When the article was written, Deceptive Practices was just getting off the ground, but it went on to compile an enviable list of projects, including The Illusionist, The Prestige, and most famously Forrest Gump, for which Jay and Weber designed the wheelchair that hid Gary Sinise’s legs. It isn’t clear how lucrative the business ever was, but it made for great publicity, and best of all, it allowed Jay to monetize the service that he had offered for free to the likes of David Mamet—a source of “arcane knowledge,” much of it presumably gleaned from his vast reading in the field, that wasn’t available in any other way.

As I reflected on this, I was reminded of another provider of arcane knowledge who figures prominently in one of my favorite novels. In Umberto Eco’s Foucault’s Pendulum, the narrator, Casaubon, comes home to Milan after a long sojourn abroad feeling like a man without a country. He recalls:

I decided to invent a job for myself. I knew a lot of things, unconnected things, but I wanted to be able to connect them after a few hours at a library. I once thought it was necessary to have a theory, and that my problem was that I didn’t. But nowadays all you needed was information; everybody was greedy for information, especially if it was out of date. I dropped in at the university, to see if I could fit in somewhere. The lecture halls were quiet; the students glided along the corridors like ghosts, lending one another badly made bibliographies. I knew how to make a good bibliography.

In practice, Casaubon finds that he knows a lot of things—like the identities of such obscure figures as Lord Chandos and Anselm of Canterbury—that can’t be found easily in reference books, prompting a student to marvel at him: “In your day you knew everything.” This leads Casaubon to a sudden inspiration: “I had a trade after all. I would set up a cultural investigation agency, be a kind of private eye of learning. Instead of sticking my nose into all-night dives and cathouses, I would skulk around bookshops, libraries, corridors of university departments…I was lucky enough to find two rooms and a little kitchen in an old building in the suburbs…In a pair of bookcases I arranged the atlases, encyclopedias, catalogs I acquired bit by bit.”

This feels a little like the fond daydream of a scholar like Umberto Eco himself, who spent decades acquiring arcane knowledge—not all of it required by his academic work—before becoming a famous novelist. And I suspect that many graduate students, professors, and miscellaneous bibliophiles cherish the hope that the scraps of disconnected information that they’ve accumulated over time will turn out to be useful one day, in the face of all evidence to the contrary. (Casaubon is evidently named after the character from Middlemarch who labors for years over a book titled The Key to All Mythologies, which is already completely out of date.) To illustrate what he does for a living, Casaubon offers the example of a translator who calls him one day out of the blue, desperate to know the meaning of the word “Mutakallimūn.” Casaubon asks him for two days, and then he gets to work:

I go to the library, flip through some card catalogs, give the man in the reference office a cigarette, and pick up a clue. That evening I invite an instructor in Islamic studies out for a drink. I buy him a couple of beers and he drops his guard, gives me the lowdown for nothing. I call the client back. “All right, the Mutakallimūn were radical Moslem theologians at the time of Avicenna. They said the world was a sort of dust cloud of accidents that formed particular shapes only by an instantaneous and temporary act of the divine will. If God was distracted for even a moment, the universe would fall to pieces, into a meaningless anarchy of atoms. That enough for you? The job took me three days. Pay what you think is fair.”

Eco could have picked nearly anything to serve as a case study, of course, but the story that he chooses serves as a metaphor for one of the central themes of the book. If the world of information is a “meaningless anarchy of atoms,” it takes the private eyes of culture to give it shape and meaning.

All the while, however, Eco is busy undermining the pretensions of his protagonists, who pay a terrible price for treating information so lightly. And it might not seem that such brokers of arcane knowledge are even necessary these days, now that an online search generates pages of results for the Mutakallimūn. Yet there’s still a place for this kind of scholarship, which might end up being the last form of brainwork not to be made obsolete by technology. As Ricky Jay knew, by specializing deeply in one particular field, you might be able to make yourself indispensable, especially in areas where the knowledge hasn’t been written down or digitized. (In the course of researching Astounding, I was repeatedly struck by how much of the story wasn’t available in any readily accessible form. It was buried in letters, manuscripts, and other primary sources, and while this happens to be the one area where I’ve actually done some of the legwork, I have a feeling that it’s equally true of every other topic imaginable.) As both Jay and Casaubon realized, it’s a role that rests on arcane knowledge of the kind that can only be acquired by reading the books that nobody else has bothered to read in a long time, even if it doesn’t pay off right away. Casaubon tells us: “In the beginning, I had to turn a deaf ear to my conscience and write theses for desperate students. It wasn’t hard; I just went and copied some from the previous decade. But then my friends in publishing began sending me manuscripts and foreign books to read—naturally, the least appealing and for little money.” But he perseveres, and the rule that he sets for himself might still be enough, if you’re lucky, to fuel an entire career:

Still, I was accumulating experience and information, and I never threw anything away…I had a strict rule, which I think secret services follow, too: No piece of information is superior to any other. Power lies in having them all on file and then finding the connections.

Written by nevalalee

November 27, 2018 at 8:41 am

The Machine of Lagado


Yesterday, my wife wrote to me in a text message: “Psychohistory could not predict that Elon [Musk] would gin up a fraudulent stock buyback price based on a pot joke and then get punished by the SEC.” This might lead you to wonder about our texting habits, but more to the point, she was right. Psychohistory—the fictional science of forecasting the future developed by Isaac Asimov and John W. Campbell in the Foundation series—is based on the assumption that the world will change in the future more or less as it has in the past. Like all systems of prediction, it’s unable to foresee black swans, like the Mule or Donald Trump, that make nonsense of our previous assumptions, and it’s useless for predicting events on a small scale. Asimov liked to compare it to the kinetic theory of gases, “where the individual molecules in the gas remain as unpredictable as ever, but the average person is completely predictable.” This means that you need a sufficiently large number of people, such as the population of the galaxy, for it to work, and it also means that it grows correspondingly less useful as it becomes more specific. On the individual level, human behavior is as unforeseeable as the motion of particular molecules, and the shape of any particular life is impossible to predict, even if we like to believe otherwise. The same is true of events. Just as a monkey throwing darts might pick stocks as well as a qualified investment advisor, the news these days often seems to have been generated by a bot, like the Subreddit Simulator, that automatically cranks out random combinations of keywords and trending terms. (My favorite recent example is an actual headline from the Washington Post: “Border Patrol agent admits to starting wildfire during gender-reveal party.”)

And the satirical notion that combining ideas at random might lead to useful insights or predictions is a very old one. In Gulliver’s Travels, Jonathan Swift describes an encounter with a fictional machine—located in the academy of Lagado, the capital city of the island of Balnibarbi—by which “the most ignorant person, at a reasonable charge, and with a little bodily labour, might write books in philosophy, poetry, politics, laws, mathematics, and theology, without the least assistance from genius or study.” The narrator continues:

[The professor] then led me to the frame, about the sides, whereof all his pupils stood in ranks. It was twenty feet square, placed in the middle of the room. The superfices was composed of several bits of wood, about the bigness of a die, but some larger than others. They were all linked together by slender wires. These bits of wood were covered, on every square, with paper pasted on them; and on these papers were written all the words of their language, in their several moods, tenses, and declensions; but without any order…The pupils, at his command, took each of them hold of an iron handle, whereof there were forty fixed round the edges of the frame; and giving them a sudden turn, the whole disposition of the words was entirely changed.  He then commanded six-and-thirty of the lads, to read the several lines softly, as they appeared upon the frame; and where they found three or four words together that might make part of a sentence, they dictated to the four remaining boys, who were scribes.

And Gulliver concludes: “Six hours a day the young students were employed in this labour; and the professor showed me several volumes in large folio, already collected, of broken sentences, which he intended to piece together, and out of those rich materials, to give the world a complete body of all arts and sciences.”

Two and a half centuries later, an updated version of this machine figured in Umberto Eco’s novel Foucault’s Pendulum, which is where I first encountered it. The book’s three protagonists, who work as editors for a publishing company in Milan, are playing in the early eighties with their new desktop computer, which they’ve nicknamed Abulafia, after the medieval cabalist. One speaks proudly of Abulafia’s usefulness in generating random combinations: “All that’s needed is the data and the desire. Take, for example, poetry. The program asks you how many lines you want in the poem, and you decide: ten, twenty, a hundred. Then the program randomizes the line numbers. In other words, a new arrangement each time. With ten lines you can make thousands and thousands of random poems.” This gives the narrator an idea:

What if, instead, you fed it a few dozen notions taken from the works of [occult writers]—for example, the Templars fled to Scotland, or the Corpus Hermeticum arrived in Florence in 1460—and threw in a few connective phrases like “It’s obvious that” and “This proves that?” We might end up with something revelatory. Then we fill in the gaps, call the repetitions prophecies, and—voila—a hitherto unpublished chapter of the history of magic, at the very least!

Taking random sentences from unpublished manuscripts, they enter such lines as “Who was married at the feast of Cana?” and “Minnie Mouse is Mickey’s fiancée.” When strung together, the result, in one of Eco’s sly jokes, is a conspiracy theory that exactly duplicates the thesis of Holy Blood, Holy Grail, which later provided much of the inspiration for The Da Vinci Code. “Nobody would take that seriously,” one of the editors says. The narrator replies: “On the contrary, it would sell a few hundred thousand copies.”
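The generator that Eco describes is simple enough to sketch in a few lines of code. What follows is a minimal, hypothetical Python rendering of the idea—randomly ordered statements stitched together with connective phrases—using invented sample sentences in the spirit of the novel, not anything from Eco's actual text:

```python
import random

# Sample statements of the sort the editors feed Abulafia (invented examples).
statements = [
    "the Templars fled to Scotland",
    "the Corpus Hermeticum arrived in Florence in 1460",
    "Minnie Mouse is Mickey's fiancee",
    "someone was married at the feast of Cana",
]

# Connective phrases that turn a random sequence into an "argument."
connectives = ["It's obvious that", "This proves that", "From which it follows that"]

def abulafia(n, seed=None):
    """Return n randomly ordered statements joined by random connectives."""
    rng = random.Random(seed)
    picks = rng.sample(statements, min(n, len(statements)))
    # Capitalize the opening statement; prefix each later one with a connective.
    lines = [picks[0][0].upper() + picks[0][1:] + "."]
    for s in picks[1:]:
        lines.append(f"{rng.choice(connectives)} {s}.")
    return " ".join(lines)

print(abulafia(3, seed=42))
```

The trick, as the narrator observes, lies entirely in the connectives: shuffling statements produces nonsense, but “it’s obvious that” and “this proves that” make the nonsense read like an argument.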

When I first read this as a teenager, I thought it was one of the great things in the world, and part of me still does. I immediately began to look for similar connections between random ideas, which led me to some of my best story ideas, and I still incorporate aspects of randomness into just about everything that I do. Yet there’s also a pathological element to this form of play that I haven’t always acknowledged. What makes it dangerous, as Eco understood, is the inclusion of such seemingly innocent expressions as “it’s obvious that” and “this proves that,” which instantly transforms a scenario into an argument. (On the back cover of the paperback edition of Foucault’s Pendulum, the promotional copy describes Abulafia as “an incredible computer capable of inventing connections between all their entries,” which is both a great example of hyping a difficult book and a reflection of how credulous we can be when it comes to such practices in real life.) We may not be able to rule out any particular combination of events, but not every explanatory system is equally valid, even if all it takes is a modicum of ingenuity to turn it into something convincing. I used to see the creation of conspiracy theories as a diverting game, or as a commentary on how we interpret the world around us, and I devoted an entire novel to exorcising my fascination with this idea. More recently, I’ve realized that this attitude was founded on the assumption that it was still possible to come to some kind of cultural consensus about the truth. In the era of InfoWars, Pizzagate, and QAnon, it no longer seems harmless. Not all patterns are real, and many of the horrors of the last century were perpetrated by conspiracy theorists who arbitrarily seized on one arrangement of the facts—and then acted on it accordingly. Reality itself can seem randomly generated, but our thoughts and actions don’t need to be.

Written by nevalalee

October 2, 2018 at 9:36 am

The surprising skepticism of The X-Files


Gillian Anderson in "Jose Chung's From Outer Space"

Note: To celebrate the twenty-fifth anniversary of the premiere of The X-Files, I’m republishing a post that originally appeared, in a somewhat different form, on September 9, 2013.

Believe it or not, this week marks the twenty-fifth anniversary of The X-Files, which aired its first episode on September 10, 1993. As much as I’d like to claim otherwise, I didn’t watch the pilot that night, and I’m not even sure that I caught the second episode, “Deep Throat.” “Squeeze,” which aired the following week, is the first installment that I clearly remember seeing on its original broadcast, and I continued to tune in afterward, although only sporadically. In its early days, I had issues with the show’s lack of continuity: it bugged me to no end that after every weekly encounter with the paranormal—any one of which should have been enough to upend Scully’s understanding of the world forever—the two leads were right back where they were at the start of the next episode, and few, if any, of their cases were ever mentioned again. Looking back now, of course, it’s easy to see that this episodic structure was what allowed the show to survive, and that it was irrevocably damaged once it began to take its backstory more seriously. In the meantime, I learned to accept the show’s narrative logic on its own terms. And I’m very grateful that I did.

It’s no exaggeration to say that The X-Files has had a greater influence on my own writing than any work of narrative art in any medium. That doesn’t mean it’s my favorite work of art, or even my favorite television show—only that Chris Carter’s supernatural procedural came along at the precise moment in my young adulthood that I was most vulnerable to being profoundly influenced by a great genre series. I was thirteen when the show premiered, toward the end of the most pivotal year of my creative life. Take those twelve months away, or replace them with a different network of cultural influences, and I’d be a different person altogether. It was the year I discovered Umberto Eco, Stephen King, and Douglas R. Hofstadter; Oliver Stone’s JFK set me on a short but fruitful detour into the literature of conspiracy; I bought a copy of Very by the Pet Shop Boys, about which I’ll have a lot more to say soon; I acquired copies of Isaac Asimov’s Science Fiction Magazine and 100 Great Science Fiction Short Short Stories; and I took my first deep dive into the work of David Lynch and, later, Jorge Luis Borges. Some of these works have lasted, while others haven’t, but they all shaped who I became, and The X-Files stood at the heart of it all, with imagery drawn in equal part from Twin Peaks and Dealey Plaza and a playful, agnostic spirit that mirrored that of the authors I was reading at the same time.

Gillian Anderson and David Duchovny in The X-Files pilot

And this underlying skepticism—which may seem like a strange word to apply to The X-Files—was a big part of its appeal. What I found enormously attractive about the show was that although it took place in a world of aliens, ghosts, and vampires, it didn’t try to force these individual elements into one overarching pattern. Even in its later seasons, when it attempted, with mixed results, to weave its abduction and conspiracy threads into a larger picture, certain aspects remained incongruously unexplained. The same world shaped by the plans of the Consortium or Syndicate also included lake monsters, clairvoyants, and liver-eating mutants, all of whom would presumably continue to go about their business after the alien invasion occurred. It never tried to convert us to anything, because it didn’t have any answers. And what I love about it now, in retrospect, is the fact that this curiously indifferent attitude toward its own mysteries arose from the structural constraints of network television itself. Every episode had to stand on its own. There was no such thing as binge-watching. The show had to keep moving or die.

Which goes a long way toward explaining why even fundamentally skeptical viewers, like me, could become devoted fans, or why Mulder and Scully could appear on the cover of the Skeptical Inquirer. It’s true that Scully was never right, but it’s remarkable how often it seemed that she could be, which is due as much to the show’s episodic construction as to Gillian Anderson’s wonderful performance. (As I’ve mentioned before, Scully might be my favorite character on any television show.) Every episode changed the terms of the game, complete with a new supporting cast, setting, and premise—and after the advent of Darin Morgan, even the tone could be wildly variable. As a result, it was impossible for viewers to know where they stood, which made a defensive skepticism seem like the healthiest possible attitude. Over time, the mythology grew increasingly unwieldy, and the show’s lack of consistency became deeply frustrating, as reflected in its maddening, only occasionally transcendent reboot. The X-Files eventually lost its way, but not until after a haphazard, often dazzling initial season that established, in spite of what its creators might do in the future, that anything was possible, and no one explanation would ever be enough. And it’s a lesson that I never forgot.

Written by nevalalee

September 14, 2018 at 9:00 am

The stuff of thought


On December 4, 1972, the ocean liner SS Statendam sailed from New York to Florida, where its passengers would witness the launch of Apollo 17, the final manned mission to the moon. The guests on the cruise included Isaac Asimov, Robert A. Heinlein, Frederik Pohl, Theodore Sturgeon, Norman Mailer, Katherine Anne Porter, and the newscaster Hugh Downs. It’s quite a story, and I’ve written about it elsewhere at length. What I’d like to highlight today, though, is what was happening a few miles away on shore, as Tom Wolfe recounts in the introduction to the paperback edition of The Right Stuff:

This book grew out of some ordinary curiosity. What is it, I wondered, that makes a man willing to sit up on top of an enormous Roman candle, such as a Redstone, Atlas, Titan, or Saturn rocket, and wait for someone to light the fuse? I decided on the simplest approach possible. I would ask a few astronauts and find out. So I asked a few in December of 1972 when they gathered at Cape Canaveral to watch the last mission to the moon, Apollo 17. I discovered quickly enough that none of them, no matter how talkative otherwise, was about to answer the question or even linger for more than a few seconds on the subject at the heart of it, which is to say, courage.

Wolfe’s “ordinary curiosity” led him to tackle a project that would consume him for the better part of a decade, driven by his discovery of “a rich and fabulous terrain that, in a literary sense, had remained as dark as the far side of the moon for more than half a century: military flying and the modern American officer corps.”

And my mind sometimes turns to the contrast between Wolfe, trying to get the astronauts to open up about their experiences, and the writers aboard the Statendam. You had Mailer, of course, who had written his own book on the moon, and the result was often extraordinary. It was more about Mailer himself than anything else, though, and during the cruise, he seemed more interested in laying out his theory of the thanatosphere, an invisible region around the moon populated by the spirits of the dead. Then you had such science fiction writers as Heinlein and Asimov, who would occasionally cross paths with real astronauts, but whose fiction was shaped by assumptions about the competent man that had been formed decades earlier. Wolfe decided to go to the source, but even he kept the pulps at the back of his mind. In his introduction, speaking of the trend in military fiction after World War I, he observes:

The only proper protagonist for a tale of war was an enlisted man, and he was to be presented not as a hero but as Everyman, as much a victim of war as any civilian. Any officer above the rank of second lieutenant was to be presented as a martinet or a fool, if not an outright villain, no matter whom he fought for. The old-fashioned tale of prowess and heroism was relegated to second- and third-rate forms of literature, ghostwritten autobiographies, and stories in pulp magazines on the order of Argosy and Bluebook.

Wolfe adds: “Even as late as the 1930s the favorite war stories in the pulps concerned World War I pilots.” And it was to pursue “the drama and psychology” of this mysterious courage in the real world that he wrote The Right Stuff.

The result is a lasting work of literary journalism, as well as one of the most entertaining books ever written, and we owe it to the combination of Wolfe’s instinctive nose for a story and his obsessiveness in following it diligently for years. Last year, in a review of John McPhee’s new collection of essays, Malcolm Harris said dryly: “I would recommend Draft No. 4 to writers and anyone interested in writing, but no one should use it as a professional guide uncritically or they’re liable to starve.” You could say much the same about Wolfe, who looks a lot like the kind of journalist we aren’t likely to see again, in part because the market has changed, but also because this kind of luck can be hard for anyone to sustain over the course of a career. Wolfe hit the jackpot on multiple occasions, but he also spent years on books that nobody read—Back to Blood, his last novel, cost its publisher a hundred dollars for every copy that it sold. (Toward the end, he could even seem out of his depth. It probably isn’t a coincidence that I never read I Am Charlotte Simmons, a novel about “Harvard, Yale, Princeton, Stanford, Duke, and a few other places all rolled into one” that was published a few years after I graduated from college. Wolfe’s insights into undergraduate life, delivered with his customary breathlessness, didn’t seem useful for understanding an experience that I had just undergone, and I’ve never forgotten the critic who suggested that the novel should have been titled I Am Easily Impressed.)

But that’s also the kind of risk required to produce major work. Wolfe’s movement from nonfiction to novels still feels like a loss, and I think that it deprived us of two or three big books of the kind that he could write better than anyone else. (It’s too bad that he never wrote anything about science fiction, which is a subject that could only be grasped by the kind of writer who could produce both The Right Stuff and The Electric Kool-Aid Acid Test.) Yet it isn’t always the monumental achievements that matter. In fact, when I think of what Wolfe has meant to me, it’s his offhand critical comments that have stuck in my head. The short introduction that he wrote to a collection of James M. Cain’s novels, in which he justifiably praised Cain’s “momentum,” has probably had a greater influence on my own style—or at least my aspirations for it—than any other single piece of criticism. His description of Umberto Eco as “a very good example of a writer who leads dozens of young writers into a literary cul-de-sac” is one that I’ll always remember, mostly because he might have been speaking of me. In college, I saw him give a reading once, shortly before the release of the collection Hooking Up. I was struck by his famous white suit, of course, but what I’ll never forget is the moment, just before he began to read, when he reached into his inside pocket and produced a pair of reading glasses—also spotlessly white. It was a perfect punchline, with the touch of the practiced showman, and it endeared Wolfe to me at times when I grew tired of his style and opinions. His voice and his ambition inspired many imitators, but at his best, it was the small stuff that set him apart.

The axioms of behavior


Earlier this week, Keith Raniere, the founder of an organization known as Nxivm, was arrested in Mexico, to which he had fled last year in the wake of a devastating investigation published in the New York Times. The article described a shady operation that combined aspects of a business seminar, a pyramid scheme, and a sex cult, with public workshops shading into a “secret sisterhood” that required its members to provide nude photographs or other compromising materials and be branded with Raniere’s initials. (In an email obtained by the Times, Raniere reassured one of his followers: “[It was] not originally intended as my initials but they rearranged it slightly for tribute.”) According to the report, about sixteen thousand people have taken the group’s courses, which are marketed as leading to “greater self-fulfillment by eliminating psychological and emotional barriers,” and some went even further. As the journalist Barry Meier wrote:

Most participants take some workshops, like the group’s “Executive Success Programs,” and resume their lives. But other people have become drawn more deeply into Nxivm, giving up careers, friends and families to become followers of its leader, Keith Raniere, who is known within the group as “Vanguard”…Former members have depicted [Raniere] as a man who manipulated his adherents, had sex with them and urged women to follow near-starvation diets to achieve the type of physique he found appealing.

And it gets even stranger. In 2003, Raniere sued the Cult Education Institute for posting passages from his training materials online. In his deposition for the suit, which was dismissed just last year, Raniere stated:

I discovered I had an exceptional aptitude for mathematics and computers when I was twelve. It was at the age of twelve I read The Second Foundation [sic] by Isaac Asimov and was inspired by the concepts on optimal human communication to start to develop the theory and practice of Rational Inquiry. This practice involves analyzing and optimizing how the mind handles data. It involves mathematical set theory applied in a computer programmatic fashion to processes such as memory and emotion. It also involves a projective methodology that can be used for optimal communication and decision making.

Raniere didn’t mention any specific quotations from Asimov, but they were presumably along the lines of the following, which actually appears in Foundation and Empire, spoken by none other than the Mule:

Intuition or insight or hunch-tendency, whatever you wish to call it, can be treated as an emotion. At least, I can treat it so…The human mind works at low efficiency. Twenty percent is the figure usually given. When, momentarily, there is a flash of greater power it is termed a hunch, or insight, or intuition. I found early that I could induce a continual use of high brain-efficiency. It is a killing process for the person affected, but it is useful.

At this point, one might be tempted to draw parallels to other cults, such as Aum Shinrikyo, that are also said to have taken inspiration from Asimov’s work. In this case, however, the connection to the Foundation series seems tangential at best. A lot of us read science fiction at the golden age of twelve, and while we might be intrigued by psychohistory or mental engineering, few of us take it in the direction that Raniere evidently did. (As one character observes in Umberto Eco’s Foucault’s Pendulum: “People don’t get the idea of going back to burn Troy just because they read Homer.”) In fact, Raniere comes off a lot more like L. Ron Hubbard, at least in the version of himself that he presents in public. In the deposition, he provided an exaggerated account of his accomplishments that will strike those who know Hubbard as familiar:

In 1988, I was accepted into the Mega Society. The requirements to be accepted into the Mega Society were to have a demonstrated IQ of 176…In 1989, I was accepted into the Guinness Book of World Records under the category “Highest IQ.” I also left my position as a Computer Programmer/Analyst and resumed business consulting with the intention to raise money to start the “Life Learning Institute.” At this point in time I became fascinated with how human motivation affected behavior. I started to refine my projective mathematical theory of the human mind to include a motivational behavior equation.

And when Raniere speaks of developing “a set of consistent axioms of how human behavior interfaced with the world,” it’s just a variation on an idea that has been recycled within the genre for decades.

Yet it’s also worth asking why the notion of a “mathematical psychology” appeals to these manipulative personalities, and why many of them have repackaged these ideas so successfully for their followers. You could argue that Raniere—or even Charles Manson—represents the psychotic fringe of an impulse toward transformation that has long been central to science fiction, culminating in the figure of the superman. (It’s probably just a coincidence, but I can’t help noting that two individuals who have been prominently linked with the group, the actresses Kristin Kreuk and Allison Mack, both appeared on Smallville.) And many cults hold out a promise of change for which the genre provides a convenient vocabulary. As Raniere said in his deposition:

In mathematics, all things are proven based on axioms and a step by step systematic construction. Computers work the same way. To program a computer one must first understand the axioms of the computer language, and then the step by step systematic construction of the problem-solution methodology. Finally, one must construct the problem-solution methodology in a step by step fashion using the axioms of the language. I discovered the human mind works the same way and I formalized the process.

This sounds a lot like Hubbard, particularly in the early days of dianetics, in which the influence of cybernetics was particularly strong. But it also represents a limited understanding of what the human mind can be, and it isn’t surprising that it attracts people who see others as objects to be altered, programmed, and controlled. The question of whether such figures as Hubbard or Raniere really buy into their own teachings resists any definitive answer, but one point seems clear enough. Even if they don’t believe it, they obviously wish that it were true.

The stories of our lives

with one comment

Last week, I mentioned the evocative challenge that the writer and literary agent John Brockman recently posed to a group of scientists and intellectuals: “Ask the question for which you will be remembered.” I jokingly said that my own question would probably resemble the one submitted by the scholar Jonathan Gottschall: “Are stories bad for us?” As often happens with such snap decisions, however, this one turned out to be more revealing than I had anticipated. When I look back at my work as a writer, it’s hard to single out any overarching theme, but I do seem to come back repeatedly to the problem of reading the world as a text. My first novel, The Icon Thief, was openly inspired by Foucault’s Pendulum by Umberto Eco, which inverts the conventions of the conspiracy thriller to explore how we tell ourselves stories about history and reality. I didn’t go quite as far as Eco did, but it was a subject that I enjoyed, and it persisted to a lesser extent in my next two books. My science fiction stories tend to follow a formula that I’ve described as The X-Files in reverse, in which a paranormal interpretation of a strange event is supplanted by another that fits the same facts into a more rational pattern. And I’m currently finishing up a book that is secretly about how the stories that we read influence our behavior in the real world. As Isaac Asimov pointed out in his essay “The Sword of Achilles,” most readers are drawn to science fiction at a young age, and its values and assumptions subtly affect how they think and feel. If there’s a single thread that runs through just about everything I’ve written, then, it’s the question of how our tendency to see the world as a story—or a text—can come back to haunt us in unexpected ways.

As it happens, we’re all living right now through a vast social experiment that might have been designed to test this very principle. I got to thinking about this soon after reading an excellent essay, “The Weight of the Words,” by the political scientist Jacob T. Levy. He begins with a discussion of Trump’s “shithole countries” remark, which led a surprising number of commentators—on both the right and the left—to argue that the president’s words were less important than his actions. Levy summarizes this view: “Ignore the tweets. Ignore Trump’s inflammatory language. Ignore the words. What counts is the policy outcomes.” He continues:

I have a hard time believing that anyone really thinks like this as a general proposition…The longstanding view among conservatives was that Churchill’s “Iron Curtain” speech and Reagan’s call to “tear down this wall” were important events, words that helped to mobilize western resistance to Communism and to provide moral clarity about the stakes of that resistance.

On a more basic level, since it’s impossible for the government to accomplish everything by force, much of politics lies in emotional coercion, which suggests that words have power in themselves. Levy refers to Hannah Arendt’s argument in The Human Condition, in which a familiar figure appears:

The stature of the Homeric Achilles can be understood only if one sees him as “the doer of great deeds and the speaker of great words”…Thought was secondary to speech, but speech and action were considered to be coeval and coequal, of the same rank and the same kind; and this originally meant not only that most political action, in so far as it remains outside the sphere of violence, is indeed transacted in words, but more fundamentally that finding the right words at the right moment, quite apart from the information or communication they may convey, is action.

Levy then lists many of the obvious ways in which Trump’s words have had tangible effects—the erosion of America’s stature abroad, the undermining of trust in law enforcement and the civil service, the growth of tribalism and xenophobia, and the redefinition of what it means to be a Republican. (As Levy notes of Trump’s relationship to his supporters: “He doesn’t speak for them; how many of them had a view about ‘the deep state’ two years ago? He speaks to them, and it matters.”) Trump routinely undercuts the very notion of truth, in what seems like the ultimate example of the power of speech over the world of fact. And Levy’s conclusion deserves to be read whenever we need to be reminded of how this presidency differs from all the others that have come before:

The alleged realism of those who want to ignore words will often point to some past president whose lofty rhetoric obscured ugly policies. Whether those presidents are named “Reagan and George W. Bush” or “JFK and Barack Obama” varies in the obvious way, but the deflationary accounts are similar; there are blunders, crimes, abuses, and atrocities enough to find in the record of every American president. But all those presidents put forward a public rhetorical face that was better than their worst acts. This inevitably drives political opponents crazy: they despise the hypocrisy and the halo that good speeches put on undeserving heads. I’ve had that reaction to, well, every previous president in my living memory, at one time or another. But there’s something important and valuable in the fact that they felt the need to talk about loftier ideals than they actually governed by. They kept the public aspirations of American political culture pointed toward Reagan’s “shining city on a hill.”

He concludes of all of our previous presidents: “In words, even if not in deeds, they championed a free and fair liberal democratic order, the protection of civil liberties, openness toward the world, rejection of racism at home, and defiance against tyranny abroad. And their words were part of the process of persuading each generation of Americans that those were constitutively American ideals.” America, in short, is a story that Americans tell one another—and the world—about themselves, and when we change the assumptions behind this narrative, it has profound implications in practice. We treat others according to the roles that we’ve imagined for ourselves, or, more insidiously, that our society has imagined for us. Those roles are often restrictive, but they can also be liberating, both for good and for bad. (Levy perceptively notes that the only federal employees who don’t feel devalued these days are immigration and border agents.) And Levy sounds a warning that we would all do well to remember:

“Ignore the tweets, ignore the language, ignore the words” is advice that affects a kind of sophistication: don’t get distracted by the circus, keep your eye on what’s going on behind the curtain. This is faux pragmatism, ignoring what is being communicated to other countries, to actors within the state, and to tens of millions of fellow citizens. It ignores how all those actors will respond to the speech, and how norms, institutions, and the environment for policy and coercion will be changed by those responses. Policy is a lagging indicator; ideas and the speech that expresses them pave the way.

“Trump has spent a year on the campaign trail and a year in office telling us where he intends to take us,” Levy concludes. And we’re all part of this story now. But we should be even more worried if the words ever stop. As Arendt wrote more than half a century ago: “Only sheer violence is mute.”

My ten great books #10: Foucault’s Pendulum

with 3 comments

Foucault's Pendulum

When a novel has been a part of your life for over twenty years, your feelings for it tend to trace the same ups and downs as those of any other friendship. An initial burst of passionate enthusiasm is followed by a long period of comfortable familiarity; you gradually start to take it for granted; and you even find your emotions beginning to cool. Faced with the same unchanging text for so long, you begin to see its flaws as well as its virtues, and if its shortcomings seem similar to your own, you can even start to resent it a little, or to question what you ever saw in it. Few books have inspired as great a range of responses in me as Foucault’s Pendulum, which in many ways is the novel that had the greatest influence on the kind of fiction I’ve attempted for most of my career. I read it at what feels in retrospect like an absurdly young age: I was thirteen, more comfortable around books than around people, and I was drawn to Umberto Eco as an exemplar of the temperament that I hoped would come from a life spent in the company of ideas. “It is a tale of books, not of everyday worries,” Eco says in the prologue to The Name of the Rose, and every line he writes is suffused with a love of history, language, art, and philosophy. Foucault’s Pendulum takes the same tendency to an even higher level: it’s a novel that often seems to be about nothing but books, with characters who exist primarily as a vehicle for long, witty conversations, crammed with esoteric lore, and a bare sliver of a thriller plot to hold it all together. For a young man who wanted to know something about everything, it was enormously attractive, and it set me off on an intellectual foxhunt that has lasted for over two decades.

Much later, as I began to write fiction of my own, I began to see how dangerous an influence this was, and I found myself agreeing with Tom Wolfe, who famously called Eco “a very good example of a writer who leads dozens of young writers into a literary cul-de-sac.” After I’d gotten my early Eco pastiches out of my system, I put the book away for a long time—although not before having read it to tatters—and I started to wonder how my writing life would have been different if I’d been sucked in by the example of, say, John Fowles or John Updike. It’s only within the last few years, after I finally wrote and published my own homage to this book’s peculiar magic, that I’ve finally felt free to enjoy and appreciate it on its own terms, as an odd, inimitable byway in the history of literature that just happened to play a central role in my own life. (If I’d encountered it a few years later, I wonder if I’d even be able to finish it—I’ve never been able to get through any of Eco’s later novels.) In its final measure, Foucault’s Pendulum is one of the best of all literary entertainments, a spirited tour of some of the oddest corners of the Western mind. It’s the most genial and welcoming of encyclopedic novels, as ironic as Gravity’s Rainbow when it comes to the limits of interpretation, but too charmed by texts and libraries for its lessons to hold any sting. In the course of his research, Eco reportedly read something like a thousand works of occult literature, winnowing out and saving the best parts, and the result is a book that vibrates with the joys of the musty and obscure. And it ultimately changed me for the better. I no longer want to be Umberto Eco. But I’m very glad that Eco did.

Written by nevalalee

May 19, 2017 at 9:00 am

Rogue One and the logic of the story reel

leave a comment »

Gareth Edwards and Felicity Jones on the set of Rogue One

Last week, I came across a conversation on Yahoo Movies UK with John Gilroy and Colin Goudie, two of the editors who worked on Rogue One. I’ve never read an interview with a movie editor that wasn’t loaded with insights into storytelling, and this one is no exception. Here’s my favorite tidbit, in which Goudie describes cutting together a story reel early in the production process:

There was no screenplay, there was just a story breakdown at that point, scene by scene. [Director Gareth Edwards] got me to rip hundreds of movies and basically make Rogue One using other films so that they could work out how much dialogue they actually needed in the film.

It’s very simple to have a line [in the script] that reads “Krennic’s shuttle descends to the planet.” Now that takes maybe two to three seconds in other films, but if you look at any other Star Wars film you realize that takes forty-five seconds or a minute of screen time. So by making the whole film that way—I used a lot of the Star Wars films—but also hundreds of other films, too, it gave us a good idea of the timing.

This is a striking observation in itself. If Rogue One does an excellent job of recreating the feel of its source material, and I think it does, it’s because it honors its rhythms—which differ in subtle respects from those of other films—to an extent that the recent Star Trek movies mostly don’t. Goudie continues:

For example, the sequence of them breaking into the vault, I was ripping the big door closing in WarGames to work out how long does a vault door take to close.

So that’s what I did, and that was three months work to do that, and that had captions at the bottom which explained the action that was going to be taking place, and two thirds of the screen was filled with the concept art that had already been done and one quarter, the bottom corner, was the little movie clip to give you how long that scene would actually take.

Then I used dialogue from other movies to give you a sense of how long it would take in other films for someone to be interrogated. So for instance, when Jyn gets interrogated at the beginning of the film by the Rebel council, I used the scene where Ripley gets interrogated in Aliens.

Rogue One

This might seem like little more than interesting trivia, but there’s actually a lot to unpack. You could argue that the ability to construct an entire Star Wars movie out of analogous scenes from other films only points to how derivative the series has always been: it’s hard to imagine doing this for, say, Manchester by the Sea, or even Inception. But that’s also a big part of the franchise’s appeal. Umberto Eco famously said that Casablanca was made up of the memories of other movies, and he suggested that a cult movie—which we can revisit in our imagination from different angles, rather than recalling it as a seamless whole—is necessarily “unhinged”:

Only an unhinged movie survives as a disconnected series of images, of peaks, of visual icebergs. It should display not one central idea but many. It should not reveal a coherent philosophy of composition. It must live on, and because of, its glorious ricketiness.

After reminding us of the uncertain circumstances under which Casablanca was written and filmed, Eco then suggests: “When you don’t know how to deal with a story, you put stereotyped situations in it because you know that they, at least, have already worked elsewhere…My guess is that…[director Michael Curtiz] was simply quoting, unconsciously, similar situations in other movies and trying to provide a reasonably complete repetition of them.”

What interests me the most is Eco’s conclusion: “What Casablanca does unconsciously, other movies will do with extreme intertextual awareness, assuming also that the addressee is equally aware of their purposes.” He cites Raiders of the Lost Ark and E.T. as two examples, and he easily could have named Star Wars as well, which is explicitly made up of such references. (In fact, George Lucas was putting together story reels before there was even a word for it: “Every time there was a war movie on television, like The Bridges at Toko-Ri, I would watch it—and if there was a dogfight sequence, I would videotape it. Then we would transfer that to 16mm film, and I’d just edit it according to my story of Star Wars. It was really my way of getting a sense of the movement of the spaceships.”) What Eco doesn’t mention—perhaps because he was writing a generation ago—is how such films can pass through intertextuality and end up on the other side. They create memories for viewers who aren’t familiar with the originals, and they end up being quoted in turn by filmmakers who only know Star Wars. They become texts in themselves. In assembling a story reel from hundreds of other movies, Edwards and Goudie were only doing in a literal fashion what most storytellers do in their heads. They figure out how a story should “look” at its highest level, in a rough sketch of the whole, and fill in the details later. The difference here is that Rogue One had the budget and resources to pay someone to do it for real, in a form that could be timed down to the second and reviewed by others, on the assumption that it would save money and effort down the line. Did it work? I’ll be talking about this more tomorrow.

Written by nevalalee

January 12, 2017 at 9:13 am

How not to read the news

with 4 comments

Umberto Eco in his library

In “How Not to Use a Cellular Phone,” an essay first published in the early nineties, the late author Umberto Eco described what seemed, at the time, like the most obnoxious kind of cell phone user imaginable. It was the person who was anxious to show us how much in demand he was “for complex business discussions,” and who conducted these conversations at great length in public spaces like airports or restaurants, thinking that the impression he made was “very Rockefellerian.” Eco observed:

What these people don’t realize is that Rockefeller doesn’t need a portable telephone; he has a spacious room full of secretaries so efficient that at the very worst, if his grandfather is dying, the chauffeur comes and whispers something in his ear. The man with power is the man who is not required to answer every call; on the contrary, he is always—as the saying goes—in a meeting…So anyone who flaunts a portable phone as a symbol of power is, on the contrary, announcing to all and sundry his desperate, subaltern position, in which he is obliged to snap to attention, even when making love, if the CEO happens to telephone…The fact that he uses, ostentatiously, his cellular phone is proof that he doesn’t know these things.

At first glance, Eco’s point might seem dated. Few people these days regard the mere act of using a cell phone as a status symbol, and if anything, the sight of someone actually talking on one has begun to feel slightly quaint. In fact, of course, the essay isn’t dated at all. The only difference is that we’ve all been transformed into the sorry figure whom Eco describes. Like him, we’re expected to be available at all times for emails, texts, tweets, and even the occasional phone call, and we don’t have the consolation of thinking that it makes us special. Instead, we’re all uniformly vulnerable to constant interruption, not only by friends and colleagues, but by strangers, spammers, and nonhuman sources of distraction. I’m thinking, in particular, of the news. The gap between an event in the world and its dissemination, analysis, and dismissal online has been reduced to invisibility, and it’s only going to get worse. During the election, there were times when I felt like a slave to information, which is just one step away from noise, and I took steps to insulate myself from it. At the time, I thought it was a temporary measure, but now it looks more like a way of life. Which, in a way, may be the only truly positive outcome of this past year. It forced me to do what I never would have been able to accomplish voluntarily: to take a step back and think more critically about my relationship to the unending deluge of data in which we live.

Robert A. Heinlein

You could make the case that we have a moral obligation to be informed of all events as soon as they occur, or that unplugging is a form of denial in itself, but those who lived through even more stressful times knew better. In a letter dated December 21, 1941, two weeks after the attack on Pearl Harbor, Robert A. Heinlein described his own “mental ostrichism” to John W. Campbell:

A long time ago I learned that it was necessary to my own mental health to insulate myself emotionally from everything I could not help and to restrict my worrying to things I could help. But wars have a tremendous emotional impact and I have a one-track mind. In 1939 and 1940 I deliberately took the war news about a month later, via Time magazine, in order to dilute the emotional impact. Otherwise I would not have been able to concentrate on fiction writing at all. Emotional detachment is rather hard for me to achieve, so I cultivate it by various dodges whenever the situation is one over which I have no control.

It’s a statement that seems all the more remarkable to me the more I think about it. Whatever his other flaws, Heinlein wasn’t a mental weakling, or a man inclined to avoid confronting reality, and the fact that he felt the need—as a form of preventative mental hygiene—to delay the news by a month is tremendously comforting. And it reassures me that I’m justified in thinking hard about the way in which I relate to the information at my disposal.

To put it bluntly, there’s nothing wrong with reading the paper every morning, absorbing what seems to have mattered over the last twenty-four hours, and then turning off the spigot for the rest of the day. It’s how people got their news for most of the twentieth century, which certainly wasn’t lacking in meaningful events. (Increased coverage doesn’t always lead to greater understanding, and you could even make the case that the sheer volume of it—which has diffused the impact of what is truly important and paved the way for the rise of fake news—has inhibited our ability to respond.) It may even turn out to be more useful to postpone these confrontations to a modest degree. When Napoleon was the Emperor of France, he developed a strategy for dealing with the massive amount of correspondence that he received: he would wait a week before opening any new letters, and by the time he got around to looking at a particular problem or request, he would usually find that it had been resolved, or that the passage of time had put it into perspective. The news works in much the same way. There are very few items that can’t be better understood after a day or two has passed, and for those rare events that are so urgent that they can’t be ignored, there will always be a chauffeur, as Eco puts it, to whisper it in our ears. As Heinlein understood, when you can’t help something in the short term, you have to manage your relationship to it in ways that maximize your potential impact over the long run. It’s measured in years rather than seconds. And it starts right now.

Written by nevalalee

January 2, 2017 at 9:03 am

The Importance of Writing “Ernesto,” Part 1

leave a comment »

My short story “Ernesto,” which originally appeared in the March 2012 issue of Analog Science Fiction and Fact, has just been reprinted by Lightspeed. To celebrate its reappearance, I’ll be publishing revised versions of a few posts in which I described the origins of this story, which you can read for free here, along with a nice interview. Please note that this post reveals details about the ending. 

Readers of the story “Ernesto” might reasonably assume that I have a strong interest in the career of Ernest Hemingway. The central character, after all, is a thinly veiled version of the young Hemingway, with a dash of Sherlock Holmes, investigating what initially appears to be a paranormal mystery in the Madrid of the Spanish Civil War. At first glance, it might even seem like a work of Hemingway fanfic, like Bradbury’s “The Kilimanjaro Device,” or Joe Haldeman’s far darker and more sophisticated “The Hemingway Hoax.” (Science fiction writers have always been drawn to Hemingway, who certainly had a lot to say about the figure of the competent man.) In fact, although I live in Hemingway’s hometown of Oak Park, and my daughter has learned to recognize his face on the omnipresent signs that have been posted near the library, he’s a writer I’ve always found hard to like, if only because his style and preoccupations are so radically removed from mine. And the chain of events that led me to write about him is my favorite example from my own career of what I’ve elsewhere called the anthropic principle of fiction, or how a story is never really about what it seems.

“Ernesto” emerged, like many of my stories, from an idea sparked by a magazine article. In this case, it was a piece in Discover by the science writer Jeanne Lenzer about the work of Dr. William Coley, the nineteenth-century surgeon who experimented with bacterial infections, especially erysipelas, as a treatment for cancer. Around the same time, another article in the same magazine had started me thinking about a story about the investigation of miracles by the Catholic Church. And while that particular notion didn’t go anywhere, I ended up settling on a related premise: a mystery about a series of apparently miraculous cures that are actually due to the sort of cancer immunotherapy that Coley had investigated. The crucial step, it seemed, was to find an appropriate figure of veneration, ideally a Catholic saint, around whom I could build the story. And it took only a few minutes of searching online to come up with a viable candidate: St. John of the Cross, the Spanish mystic of the sixteenth century, who died of erysipelas. No other historical figure, as far as I could see, fit all the criteria so well.

Here, then, I had the germ of a story, which could be described in a single sentence: a number of visitors to the tomb of St. John of the Cross are cured of cancer, in what seems like a miracle, but is really due to the side effects of an erysipelas infection. (I knew that there were a few holes in the science here, but I was confident I could work my way around them.) At this point, however, I became conscious of a problem. Since the story was supposed to be a mystery along the lines of The X-Files, I couldn’t have the solution be obvious from the beginning, and I was pretty sure that any modern doctor would be able to tell fairly quickly that a patient was suffering from erysipelas. To delay this revelation, and to mislead the reader, I had to keep my patients away from the hospital for as long as possible, which implied that I couldn’t set the story in the present day. This meant that I was suddenly looking at a period piece that was set in Spain, although not so far in the past that I couldn’t talk about Coley’s work. Which led me, by a logical process of elimination, to the Spanish Civil War.

And that’s how Hemingway entered the story—in the most roundabout way imaginable. When I began devising the plot, not only did I not have Hemingway in mind, but I didn’t even have a setting or a time period. The search for the right saint carried me to Spain, and the specifics of the story I wanted to tell led me to the Spanish Civil War, which would allow me to confuse the issue long enough to delay the solution. At the time, it felt almost random, but when I look back, it seems as mathematically necessary as the reasoning that Poe once claimed was behind the composition of “The Raven.” Once the essential foundations have been set, the writer’s imagination can begin to play, and it seemed to me that if I was going to tell a story about the Spanish Civil War, it pretty much had to include Hemingway. As Umberto Eco says in Foucault’s Pendulum: “Like soy sauce in Chinese dishes. If it’s not there, it’s not Chinese.” Within a few days of starting my research, then, I found myself facing the prospect of writing a story about Hemingway investigating a paranormal mystery in wartime Spain. I really wanted to do it. But I wasn’t sure that I could.

The alphabet method

leave a comment »

Your Key to Creative Thinking

It might seem like quite a leap to get from The Gulag Archipelago to The Complete Scarsdale Medical Diet, but creativity makes for strange bedfellows. I got to thinking yesterday about Aleksandr Solzhenitsyn’s rosary, which he used to compose and memorize poetry in prison, after picking up a book by Samm Sinclair Baker, who cowrote the aforementioned diet manual with the unfortunate Dr. Herman Tarnower. Baker, of whom I hadn’t heard until recently, was an intriguing figure in his own right. He was a former gag cartoonist who became an advertising copywriter and executive at two agencies during the Mad Men era, and then quit to write a series of self-help books on subjects ranging from gardening to skin problems to sex. Among them was a slim volume called Your Key to Creative Thinking, which I picked up at a yard sale last weekend for less than a dollar. It’s a breezy read, full of useful advice, much of which I’ve covered on this blog before. Baker advises the reader to seek out as many facts as possible; to adapt ideas from different fields or categories; to use words or pictures as a source of random associations; to invert your criteria or assumptions; to take good notes; and to let the ideas simmer by relaxing or going for a walk. They’re all valuable tips, of the kind that nearly every creative professional figures out eventually, and Baker presents them in a fluffy but engaging way. Used copies of his book currently sell for a penny on Amazon, and it’s worth checking out if, like me, you’re addicted to this sort of thing.

But what really caught my eye—and for reasons that may not have occurred to the author himself—was a section titled “Alphabet Creative-Spur System.” Baker writes:

Here’s a little creative-spur system that I’ve always kept as a helpful, small “secret method” for myself. It’s a quick aid in sparking creative thinking and rapid results.

This system is simply a matter of running down the alphabet with the key word of your problem and developing ideas in rhyming variations of the word…On quick, simple problems run the key word through your mind, varying it letter by letter, from A to Z, in rhyming fashion.

In respect to more complicated, weightier problems, work with pencil and paper, or typewriter, setting down letter by letter and filling out accordingly.

As an example, Baker uses the word “detergent.” He runs through the alphabet, looking for rhymes and near-rhymes like “emergent” (“You can see how greater cleanliness ‘emerges’ from using this detergent”), “he-detergent” (“Consider featuring this one as the ‘he-man’ detergent that has extra muscle”), and “pre-tergent” (“This suggests a preparatory phase built into the product, so that it produces double cleaning action”).
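Baker never puts it this way, but the mechanical core of his system is simple enough to sketch in a few lines of Python (the function name and the choice of stem are mine, not his):

```python
import string

def alphabet_spur(stem):
    """Run a key word's rhyming stem down the alphabet, A to Z,
    producing one candidate coinage per letter."""
    return [letter + stem for letter in string.ascii_lowercase]

# For the key word "detergent", vary the opening sound of "-etergent":
candidates = alphabet_spur("etergent")
print(candidates[:3])  # ['aetergent', 'betergent', 'cetergent']
```

Nearly all twenty-six results are nonsense, which is the point: the list is only a prompt, and coinages like Baker’s “pre-tergent” still have to come from free association.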


At first glance, the method seems cute but not particularly revelatory. What struck me when I tried it, though, is how conveniently it can be done in your head, and how easy it is to remember the results. That’s a more powerful combination than it sounds. I’ve developed a lot of creative hacks over the years, from mind maps to the use of random quotations to spark a train of thought, but most require a fair amount of preparation, or at least that I sit down for half an hour or so with pen and paper. This isn’t always possible, and one of the key problems in any creative artist’s life is how to fill in those precious scraps of time—on the bus, in line at the grocery store, in the shower—that seem like prime real estate for thinking. The nifty thing about the alphabet method is its simplicity, its instantaneous accessibility, and its ease of retention. It doesn’t require any tools at all. The underlying mechanism is automatic, almost mindless. You can do it for thirty seconds or five minutes while keeping half of your attention somewhere else. And best of all, the ideas that it generates can be called back without any effort, assuming that the connection between the rhyming key word and the associated concept is solid enough. That’s a nice benefit in itself. Writers are advised to keep a notebook on hand at all times, but that isn’t always possible. With the alphabet method, you don’t need to worry about writing down what it generates, because you can always recreate your train of thought with a minimum of trouble.

And I have a hunch that it could provide the basis for other creative strategies. The idea of using the alphabet as a mnemonic device isn’t a new one, and there are even theories that the alphabet itself arose as a way to memorize information encoded in the order and names of the letters. (Robert Graves, in The White Goddess, offers up a particularly ingenious interpretation along these lines.) But it isn’t hard to envision a system in which the beats of a story, say, could be retained in the head by associating each section with an alphabetic keyword. Here, for instance, is how I’d memorize the first few story points of Casablanca:

A) “African music,” followed by the Marseillaise, plays over the opening credits. As Umberto Eco notes: “Two different genres are evoked: adventure movie and patriotic movie.”
B) “But not everyone could get to Lisbon directly.” The narrator describes the refugee trail from Paris.
C) “Casablanca to Lisbon to America.” Refugees wait for visas to make the trip to the promised land.
D) “Deutschland über Alles.” The arrival of Major Strasser. His conversation with Captain Renault.
E) “Everybody comes to Rick’s…”

And so on. The human brain isn’t particularly good at keeping track of more than a few pieces of information at a time, but the great thing about the alphabet method is that you aren’t really memorizing anything: you’re just preserving the initial seed of a process that can be used to generate the same idea when necessary. I may not remember exactly what Baker had in mind with the word “pre-tergent,” but I can reconstruct it easily, and that’s doubly true when it comes to my own ideas. All it requires is that you know the alphabet, that you can run through it letter by letter, and that you’re more or less the same person you were when you came up with the idea in the first place. You don’t need a rosary. All you need is the alphabet, and yourself.

The act of noticing


Jonathan Franzen

Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on September 24, 2014.

Yesterday, while playing with my daughter at the park, I found myself oddly fascinated by the sight of a landscaping crew that was taking down a tree across the street. It’s the kind of scene you encounter on a regular basis in suburbia, but I wound up watching with unusual attention, mostly because I didn’t have much else to do. (I wasn’t alone, either. Any kind of construction work amounts to the greatest show on earth for toddlers, and there ended up being a line of tiny spectators peering through the fence.) Maybe because I’ve been in a novelistic state of mind recently, I focused on details that I’d never noticed before. There’s the way a severed tree limb dangles from the end of the crane almost exactly like a hanged man, as Eco describes it in Foucault’s Pendulum, with its heavy base tracing a second, smaller circle in the air. I noted how a chainsaw in action sprays a fan of fine particles behind it, like a peacock’s tail. And when the woodchipper shoots chips into the back of the truck, a cloud of light golden dust forms above the container, like the soul of the tree ascending.

As I watched, I had the inevitable thought: I should put this into a story. Unfortunately, nothing I’m writing at the moment includes a landscaping scene, and the easiest way to incorporate it would be through some kind of elaborate metaphor, as we often see, at its finest, in Proust. (“As he listened to her words, he found himself reminded of a landscaping crew he had once seen…”) But it made me reflect both on the act of noticing and on the role it plays, or doesn’t, in my own fiction. Most of the time, when I’m writing a story, I’m following the dictates of a carefully constructed plot, and I’ll find myself dealing with a building or a city scene that has imposed itself by necessity on the action: my characters end up at a hospital or a police station, and I strain to find a way to evoke it in a few economical lines that haven’t been written a million times before. Occasionally, this strikes me as a backward way of working. It would be better, it seems, to build the story around locations and situations that I already know I can describe—or which caught my attention in the way that landscaping crew did—rather than scrambling to push out something original under pressure.

Joseph O'Neill

In fact, that’s the way a lot of novelists work, particularly on the literary end. One of the striking trends in contemporary fiction is how so much of it doubles as reportage, with miniature New Yorker pieces buried like bonbons within the larger story. This isn’t exactly new: writers from Nabokov to Updike have filled their novels with set pieces that serve, in James Wood’s memorable phrase, as “propaganda on behalf of good noticing.” What sets more recent novels apart is how undigested some of it seems. At times, you can feel the narrative pausing for a page or two as the writer—invariably a talented one, or else these sections wouldn’t survive the editorial process—serves up a chunk of journalistic observation. As Norman Mailer writes, rather unkindly, of Jonathan Franzen:

Everything of novelistic use to him that came up on the Internet seems to have bypassed the higher reaches of his imagination—it is as if he offers us more human experience than he has literally mastered, and this is obvious when we come upon his set pieces on gourmet restaurants or giant cruise ships or modern Lithuania in disarray. Such sections read like first-rate magazine pieces, but no better—they stick to the surface.

This isn’t entirely fair to Franzen, a superb noticer who creates vivid characters even as he auditions for our admiration. But I thought of this again after finishing Joseph O’Neill’s Netherland. It’s a novel I’d wanted to read for years, and I enjoyed it a hell of a lot, while remaining conscious of its constant shifts into what amounts to nonfiction: beautifully written and reported essays on New York, London, the Hague, India, cricket, and just about everything else. It’s a gorgeous book, but it ends up feeling more like a collection of lovingly burnished parts than a cohesive whole, and its acts of noticing occasionally interfere with its ability to invent real interactions for its characters. It was Updike himself, I think, who warned writers against mining their journals for material, and you can see why: it encourages a sort of novelistic bricolage rather than an organic discovery of the action, and the best approach lies somewhere in the middle. And there’s more than one way of telling a story. As I was studying the landscaping crew at the park, my daughter was engaged in a narrative of her own: she ran into her friend Elise, played on the seesaw, and then had to leave abruptly for a diaper change. Or, as Beatrix put it, when I asked about her day: “Park. Elyse. Say hi. Seesaw. Poop. Go home.” And I don’t think I can do better than that.

Our struggle, part two


William B. Davis on The X-Files

Note: Spoilers follow for the X-Files episode “My Struggle II.”

“The writers we absorb when we’re young bind us to them, sometimes lightly, sometimes with iron,” Daniel Mendelsohn once wrote in The New Yorker. “In time, the bonds fall away, but if you look very closely you can sometimes make out the pale white groove of a faded scar, or the telltale chalky red of old rust.” That’s true of movies, television, and other forms of art, too, and it’s particularly powerful when it happens in your early teens. If you want to change somebody’s life forever, just find him when he’s thirteen—and give him a book. I’ve increasingly come to recognize that two-thirds of my inner life was shaped by half a dozen objects that I happened to encounter, almost by accident, during a window of time that opened up when I was twelve and closed about two years later. They included a copy of Isaac Asimov’s Science Fiction Magazine, a movie and a television series by David Lynch, and a pair of novels by Umberto Eco. Take any of these props away, and the whole edifice comes crashing down, or at least reassembles itself into a drastically different form. And of all the nudges I received that put me on the course I’m on today, few have been more dramatic than that of The X-Files, which premiered as I was entering the eighth grade and left a mark, or a scar like that of a smallpox vaccination, that I can still see now.

I’m writing this because I’ve realized that a young person encountering The X-Files today for the first time at age thirteen, as I did, wouldn’t even have been born when the original finale aired. It’s likely, then, that there’s a version of me being exposed to this premise and these characters courtesy of the show’s revival who has never seen the series in any other form. And I honestly have no idea what that kid must be thinking right now. Aside from a miracle of an episode from Darin Morgan, the reboot has been an undeniable letdown even for longtime fans, but to new viewers, it must seem totally inexplicable. It’s easy to picture someone watching this week’s finale—which is devoid of thrills, suspense, or even basic clarity—and wondering what all the fuss was about. I’ve long since resigned myself to the fact that my favorite television series, or at least the one that had the greatest impact on what I’ve ended up doing with my life, was so uneven that I don’t need to watch the majority of its episodes ever again. But to someone who hasn’t made that mental adjustment, or isn’t familiar with the heights the show could reach on those rare occasions when it was firing on all cylinders, the revival raises the question of why anyone was clamoring for its return in the first place. If I were watching it with someone who had never seen it before, and who knew how much I loved it, I’d be utterly humiliated.

Lauren Ambrose and Gillian Anderson on The X-Files

I don’t think anyone, aside perhaps from Chris Carter, believes that this season gained many new fans. But that isn’t the real loss. The X-Files, for all its flaws, was a show that could change lives. I’ve written here before of the Scully effect that led young women to pursue careers in science, medicine, and law enforcement—which would be completely incomprehensible to someone who knows Scully only from her reappearance here. (Gillian Anderson does what she can, as always, but she still sounds as if she’s reading the opening narration to “My Struggle II” at gunpoint. And when she sequences her own genome in what feels like record time, I just wanted her to say that she was sending it to Theranos.) The reboot isn’t likely to spark anyone’s curiosity about anything, aside from the question of why so many people cared. And while it’s a tall order to ask a television show to change lives, it isn’t so unreasonable when you consider how it once pulled it off. The X-Files entered my life and never left it because it was clever, competent, and atmospheric; it featured a pair of attractive leads whom I’d be happy to follow anywhere; and its premise pointed toward a world of possible stories, however little of it was fulfilled in practice. It changed me because it came along at the right time and it did what it was supposed to do. The reboot didn’t even manage that. If anything, it made me retroactively question my own good taste.

I won’t bother picking apart “My Struggle II” in detail, since the episode did a fine job of undermining itself, and there are plenty of postmortems available elsewhere. But I’ve got to point out the fundamental narrative miscalculation of keeping Mulder and Scully apart for the entire episode, which is indefensible, even if it was the result of a scheduling issue. Even at the revival’s low points, the chemistry between the leads was enough to keep us watching, and removing it only highlights how sloppy the rest really was. It doesn’t help that Scully is paired instead with Lauren Ambrose, giving a misdirected interpretation of a character who isn’t that far removed from Scully herself in the show’s early seasons—which just reminds us of how much Anderson brought to that part. The episode falls to pieces as you watch it, packing a contagion storyline that could have filled an entire season into less than fifty minutes, reducing Joel McHale’s right-wing pundit, who was such a promising character on paper, to a device for delivering exposition. (Since the episode ends on a cliffhanger anyway, it could have just moved it to earlier in the story, ending on the outbreak, which would have given it some breathing room. Not that I think it would have mattered.) As the revival slunk to its whimper of a close, my wife said that I’d been smart to keep my expectations low, but as it turns out, they weren’t low enough. If the series comes back, I’ll still watch it, in yet another triumph of hope over experience. Keeping up my hopes will be a struggle. But it wouldn’t be the first time.

Written by nevalalee

February 24, 2016 at 9:48 am

The Eco Maker


Umberto Eco

Umberto Eco died on Saturday. The news shattered me, all the more because Eco had been slowly drifting away from the central role in my life that he had played for so long. But I was thinking of him just a few days ago, daydreaming, as I often do, about living on the road with nothing but what I could carry in a backpack. Foucault’s Pendulum, I decided, would be the one novel I would bring for my own pleasure, consumed a page or two at a time in hotel lobbies or on train station platforms. I’ve noted here before that just about everything I’ve published at novel length, notably The Icon Thief, has been a kind of dialogue with Foucault’s Pendulum, which rocked my world when I first read it more than twenty years ago, so that countless thankfully unfinished manuscripts from my teens bore the mark of its influence. The paperback copy I bought in my hometown all those years ago has accompanied me on every move ever since, and it’s been read so often—certainly more than a dozen times—that it has taken on some of the qualities of my own face. On its spine, it bears two deep parallel creases, about half an inch apart, like the lifelines on the palm of a lovingly worn workman’s glove: a testament to a lifetime’s faithful service. And although I’ve retreated from and returned to it countless times over the last two decades, every cycle brings me closer to its vision again, so that by now I’ve accepted that it’s a part of me.

But it wasn’t until I heard the news of Eco’s death that I began to reflect on what this really meant. Eco was an incredibly prolific writer and scholar, but for most fans, I have a feeling that he’ll be remembered best for two books. The Name of the Rose remains, rightly, the more famous and beloved, and the first to be mentioned in any obituary: William of Baskerville is still the most fully realized character Eco ever created, even if I insist on picturing him as Sean Connery, and for a lot of readers, Brother William became the guide to a labyrinthine library that some of us never escaped. He pointed me toward Borges, as he did with so many others, which is legacy enough for anyone. If I’m honest with myself, he also turned me onto Latin, and to a lesser extent Greek, which I spent four years studying in college in part just so I could read that novel’s many untranslated passages. (I achieved that goal with mixed success: there was probably a period of six months or so where I could easily read those sections at sight—as if the meanings were appearing in the margin in invisible ink—but the words have long since faded again.) And Foucault’s Pendulum, his other lasting work, opened up a whole world of ideas, or, more accurately, the idea that seeking patterns in those ideas was the greatest game in the world. Eco warns us that it can also be a pathology, but he can only make his case by exposing us first to its delights, and I’ve spent most of my life walking the fine line that he traced.

Foucault's Pendulum

That remarkable ability to spin webs of ideas into a book that ordinary readers could love—a challenge that many writers have tackled since and none has done nearly as well—explains why Eco was such a problematic figure to so many other authors. There was Salman Rushdie, who appears to have glimpsed the similarity between Eco and himself, in a sort of uncanny valley, and famously trashed Foucault’s Pendulum as “entirely free of anything resembling a credible spoken word.” And I’ve often quoted Tom Wolfe, who said, accurately enough: “Eco is a very good example of a writer who leads dozens of young writers into a literary cul-de-sac.” (Although when I look at that statement now, I can’t help but echo Tobias on Arrested Development: “There are dozens of us. Dozens!”) Eco was a dangerous example for all the obvious reasons. He came to fiction in his fifties, after an entire career spent living in the world and thinking about ideas, which is very different from doing it in your twenties. His preternatural facility as a writer camouflaged the fact that his accomplishment in The Name of the Rose and Foucault’s Pendulum was not only difficult, but all but irreproducible, even for him: I never managed to finish any of his later novels. These two books wouldn’t exist at all if they didn’t pull off the impossible feat of making us care about ideas as if they were people, but it’s such a freakish trick that you want to warn young writers to spend their time with authors who struggle honestly with creating real men and women.

I agree with all of this. And I’ve spent much of my life trying to free myself from Eco’s magic spell, even if it took me the better part of two novels to do so. But I’ve also come to realize that if Eco is a dead end, it’s one that’s still worth taking, if you’re one of the dozens of young writers for whose souls Wolfe was so concerned. To take an illustration that Eco, with his omnivorous embrace of popular culture, might have liked: if this is a cul-de-sac, it’s like the fake tunnel that Wile E. Coyote paints on the side of a wall, only to have the Road Runner race through it. There’s something tantalizingly real and moving about Eco’s artifice, and it strikes me now as just as valid as the ones created by all those painstaking noticers of human behavior. It’s a story of books, not of everyday worries, as he writes in The Name of the Rose, and it’s the story in which I’ve found myself, when I look honestly at my own life. It’s a dead end that feels more expansive, as time goes on, than any other alternative. Even if it isn’t for everyone, I’ve come to recognize that Eco was the point of origin from which I had to distance myself, only to find my steps curving back in its direction after I’d acquired some necessary experience that didn’t come from libraries. And I can only return to those words from Thomas à Kempis that Eco shared with us so long ago: In omnibus requiem quaesivi, et nusquam inveni nisi in angulo cum libro. I sought peace everywhere, and I found it only in a corner with a book. And its title was Foucault’s Pendulum.

Written by nevalalee

February 22, 2016 at 9:49 am

The working library


Umberto Eco in his library

A…shock of banality occurs to many people in my condition—that is, people who possess a fairly sizable library (large enough in my case that someone entering our house can’t help but notice it; actually, it takes up the whole place). The visitor enters and says, “What a lot of books! Have you read them all?” At first I thought that the question characterized only people who had scant familiarity with books, people accustomed to seeing a couple of shelves with five paperback mysteries and a children’s encyclopedia, bought in installments. But experience has taught me that the same words can be uttered also by people above suspicion. It could be said that they are still people who consider a bookshelf as a mere storage place for already-read books and do not think of the library as a working tool. But there is more to it than that. I believe that, confronted by a vast array of books, anyone will be seized by the anguish of learning and will inevitably lapse into asking the question that expresses his torment and his remorse…

But the question about your books has to be answered, while your jaw stiffens and rivulets of cold sweat trickle down your spine. In the past I adopted a tone of contemptuous sarcasm. “I haven’t read any of them; otherwise, why would I keep them here?” But this is a dangerous answer because it invites the obvious follow-up: “And where do you put them after you’ve read them?” The best answer is the one always used by Roberto Leydi: “And more, dear sir, many more,” which freezes the adversary and plunges him into a state of awed admiration. But I find it merciless and angst-generating. Now I have fallen back on the riposte: “No, these are the ones I have to read by the end of the month. I keep the others in my office,” a reply that on the one hand suggests a sublime ergonomic strategy, and on the other leads the visitor to hasten the moment of his departure.

Umberto Eco, How to Travel with a Salmon

Written by nevalalee

February 20, 2016 at 7:00 am

The space between us


Elizabeth Kolbert

Last year, Elizabeth Kolbert of The New Yorker published a skeptical article about the various proposals to put human beings on Mars. Kolbert, who won a Pulitzer Prize for her excellent book The Sixth Extinction, is inclined—as many of us are—to regard such projects as Mars One as the province of hucksters and crackpots, but she’s also doubtful of the entire idea of planetary colonization itself. Taking note of the Fermi Paradox, which asks why we haven’t seen any evidence of the alien life that logic says should be all around us, Kolbert suggests that the lack of visible signs of intelligent activity isn’t due to some unavoidable cataclysm that swallows up all civilizations or a mysterious resolve to remain invisible, but the result of a sensible focus elsewhere: “Perhaps the reason we haven’t met any alien beings is that those which survive aren’t the type to go zipping around the galaxy. Maybe they’ve stayed quietly at home, tending their own gardens.” Kolbert concludes that the idea of sending people to Mars “is either fantastically far-fetched or deeply depressing.” When I read those words six months ago, something in me rebelled against them on a fundamental level: I wasn’t ready to give up on that dream. But at some point in recent days, I realized that I’d changed my mind, and that I now agree with Kolbert. I no longer think that we have any business going to Mars. At least not yet.

And I’ve arrived at this conclusion not despite my background in science fiction, but because of it. One of the smartest observations ever made about the genre comes courtesy of the great Jack Williamson, who once said:

The average [science fiction] author is more stage magician, a creator of convincing illusions, than scientist or serious prophet. In practice, once you’re into the process of actually writing a work of fiction, the story itself gets to be more important than futurology. You become more involved in following the fictional logic you’ve invented for your characters, the atmosphere, the rush of action; meanwhile, developing real possibilities recedes. You may find yourself even opting for the least probable event rather than the most probable, simply because you want the unexpected.

This certainly squares with my own experience as a writer. And that last sentence applies not just to the plots of individual stories but to the conventions of science fiction as a whole. When we think of science fiction, we tend to think first of manned space flight, which means that it’s also inextricably tied up with our vision of our “real” future. But when you look at that assumption more closely, it falls apart. Why, exactly, should we assume that space will be an integral part of our destiny as a species? And why did science fiction try so hard to convince us that it would be?

Jack Williamson

The real answer lies in Williamson’s shrewd observation: “The story itself gets to be more important than futurology.” When science fiction reemerged as a viable genre in the late twenties and early thirties, it was essentially a subcategory of men’s adventure fiction, with ray guns substituted for revolvers. Many of the stories it told could easily have been translated into earthly terms, and space was less important in itself than as the equivalent of the unexplored frontier of the western: it stood for the unknown, and it was a perfect backdrop for exciting plots. Later, however, under the guidance of editors like F. Orlin Tremaine and John W. Campbell of Astounding Science Fiction, the genre began to take itself more seriously as futurology—but with outer space grandfathered in as a setting, even if it had little to do with any plausible vision of things to come. Space exploration began to seem like an essential part of our shared future because it happened to be part of the genre already, for reasons that had less to do with serious speculation than with a writer’s need to keep the pages turning. And it takes a real effort of the imagination, now that science fiction seems so inevitable, to see how arbitrary that emphasis really was, and how so much of it depends on what Campbell, in particular, happened to find interesting. (As Bruce Sterling put it: “There has never been another editor of [Campbell’s] stature who would sort of come in and say, ‘All right, you guys are going to do it my way—and here is like a series of things we’re going to write about: robots, psi, space travel. And here’s a bunch of stuff we’re not going to write about: women, black people, drugs.'”)

And trying to shape our future based on decisions made by an army of pulp writers, no matter how talented, strikes me now as quixotic, in the original sense of the term. As Umberto Eco says in Foucault’s Pendulum: “People don’t get the idea of going back to burn Troy just because they read Homer.” In reality, our future is already taking a very different form: grounded on this planet, founded on information, and mindful of the fragility of our predicament right here. And it’s time that we grudgingly recognized this. This doesn’t mean that we need to give up on the dream of putting a person on Mars: only that we detach it, gently but firmly, from the idea of our collective destiny, and restore it to its proper place as a kind of interesting side project. In the grand scheme of things, it doesn’t matter if we do it in the next fifty years or the next five hundred, especially when there are so many other problems that require our attention right now. (The longing to see it happen in our own lifetimes is understandable, but also a little selfish.) Our efforts to explore and understand space itself are vital and elevating, as the recent flurry of excitement over a potential Planet Nine reminds us, but devoting billions of dollars to placing a human being on a spacecraft—simply because a few good writers seized our imagination decades ago—seems misguided at best, irresponsible at worst. If we really want to explore the unknown for the sake of our souls, there’s always the deep sea, or Antarctica, which would confer the same spiritual benefits at far less of a cost. And while there may not be life on Mars, now or ever, we can still allow ourselves to hope for a life beyond it.

“The yacht was a monster…”


"Maddy gazed out at the sea..."

Note: This post is the thirty-fifth installment in my author’s commentary for Eternal Empire, covering Chapter 34. You can read the previous installments here.

Umberto Eco once said that he wrote The Name of the Rose because he felt like poisoning a monk. For William Faulkner, The Sound and the Fury began with a mental picture:

I didn’t realize at the time it was symbolical. The picture was of the muddy seat of a little girl’s drawers in a pear tree, where she could see through a window where her grandmother’s funeral was taking place and report what was happening to her brothers on the ground below. By the time I explained who they were and what they were doing and how her pants got muddy, I realized it would be impossible to get all of it into a short story and that it would have to be a book.

Joseph Heller started writing Something Happened with two sentences that came to him out of nowhere: “In the office in which I work, there are four people of whom I am afraid. Each of these four people is afraid of five people.” And E.L. Doctorow, in the middle of a bad case of writer’s block, began Ragtime by staring at the wall of his office, writing about it and the surrounding house, and then trying to imagine the period in which it was built—”In desperation,” Doctorow told The Paris Review, “to those few images.”

One of the subtle privileges of the writer’s craft is that while a reader generally reads a story from first page to last, the initial seed from which it grew in the author’s mind can occur at any point in the narrative, and it often isn’t clear, when you look at the finished result, which part came first. The idea of an author beginning with an inciting incident and following its implications to the very last page is an attractive one, and many writers start their apprentice efforts in much the same way. Usually, though, after the writer learns more about structure and the logistics of finishing a major project, the germ that gives rise to the rest of it turns out to be a moment that lies somewhere in the middle, with the writer working in either direction to lead toward and away from that first spark of inspiration. And this approach can work enormously in the story’s favor. We’re all hoping to come up with an arresting beginning, but we’re less likely to discover it from first principles than to derive it, almost mathematically, from a scene to which it leads a hundred pages down the line. The more rigorously you work out that logic, following what I’ve elsewhere called the anthropic principle of fiction, the more likely you are to arrive at an opening—as well as a setting and a cast of characters—that never would have occurred to you if you had tried to invent a grabber from scratch. (If you do, the strain often shows, and the reader may rightly wonder if you’ll be able to sustain that level of intensity to the end.)

"The yacht was a monster..."

Even novels or stories that unfold along fairly conventional lines often benefit from originating in an odd, intensely personal seed of obsession. The Icon Thief and its sequels were written to honor, rather than to undermine, the conventions of the thriller, but each one grew out of an eccentric core that had little to do with the plot summary you see on the back cover. For The Icon Thief, the real inciting factor—aside from a vague ambition to write a suspense novel about the art world—was my discovery of Marcel Duchamp's Étant Donnés and my determination to be the first writer to build a novel around what Jasper Johns called "the strangest work of art in any museum." For City of Exiles, it was my longstanding interest in the vision of Ezekiel, which I'd tried on and off to incorporate into a novel for almost two decades before finding a place for it here. And for Eternal Empire, it was my desire to write a novel about a megayacht. I'm not sure if this comes through in the text of the book itself: the yacht in question, the Rigden, doesn't make an appearance until halfway through the story, and maybe a quarter of the book as a whole is set on or around it. But I knew before I'd figured out anything else about the plot that I wanted a yacht like this to be at the center, which, in turn, implied much of the rest. You don't write a novel about a megayacht, especially one owned by a Russian oligarch at the heart of what looks to be a vast conspiracy, without being prepared to sink it with everyone on board.

The moment when the yacht goes down—and I don’t think I’m spoiling much by saying this—won’t occur for another hundred pages or so, and I’ll deal with those scenes when I come to them. (To my eyes, the yacht’s destruction and the ensuing showdown onshore are the best extended sequences I’ve ever written, and they’re among the few sections that I’m likely to read again for my own pleasure.) But I want to focus for now on the first time we see the Rigden, in Chapter 34, after a few dozen pages’ worth of buildup. Aside from Titanic, my inspiration here was the obligatory scene in the early Star Trek films in which Kirk first approaches the Enterprise, allowing for a few minutes of awed tracking shots of the starship’s exterior—a convention that J.J. Abrams, alas, is too busy to honor. It slows down the narrative incrementally, but it also provides a sense of scale that strengthens much of what follows. And since this is more or less the reason I wanted to write the entire book, I felt justified in lingering on it. When Maddy gets her first glimpse of the yacht, the metaphorical implications are obvious, as is the impact of the ship’s existence on the shape of the story itself: a book about a yacht also has to be about a journey, and figuring out the start and end points was half the fun. Even if most of the book takes place on land, the events that unfold there are largely designed to get us onto and off that ship. And even if the destination remains unknown, we know that we’ll get there in style…

Inventing conspiracies for fun and profit


Umberto Eco

Note: Since I’m taking a deserved break for Thanksgiving, I’m reposting a few popular posts this week from earlier in this blog’s run. This post was originally published, in a slightly different form, on December 19, 2012.

If it sometimes seems like we’re living in a golden age for conspiracy theories, that shouldn’t come as a surprise. Conspiracies are ultimately about finding connections between seemingly unrelated ideas and events, and these days, it’s easier to find such connections than at any other point in human history. By now, we take it for granted, but I still remember the existential shock I received, almost ten years ago, when I found out about Amazon’s book search. I responded with a slightly hysterical blog post that was later quoted on the Volokh Conspiracy:

Their Search Inside the Book feature, which allows you to search and browse 33 million pages worth of material from 120,000 books, is just about the most intoxicating online toy I’ve ever seen. But it terrifies me at the same time. Between this monstrous djinn and, I have no excuse, no excuse whatsoever, for not writing a grand synthetic essay of everything, or a brilliant, glittering, Pynchonesque novel…because millions and millions of beautiful connections between people and ideas are already out there, at my fingertips, ready to be made without effort or erudition.

Looking back at this post, it’s easy to smile at my apocalyptic tone—not to mention my use of the phrase “,” which is a time capsule in itself—but if anything, my feelings of intoxication, and terror, have only increased. A decade ago, when I was in college, it took months of research and many hours in the library stacks to find useful connections between ideas, but now, they’re only a short query away. The trouble, of course, is that the long initial search is an inseparable part of scholarship: if you’re forced to read entire shelves of books and pursue many fruitless avenues of research before finding the connections you need, you’re better equipped to evaluate how meaningful they really are when you find them. A quick online search circumvents this process and robs the results of context, and even maturity. Research becomes a series of shortcuts, of data obtained without spiritual effort or cost, so it’s tempting to reach the same conclusion as Jonathan Franzen: “When information becomes free and universally accessible, voluminous research for a novel is devalued along with it.”

A spreadsheet for paranoids

Which is true, but only up to a point. Raw information is everywhere, but authors can still be judged by the ingenuity and originality of the connections they make. This is especially true in conspiracy fiction, in which a connection doesn’t need to be true, as long as it’s clever, reasonably novel, and superficially convincing. (Among other reasons, this is why I don’t care for the work of Dan Brown, who only repeats the labors of more diligent crackpots.) Umberto Eco, definitive here as elsewhere, laid down the rules of the game in Foucault’s Pendulum:

  1. Concepts are connected by analogy. There is no way to decide at once whether an analogy is good or bad, because to some degree everything is connected to everything else.
  2. If everything hangs together in the end, the connection works.
  3. The connections must not be original. They must have been made before, and the more often the better, by others. Only then do the crossings seem true, because they are obvious.

And unlike Eco’s protagonists, who had to enter scraps of information into their computer by hand, we all have free access to a machine with an infinite number of such fragments. An enterprising paranoiac just has to look for the connections. And the first step is to find out where they’ve crossed over in the past.

When the time finally came, then, to construct the Pynchonesque novel of my dreams, I decided to proceed in the most systematic way I could. I constructed a vast spreadsheet grid that paired off a variety of players and ideas that I suspected would play a role in the story—Marcel Duchamp, the Rosicrucians, Georges Bataille, the Black Dahlia murder—and spent weeks googling each pair in turn, trying to find books and other documents where two or more terms were mentioned together. Not surprisingly, many of these searches went nowhere, but I also uncovered a lot of fascinating material that I wouldn’t have found in any other way, which opened up further avenues of inquiry that I researched more deeply. I felt justified in this approach, which is the opposite of good scholarship, because I was writing a work of fiction about paranoia, overinterpretation, and the danger of taking facts out of context, which was precisely what I was doing myself. And I came away with the realization that you could do this with anything—which is something to keep in mind whenever you see similar arguments being made in earnest. There’s nothing like building a conspiracy theory yourself to make you even more skeptical than you were before. Or to quote Foucault’s Pendulum yet again: “That day, I began to be incredulous.”
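The spreadsheet method described above is, at bottom, a simple combinatorial sweep: take a list of players and ideas, pair them off exhaustively, and search for each pair. A minimal sketch of that process, for the curious—the term list here is illustrative, drawn from the examples mentioned above, and the actual contents of the spreadsheet were of course far larger:

```python
from itertools import combinations

# Illustrative stand-ins for the players and ideas in the grid;
# the real spreadsheet paired off many more terms than these.
terms = [
    "Marcel Duchamp",
    "Rosicrucians",
    "Georges Bataille",
    "Black Dahlia",
]

def pair_queries(terms):
    """Build a quoted search query for every unordered pair of terms."""
    return [f'"{a}" "{b}"' for a, b in combinations(terms, 2)]

# Each query can then be fed to a search engine by hand, one pair
# at a time, looking for documents where both terms appear together.
for query in pair_queries(terms):
    print(query)
```

Four terms yield six pairs; a few dozen terms yield hundreds, which is roughly why the googling took weeks. The point of the exercise, as noted above, is that some connection will almost always turn up—which is exactly what should make us suspicious when the same trick is performed in earnest.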

Written by nevalalee

November 25, 2014 at 9:00 am

An unread life


The author's library

A few months ago, I was proudly showing off my home library to a friend when he asked a version of a question I’ve often heard before: “When I see most people with libraries like this, I assume they haven’t read the books. But you’ve read most of these—right?” In response, I may have stammered a little. No, I said, I haven’t read them all, but they’re all here for a reason. Each book fits into its own particular niche, I’ve grazed in each one, and they’re all important to me. If I’d been in a different mood, I might have quoted Umberto Eco’s testy reply to similar queries:

The visitor enters and says, “What a lot of books! Have you read them all?” At first I thought that the question characterized only people who had scant familiarity with books, people accustomed to seeing a couple of shelves with five paperback mysteries and a children’s encyclopedia, bought in installments. But experience has taught me that the same words can be uttered also by people above suspicion. It could be said that they are still people who consider a bookshelf as a mere storage place for already-read books and do not think of the library as a working tool.

In other words, as Nassim Nicholas Taleb notes, while also citing Eco: “Read books are far less valuable than unread ones.” Which isn’t to say that I’ll simply buy a book and stick it on the shelf to admire. Whenever I acquire a new book, I give it a good browse, just enough to give me an idea of what I really have, and then I file it away, content in the knowledge that when I need to dig deeper, it’ll be there. Or at least that’s the rule I try to follow. In practice, I’ve found myself accumulating books by certain authors—Lewis Mumford, for instance—on a vague suspicion that they’re going to come in handy one day, and others that force themselves on my attention simply because of an alluring look and a reasonable price. In other cases, I’m drawn to books primarily by what they represent: a commitment to a single overwhelming idea, which is something I value without being able to replicate. A book like The Plan of St. Gall or Marcello Malpighi and the Evolution of Embryology is the product of decades of singleminded work, and as a writer who is happiest when switching frequently between projects, I keep them around as a reminder of a different, and maybe better, way of art and thought.

The author's library, temporarily unshelved

But there’s no question that I browse more than I read these days. Part of this has to do with the shape my life has taken: between an active toddler and my own unwritten pages, it’s hard to find time to sit down with a book for more than half an hour at a stretch. My criteria, in fact, for buying new books have shifted slightly ever since my daughter was born. At the moment, I tend to buy books that I’ll be glad to own even if I don’t read them from cover to cover, which favors titles that are either inherently browsable—where I can turn to a random page in the middle and find something enlightening or diverting—or that have strong aesthetic interest in themselves. The latter encompasses lovely little paperbacks as much as their big leatherbound brothers, but it’s especially why I’m so taken by the idea of the tome. When a book is large enough, the pressure to get through all of it is correspondingly reduced: The Plan of St. Gall seems content to hang around forever as a permanent presence, to be dipped into as often or rarely as I want, rather than plowed through from first volume to last. There comes a point when a book’s sheer size ceases to be formidable and becomes almost comforting in its insistence on pages unread and byways unexplored.

This may be why I’ve been increasingly drawn to rare books like this as my free time has grown ever more contracted. If I blow $200 on St. Gall or $80 on Marcello Malpighi, you shouldn’t be misled into thinking I have oodles of disposable cash; really, they’re just about all I treat myself to these days, aside from the occasional album. When I’m tempted to buy a video game or Blu-ray, there’s a reasonable voice in my head that asks when, exactly, I think I’ll get around to playing or watching it. With a book, I’ve got my answer ready: I’ll leaf through it a little now, then save the rest for the same undefined retirement home in which I’ll finally read all of Gibbon. In the meantime, my unread books give me a satisfaction—as well as occasional injections of pleasure, whenever I remember to take one down from the shelf for a few minutes—that I don’t feel from an unwatched movie or unplayed game. It isn’t clear to me if the result is a working tool, as Eco would say, or a stealth form of vanity, but it probably lies somewhere in the middle. The most generous interpretation is that it’s a monument to possibility, a collection of paths I can take whenever I like. It may not be today, or even in this lifetime. But they’re still a part of the life I have now.

Written by nevalalee

October 14, 2014 at 9:48 am

The act of noticing


Jonathan Franzen

Yesterday, while playing with my daughter at the park, I found myself oddly fascinated by the sight of a landscaping crew that was taking down a tree across the street. It’s the kind of scene you encounter on a regular basis in suburbia, but I wound up watching with unusual attention, mostly because I didn’t have much else to do. (I wasn’t alone, either. Any kind of construction work amounts to the greatest show on earth for toddlers, and there ended up being a line of tiny spectators peering through the fence.) Maybe because I’ve been in a novelistic state of mind recently, I focused on details that I’d never noticed before. There’s the way a severed tree limb dangles from the end of the crane almost exactly like a hanged man, as Eco describes it in Foucault’s Pendulum, with its heavy base tracing a second, smaller circle in the air. I noted how a chainsaw in action sprays a fan of fine particles behind it, like a peacock’s tail. And when the woodchipper shoots chips into the back of the truck, a cloud of light golden dust forms above the container, like the soul of the tree ascending.

As I watched, I had the inevitable thought: I should put this into a story. Unfortunately, my current novel project doesn’t include a landscaping scene, and the easiest way to incorporate it would be through some kind of elaborate metaphor, as we often see, at its finest, in Proust. (“As he listened to her words, he found himself reminded of a landscaping crew he had once seen…”) But it made me reflect both on the act of noticing and on the role it plays, or doesn’t, in my own fiction. Most of the time, when I’m writing a story, I’m following the dictates of a carefully constructed plot, and I’ll find myself confronted by a building or a city scene that has imposed itself by necessity on the action: my characters end up at a hospital or a police station, and I strain to find a way of evoking it in a few economical lines that haven’t been written a million times before. Occasionally, this strikes me as a backward way of working. It would be better, it seems, to build the story around locations and situations that I already know I can describe—or which caught my attention in the way that landscaping crew did—rather than scrambling to push out something original under pressure.

Joseph O'Neill

In fact, that’s the way a lot of novelists work, particularly on the literary end. One of the striking trends in contemporary fiction is how so much of it doubles as reportage, with miniature New Yorker pieces buried like bonbons within the larger story. This isn’t exactly new: writers from Nabokov to Updike have filled their novels with set pieces that serve, in James Wood’s memorable phrase, as “propaganda on behalf of good noticing.” What sets more recent novels apart is how undigested some of it seems. At times, you can feel the narrative pausing for a page or two as the writer—invariably a talented one, or else these sections wouldn’t survive the editorial process—serves up a chunk of journalistic observation. As Norman Mailer writes, unkindly, of Jonathan Franzen:

Everything of novelistic use to him that came up on the Internet seems to have bypassed the higher reaches of his imagination—it is as if he offers us more human experience than he has literally mastered, and this is obvious when we come upon his set pieces on gourmet restaurants or giant cruise ships or modern Lithuania in disarray. Such sections read like first-rate magazine pieces, but no better—they stick to the surface.

This isn’t entirely fair to Franzen, a superb noticer who creates vivid characters even as he auditions for our admiration. But I thought of this again after finishing Joseph O’Neill’s Netherland this week. It’s a novel I’d wanted to read for years, and I enjoyed it a hell of a lot, while remaining conscious of its constant shifts into what amounts to nonfiction: beautifully written and reported essays on New York, London, the Hague, India, cricket, and just about everything else. It’s a gorgeous book, but it ends up feeling more like a collection of lovingly burnished parts than a cohesive whole, and its acts of noticing occasionally interfere with its ability to invent real interactions for its characters. It was Edmund Wilson, I think, who warned writers against mining their journals for material, and you can see why: it encourages a sort of novelistic bricolage rather than an organic discovery of the action, and the best approach lies somewhere in the middle. And there’s more than one way of telling a story. As I was studying the landscaping crew at the park, my daughter was engaged in a narrative of her own: she ran into her friend Elyse, played on the seesaw, and then had to leave abruptly for a diaper change. Or, as Beatrix put it, when I asked about her day: “Park. Elyse. Say hi. Seesaw. Poop. Go home.” And I don’t think I can do better than that.
