Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Umberto Eco’

The private eyes of culture

Yesterday, in my post on the late magician Ricky Jay, I neglected to mention one of the most fascinating aspects of his long career. Toward the end of his classic profile in The New Yorker, Mark Singer drops an offhand reference to an intriguing project:

Most afternoons, Jay spends a couple of hours in his office, on Sunset Boulevard, in a building owned by Andrew Solt, a television producer…He decided now to drop by the office, where he had to attend to some business involving a new venture that he has begun with Michael Weber—a consulting company called Deceptive Practices, Ltd., and offering “Arcane Knowledge on a Need to Know Basis.” They are currently working on the new Mike Nichols film, Wolf, starring Jack Nicholson.

When the article was written, Deceptive Practices was just getting off the ground, but it went on to compile an enviable list of projects, including The Illusionist, The Prestige, and most famously Forrest Gump, for which Jay and Weber designed the wheelchair that hid Gary Sinise’s legs. It isn’t clear how lucrative the business ever was, but it made for great publicity, and best of all, it allowed Jay to monetize the service that he had offered for free to the likes of David Mamet—a source of “arcane knowledge,” much of it presumably gleaned from his vast reading in the field, that wasn’t available in any other way.

As I reflected on this, I was reminded of another provider of arcane knowledge who figures prominently in one of my favorite novels. In Umberto Eco’s Foucault’s Pendulum, the narrator, Casaubon, comes home to Milan after a long sojourn abroad feeling like a man without a country. He recalls:

I decided to invent a job for myself. I knew a lot of things, unconnected things, but I wanted to be able to connect them after a few hours at a library. I once thought it was necessary to have a theory, and that my problem was that I didn’t. But nowadays all you needed was information; everybody was greedy for information, especially if it was out of date. I dropped in at the university, to see if I could fit in somewhere. The lecture halls were quiet; the students glided along the corridors like ghosts, lending one another badly made bibliographies. I knew how to make a good bibliography.

In practice, Casaubon finds that he knows a lot of things—like the identities of such obscure figures as Lord Chandos and Anselm of Canterbury—that can’t be found easily in reference books, prompting a student to marvel at him: “In your day you knew everything.” This leads Casaubon to a sudden inspiration: “I had a trade after all. I would set up a cultural investigation agency, be a kind of private eye of learning. Instead of sticking my nose into all-night dives and cathouses, I would skulk around bookshops, libraries, corridors of university departments…I was lucky enough to find two rooms and a little kitchen in an old building in the suburbs…In a pair of bookcases I arranged the atlases, encyclopedias, catalogs I acquired bit by bit.”

This feels a little like the fond daydream of a scholar like Umberto Eco himself, who spent decades acquiring arcane knowledge—not all of it required by his academic work—before becoming a famous novelist. And I suspect that many graduate students, professors, and miscellaneous bibliophiles cherish the hope that the scraps of disconnected information that they’ve accumulated over time will turn out to be useful one day, in the face of all evidence to the contrary. (Casaubon is evidently named after the character from Middlemarch who labors for years over a book titled The Key to All Mythologies, which is already completely out of date.) To illustrate what he does for a living, Casaubon offers the example of a translator who calls him one day out of the blue, desperate to know the meaning of the word “Mutakallimūn.” Casaubon asks him for two days, and then he gets to work:

I go to the library, flip through some card catalogs, give the man in the reference office a cigarette, and pick up a clue. That evening I invite an instructor in Islamic studies out for a drink. I buy him a couple of beers and he drops his guard, gives me the lowdown for nothing. I call the client back. “All right, the Mutakallimūn were radical Moslem theologians at the time of Avicenna. They said the world was a sort of dust cloud of accidents that formed particular shapes only by an instantaneous and temporary act of the divine will. If God was distracted for even a moment, the universe would fall to pieces, into a meaningless anarchy of atoms. That enough for you? The job took me three days. Pay what you think is fair.”

Eco could have picked nearly anything to serve as a case study, of course, but the story that he chooses works as a metaphor for one of the central themes of the book. If the world of information is a “meaningless anarchy of atoms,” it takes the private eyes of culture to give it shape and meaning.

All the while, however, Eco is busy undermining the pretensions of his protagonists, who pay a terrible price for treating information so lightly. And it might not seem that such brokers of arcane knowledge are even necessary these days, now that an online search generates pages of results for the Mutakallimūn. Yet there’s still a place for this kind of scholarship, which might end up being the last form of brainwork not to be made obsolete by technology. As Ricky Jay knew, by specializing deeply in one particular field, you might be able to make yourself indispensable, especially in areas where the knowledge hasn’t been written down or digitized. (In the course of researching Astounding, I was repeatedly struck by how much of the story wasn’t available in any readily accessible form. It was buried in letters, manuscripts, and other primary sources, and while this happens to be the one area where I’ve actually done some of the legwork, I have a feeling that it’s equally true of every other topic imaginable.) As both Jay and Casaubon realized, it’s a role that rests on arcane knowledge of the kind that can only be acquired by reading the books that nobody else has bothered to read in a long time, even if it doesn’t pay off right away. Casaubon tells us: “In the beginning, I had to turn a deaf ear to my conscience and write theses for desperate students. It wasn’t hard; I just went and copied some from the previous decade. But then my friends in publishing began sending me manuscripts and foreign books to read—naturally, the least appealing and for little money.” But he perseveres, and the rule that he sets for himself might still be enough, if you’re lucky, to fuel an entire career:

Still, I was accumulating experience and information, and I never threw anything away…I had a strict rule, which I think secret services follow, too: No piece of information is superior to any other. Power lies in having them all on file and then finding the connections.

Written by nevalalee

November 27, 2018 at 8:41 am

The Machine of Lagado

Yesterday, my wife wrote to me in a text message: “Psychohistory could not predict that Elon [Musk] would gin up a fraudulent stock buyback price based on a pot joke and then get punished by the SEC.” This might lead you to wonder about our texting habits, but more to the point, she was right. Psychohistory—the fictional science of forecasting the future developed by Isaac Asimov and John W. Campbell in the Foundation series—is based on the assumption that the world will change in the future more or less as it has in the past. Like all systems of prediction, it’s unable to foresee black swans, like the Mule or Donald Trump, that make nonsense of our previous assumptions, and it’s useless for predicting events on a small scale. Asimov liked to compare it to the kinetic theory of gases, “where the individual molecules in the gas remain as unpredictable as ever, but the average person is completely predictable.” This means that you need a sufficiently large number of people, such as the population of the galaxy, for it to work, and it also means that it grows correspondingly less useful as it becomes more specific. On the individual level, human behavior is as unforeseeable as the motion of particular molecules, and the shape of any particular life is impossible to predict, even if we like to believe otherwise. The same is true of events. Just as a monkey or a dartboard might do as good a job of picking stocks as a qualified investment advisor, the news these days often seems to have been generated by a bot, like the Subreddit Simulator, that automatically cranks out random combinations of keywords and trending terms. (My favorite recent example is an actual headline from the Washington Post: “Border Patrol agent admits to starting wildfire during gender-reveal party.”)

And the satirical notion that combining ideas at random might lead to useful insights or predictions is a very old one. In Gulliver’s Travels, Jonathan Swift describes an encounter with a fictional machine—located in the academy of Lagado, the capital city of the island of Balnibarbi—by which “the most ignorant person, at a reasonable charge, and with a little bodily labour, might write books in philosophy, poetry, politics, laws, mathematics, and theology, without the least assistance from genius or study.” The narrator continues:

[The professor] then led me to the frame, about the sides, whereof all his pupils stood in ranks. It was twenty feet square, placed in the middle of the room. The superfices was composed of several bits of wood, about the bigness of a die, but some larger than others. They were all linked together by slender wires. These bits of wood were covered, on every square, with paper pasted on them; and on these papers were written all the words of their language, in their several moods, tenses, and declensions; but without any order…The pupils, at his command, took each of them hold of an iron handle, whereof there were forty fixed round the edges of the frame; and giving them a sudden turn, the whole disposition of the words was entirely changed.  He then commanded six-and-thirty of the lads, to read the several lines softly, as they appeared upon the frame; and where they found three or four words together that might make part of a sentence, they dictated to the four remaining boys, who were scribes.

And Gulliver concludes: “Six hours a day the young students were employed in this labour; and the professor showed me several volumes in large folio, already collected, of broken sentences, which he intended to piece together, and out of those rich materials, to give the world a complete body of all arts and sciences.”

Two and a half centuries later, an updated version of this machine figured in Umberto Eco’s novel Foucault’s Pendulum, which is where I first encountered it. The book’s three protagonists, who work as editors for a publishing company in Milan, spend the early eighties playing with their new desktop computer, which they’ve nicknamed Abulafia, after the medieval cabalist. One speaks proudly of Abulafia’s usefulness in generating random combinations: “All that’s needed is the data and the desire. Take, for example, poetry. The program asks you how many lines you want in the poem, and you decide: ten, twenty, a hundred. Then the program randomizes the line numbers. In other words, a new arrangement each time. With ten lines you can make thousands and thousands of random poems.” This gives the narrator an idea:

What if, instead, you fed it a few dozen notions taken from the works of [occult writers]—for example, the Templars fled to Scotland, or the Corpus Hermeticum arrived in Florence in 1460—and threw in a few connective phrases like “It’s obvious that” and “This proves that?” We might end up with something revelatory. Then we fill in the gaps, call the repetitions prophecies, and—voila—a hitherto unpublished chapter of the history of magic, at the very least!

Taking random sentences from unpublished manuscripts, they enter such lines as “Who was married at the feast of Cana?” and “Minnie Mouse is Mickey’s fiancee.” When strung together, the result, in one of Eco’s sly jokes, is a conspiracy theory that exactly duplicates the thesis of Holy Blood, Holy Grail, which later provided much of the inspiration for The Da Vinci Code. “Nobody would take that seriously,” one of the editors says. The narrator replies: “On the contrary, it would sell a few hundred thousand copies.”
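
Purely for illustration, the mechanism that the editors describe fits in a few lines of code. The sketch below, in Python, is a toy version of my own rather than anything from the novel: it shuffles the handful of notions quoted above and strings them together with the fatal connective phrases.

```python
import random

# Notions quoted in the passages above; any real run would draw on a much
# larger pile of "unpublished manuscripts."
statements = [
    "the Templars fled to Scotland",
    "the Corpus Hermeticum arrived in Florence in 1460",
    "Minnie Mouse is Mickey's fiancee",
    "someone was married at the feast of Cana",
]

# The seemingly innocent expressions that turn a random sequence into an argument.
connectives = ["It's obvious that", "This proves that"]

def fabricate(statements, connectives, length=4):
    """Shuffle the statements and glue them together with connective phrases."""
    chosen = random.sample(statements, k=min(length, len(statements)))
    lines = [chosen[0][0].upper() + chosen[0][1:] + "."]
    for statement in chosen[1:]:
        lines.append(f"{random.choice(connectives)} {statement}.")
    return " ".join(lines)

print(fabricate(statements, connectives))
```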

When I first read this as a teenager, I thought it was one of the great things in the world, and part of me still does. I immediately began to look for similar connections between random ideas, which led me to some of my best story ideas, and I still incorporate aspects of randomness into just about everything that I do. Yet there’s also a pathological element to this form of play that I haven’t always acknowledged. What makes it dangerous, as Eco understood, is the inclusion of such seemingly innocent expressions as “it’s obvious that” and “this proves that,” which instantly transforms a scenario into an argument. (On the back cover of the paperback edition of Foucault’s Pendulum, the promotional copy describes Abulafia as “an incredible computer capable of inventing connections between all their entries,” which is both a great example of hyping a difficult book and a reflection of how credulous we can be when it comes to such practices in real life.) We may not be able to rule out any particular combination of events, but not every explanatory system is equally valid, even if all it takes is a modicum of ingenuity to turn it into something convincing. I used to see the creation of conspiracy theories as a diverting game, or as a commentary on how we interpret the world around us, and I devoted an entire novel to exorcising my fascination with this idea. More recently, I’ve realized that this attitude was founded on the assumption that it was still possible to come to some kind of cultural consensus about the truth. In the era of InfoWars, Pizzagate, and QAnon, it no longer seems harmless. Not all patterns are real, and many of the horrors of the last century were perpetrated by conspiracy theorists who arbitrarily seized on one arrangement of the facts—and then acted on it accordingly. Reality itself can seem randomly generated, but our thoughts and actions don’t need to be.

Written by nevalalee

October 2, 2018 at 9:36 am

The surprising skepticism of The X-Files

Gillian Anderson in "Jose Chung's From Outer Space"

Note: To celebrate the twenty-fifth anniversary of the premiere of The X-Files, I’m republishing a post that originally appeared, in a somewhat different form, on September 9, 2013.

Believe it or not, this week marks the twenty-fifth anniversary of The X-Files, which aired its first episode on September 10, 1993. As much as I’d like to claim otherwise, I didn’t watch the pilot that night, and I’m not even sure that I caught the second episode, “Deep Throat.” “Squeeze,” which aired the following week, is the first installment that I clearly remember seeing on its original broadcast, and I continued to tune in afterward, although only sporadically. In its early days, I had issues with the show’s lack of continuity: it bugged me to no end that after every weekly encounter with the paranormal—any one of which should have been enough to upend Scully’s understanding of the world forever—the two leads were right back where they were at the start of the next episode, and few, if any, of their cases were ever mentioned again. Looking back now, of course, it’s easy to see that this episodic structure was what allowed the show to survive, and that it was irrevocably damaged once it began to take its backstory more seriously. In the meantime, I learned to accept the show’s narrative logic on its own terms. And I’m very grateful that I did.

It’s no exaggeration to say that The X-Files has had a greater influence on my own writing than any work of narrative art in any medium. That doesn’t mean it’s my favorite work of art, or even my favorite television show—only that Chris Carter’s supernatural procedural came along at the precise moment in my young adulthood that I was most vulnerable to being profoundly influenced by a great genre series. I was thirteen when the show premiered, toward the end of the most pivotal year of my creative life. Take those twelve months away, or replace them with a different network of cultural influences, and I’d be a different person altogether. It was the year I discovered Umberto Eco, Stephen King, and Douglas R. Hofstadter; Oliver Stone’s JFK set me on a short but fruitful detour into the literature of conspiracy; I bought a copy of Very by the Pet Shop Boys, about which I’ll have a lot more to say soon; I acquired copies of Isaac Asimov’s Science Fiction Magazine and 100 Great Science Fiction Short Short Stories; and I took my first deep dive into the work of David Lynch and, later, Jorge Luis Borges. Some of these works have lasted, while others haven’t, but they all shaped who I became, and The X-Files stood at the heart of it all, with imagery drawn in equal part from Twin Peaks and Dealey Plaza and a playful, agnostic spirit that mirrored that of the authors I was reading at the same time.

Gillian Anderson and David Duchovny in The X-Files pilot

And this underlying skepticism—which may seem like a strange word to apply to The X-Files—was a big part of its appeal. What I found enormously attractive about the show was that although it took place in a world of aliens, ghosts, and vampires, it didn’t try to force these individual elements into one overarching pattern. Even in its later seasons, when it attempted, with mixed results, to weave its abduction and conspiracy threads into a larger picture, certain aspects remained incongruously unexplained. The same world shaped by the plans of the Consortium or Syndicate also included lake monsters, clairvoyants, and liver-eating mutants, all of whom would presumably continue to go about their business after the alien invasion occurred. It never tried to convert us to anything, because it didn’t have any answers. And what I love about it now, in retrospect, is the fact that this curiously indifferent attitude toward its own mysteries arose from the structural constraints of network television itself. Every episode had to stand on its own. There was no such thing as binge-watching. The show had to keep moving or die.

Which goes a long way toward explaining why even fundamentally skeptical viewers, like me, could become devoted fans, or why Mulder and Scully could appear on the cover of the Skeptical Inquirer. It’s true that Scully was never right, but it’s remarkable how often it seemed that she could be, which is due as much to the show’s episodic construction as to Gillian Anderson’s wonderful performance. (As I’ve mentioned before, Scully might be my favorite character on any television show.) Every episode changed the terms of the game, complete with a new supporting cast, setting, and premise—and after the advent of Darin Morgan, even the tone could be wildly variable. As a result, it was impossible for viewers to know where they stood, which made a defensive skepticism seem like the healthiest possible attitude. Over time, the mythology grew increasingly unwieldy, and the show’s lack of consistency became deeply frustrating, as reflected in its maddening, only occasionally transcendent reboot. The X-Files eventually lost its way, but not until after a haphazard, often dazzling initial season that established, in spite of what its creators might do in the future, that anything was possible, and no one explanation would ever be enough. And it’s a lesson that I never forgot.

Written by nevalalee

September 14, 2018 at 9:00 am

The stuff of thought

On December 4, 1972, the ocean liner SS Statendam sailed from New York to Florida, where its passengers would witness the launch of Apollo 17, the final manned mission to the moon. The guests on the cruise included Isaac Asimov, Robert A. Heinlein, Frederik Pohl, Theodore Sturgeon, Norman Mailer, Katherine Anne Porter, and the newscaster Hugh Downs. It’s quite a story, and I’ve written about it elsewhere at length. What I’d like to highlight today, though, is what was happening a few miles away on shore, as Tom Wolfe recounts in the introduction to the paperback edition of The Right Stuff:

This book grew out of some ordinary curiosity. What is it, I wondered, that makes a man willing to sit up on top of an enormous Roman candle, such as a Redstone, Atlas, Titan, or Saturn rocket, and wait for someone to light the fuse? I decided on the simplest approach possible. I would ask a few astronauts and find out. So I asked a few in December of 1972 when they gathered at Cape Canaveral to watch the last mission to the moon, Apollo 17. I discovered quickly enough that none of them, no matter how talkative otherwise, was about to answer the question or even linger for more than a few seconds on the subject at the heart of it, which is to say, courage.

Wolfe’s “ordinary curiosity” led him to tackle a project that would consume him for the better part of a decade, driven by his discovery of “a rich and fabulous terrain that, in a literary sense, had remained as dark as the far side of the moon for more than half a century: military flying and the modern American officer corps.”

And my mind sometimes turns to the contrast between Wolfe, trying to get the astronauts to open up about their experiences, and the writers aboard the Statendam. You had Mailer, of course, who had written his own book on the moon, and the result was often extraordinary. It was more about Mailer himself than anything else, though, and during the cruise, he seemed more interested in laying out his theory of the thanatosphere, an invisible region around the moon populated by the spirits of the dead. Then you had such science fiction writers as Heinlein and Asimov, who would occasionally cross paths with real astronauts, but whose fiction was shaped by assumptions about the competent man that had been formed decades earlier. Wolfe decided to go to the source, but even he kept the pulps at the back of his mind. In his introduction, speaking of the trend in military fiction after World War I, he observes:

The only proper protagonist for a tale of war was an enlisted man, and he was to be presented not as a hero but as Everyman, as much a victim of war as any civilian. Any officer above the rank of second lieutenant was to be presented as a martinet or a fool, if not an outright villain, no matter whom he fought for. The old-fashioned tale of prowess and heroism was relegated to second- and third-rate forms of literature, ghostwritten autobiographies, and stories in pulp magazines on the order of Argosy and Bluebook.

Wolfe adds: “Even as late as the 1930s the favorite war stories in the pulps concerned World War I pilots.” And it was to pursue “the drama and psychology” of this mysterious courage in the real world that he wrote The Right Stuff.

The result is a lasting work of literary journalism, as well as one of the most entertaining books ever written, and we owe it to the combination of Wolfe’s instinctive nose for a story and his obsessiveness in following it diligently for years. Last year, in a review of John McPhee’s new collection of essays, Malcolm Harris said dryly: “I would recommend Draft No. 4 to writers and anyone interested in writing, but no one should use it as a professional guide uncritically or they’re liable to starve.” You could say much the same about Wolfe, who looks a lot like the kind of journalist we aren’t likely to see again, in part because the market has changed, but also because this kind of luck can be hard for anyone to sustain over the course of a career. Wolfe hit the jackpot on multiple occasions, but he also spent years on books that nobody read—Back to Blood, his last novel, cost its publisher a hundred dollars for every copy that it sold. (Toward the end, he could even seem out of his depth. It probably isn’t a coincidence that I never read I Am Charlotte Simmons, a novel about “Harvard, Yale, Princeton, Stanford, Duke, and a few other places all rolled into one” that was published a few years after I graduated from college. Wolfe’s insights into undergraduate life, delivered with his customary breathlessness, didn’t seem useful for understanding an experience that I had just undergone, and I’ve never forgotten the critic who suggested that the novel should have been titled I Am Easily Impressed.)

But that’s also the kind of risk required to produce major work. Wolfe’s movement from nonfiction to novels still feels like a loss, and I think that it deprived us of two or three big books of the kind that he could write better than anyone else. (It’s too bad that he never wrote anything about science fiction, which is a subject that could only be grasped by the kind of writer who could produce both The Right Stuff and The Electric Kool-Aid Acid Test.) Yet it isn’t always the monumental achievements that matter. In fact, when I think of what Wolfe has meant to me, it’s his offhand critical comments that have stuck in my head. The short introduction that he wrote to a collection of James M. Cain’s novels, in which he justifiably praised Cain’s “momentum,” has probably had a greater influence on my own style—or at least my aspirations for it—than any other single piece of criticism. His description of Umberto Eco as “a very good example of a writer who leads dozens of young writers into a literary cul-de-sac” is one that I’ll always remember, mostly because he might have been speaking of me. In college, I saw him give a reading once, shortly before the release of the collection Hooking Up. I was struck by his famous white suit, of course, but what I’ll never forget is the moment, just before he began to read, when he reached into his inside pocket and produced a pair of reading glasses—also spotlessly white. It was a perfect punchline, with the touch of the practiced showman, and it endeared Wolfe to me at times when I grew tired of his style and opinions. His voice and his ambition inspired many imitators, but at his best, it was the small stuff that set him apart.

The axioms of behavior

Earlier this week, Keith Raniere, the founder of an organization known as Nxivm, was arrested in Mexico, to which he had fled last year in the wake of a devastating investigation published in the New York Times. The article described a shady operation that combined aspects of a business seminar, a pyramid scheme, and a sex cult, with public workshops shading into a “secret sisterhood” that required its members to provide nude photographs or other compromising materials and be branded with Raniere’s initials. (In an email obtained by the Times, Raniere reassured one of his followers: “[It was] not originally intended as my initials but they rearranged it slightly for tribute.”) According to the report, about sixteen thousand people have taken the group’s courses, which are marketed as leading to “greater self-fulfillment by eliminating psychological and emotional barriers,” and some went even further. As the journalist Barry Meier wrote:

Most participants take some workshops, like the group’s “Executive Success Programs,” and resume their lives. But other people have become drawn more deeply into Nxivm, giving up careers, friends and families to become followers of its leader, Keith Raniere, who is known within the group as “Vanguard”…Former members have depicted [Raniere] as a man who manipulated his adherents, had sex with them and urged women to follow near-starvation diets to achieve the type of physique he found appealing.

And it gets even stranger. In 2003, Raniere sued the Cult Education Institute for posting passages from his training materials online. In his deposition for the suit, which was dismissed just last year, Raniere stated:

I discovered I had an exceptional aptitude for mathematics and computers when I was twelve. It was at the age of twelve I read The Second Foundation [sic] by Isaac Asimov and was inspired by the concepts on optimal human communication to start to develop the theory and practice of Rational Inquiry. This practice involves analyzing and optimizing how the mind handles data. It involves mathematical set theory applied in a computer programmatic fashion to processes such as memory and emotion. It also involves a projective methodology that can be used for optimal communication and decision making.

Raniere didn’t mention any specific quotations from Asimov, but they were presumably along the lines of the following, which actually appears in Foundation and Empire, spoken by none other than the Mule:

Intuition or insight or hunch-tendency, whatever you wish to call it, can be treated as an emotion. At least, I can treat it so…The human mind works at low efficiency. Twenty percent is the figure usually given. When, momentarily, there is a flash of greater power it is termed a hunch, or insight, or intuition. I found early that I could induce a continual use of high brain-efficiency. It is a killing process for the person affected, but it is useful.

At this point, one might be tempted to draw parallels to other cults, such as Aum Shinrikyo, that are also said to have taken inspiration from Asimov’s work. In this case, however, the connection to the Foundation series seems tangential at best. A lot of us read science fiction at the golden age of twelve, and while we might be intrigued by psychohistory or mental engineering, few of us take it in the direction that Raniere evidently did. (As one character observes in Umberto Eco’s Foucault’s Pendulum: “People don’t get the idea of going back to burn Troy just because they read Homer.”) In fact, Raniere comes off a lot more like L. Ron Hubbard, at least in the version of himself that he presents in public. In the deposition, he provided an exaggerated account of his accomplishments that will strike those who know Hubbard as familiar:

In 1988, I was accepted into the Mega Society. The requirements to be accepted into the Mega Society were to have a demonstrated IQ of 176…In 1989, I was accepted into the Guinness Book of World Records under the category “Highest IQ.” I also left my position as a Computer Programmer/Analyst and resumed business consulting with the intention to raise money to start the “Life Learning Institute.” At this point in time I became fascinated with how human motivation affected behavior. I started to refine my projective mathematical theory of the human mind to include a motivational behavior equation.

And when Raniere speaks of developing “a set of consistent axioms of how human behavior interfaced with the world,” it’s just a variation on an idea that has been recycled within the genre for decades.

Yet it’s also worth asking why the notion of a “mathematical psychology” appeals to these manipulative personalities, and why many of them have repackaged these ideas so successfully for their followers. You could argue that Raniere—or even Charles Manson—represents the psychotic fringe of an impulse toward transformation that has long been central to science fiction, culminating in the figure of the superman. (It’s probably just a coincidence, but I can’t help noting that two individuals who have been prominently linked with the group, the actresses Kristin Kreuk and Allison Mack, both appeared on Smallville.) And many cults hold out a promise of change for which the genre provides a convenient vocabulary. As Raniere said in his deposition:

In mathematics, all things are proven based on axioms and a step by step systematic construction. Computers work the same way. To program a computer one must first understand the axioms of the computer language, and then the step by step systematic construction of the problem-solution methodology. Finally, one must construct the problem-solution methodology in a step by step fashion using the axioms of the language. I discovered the human mind works the same way and I formalized the process.

This sounds a lot like Hubbard, particularly in the early days of dianetics, in which the influence of cybernetics was particularly strong. But it also represents a limited understanding of what the human mind can be, and it isn’t surprising that it attracts people who see others as objects to be altered, programmed, and controlled. The question of whether such figures as Hubbard or Raniere really buy into their own teachings resists any definitive answer, but one point seems clear enough. Even if they don’t believe it, they obviously wish that it were true.

The stories of our lives

Last week, I mentioned the evocative challenge that the writer and literary agent John Brockman recently posed to a group of scientists and intellectuals: “Ask the question for which you will be remembered.” I jokingly said that my own question would probably resemble the one submitted by the scholar Jonathan Gottschall: “Are stories bad for us?” As often happens with such snap decisions, however, this one turned out to be more revealing than I had anticipated. When I look back at my work as a writer, it’s hard to single out any overarching theme, but I do seem to come back repeatedly to the problem of reading the world as a text. My first novel, The Icon Thief, was openly inspired by Foucault’s Pendulum by Umberto Eco, which inverts the conventions of the conspiracy thriller to explore how we tell ourselves stories about history and reality. I didn’t go quite as far as Eco did, but it was a subject that I enjoyed, and it persisted to a lesser extent in my next two books. My science fiction stories tend to follow a formula that I’ve described as The X-Files in reverse, in which a paranormal interpretation of a strange event is supplanted by another that fits the same facts into a more rational pattern. And I’m currently finishing up a book that is secretly about how the stories that we read influence our behavior in the real world. As Isaac Asimov pointed out in his essay “The Sword of Achilles,” most readers are drawn to science fiction at a young age, and its values and assumptions subtly affect how they think and feel. If there’s a single thread that runs through just about everything I’ve written, then, it’s the question of how our tendency to see the world as a story—or a text—can come back to haunt us in unexpected ways.

As it happens, we’re all living right now through a vast social experiment that might have been designed to test this very principle. I got to thinking about this soon after reading an excellent essay, “The Weight of the Words,” by the political scientist Jacob T. Levy. He begins with a discussion of Trump’s “shithole countries” remark, which led a surprising number of commentators—on both the right and the left—to argue that the president’s words were less important than his actions. Levy summarizes this view: “Ignore the tweets. Ignore Trump’s inflammatory language. Ignore the words. What counts is the policy outcomes.” He continues:

I have a hard time believing that anyone really thinks like this as a general proposition…The longstanding view among conservatives was that Churchill’s “Iron Curtain” speech and Reagan’s call to “tear down this wall” were important events, words that helped to mobilize western resistance to Communism and to provide moral clarity about the stakes of that resistance.

On a more basic level, since it’s impossible for the government to accomplish everything by force, much of politics lies in emotional coercion, which suggests that words have power in themselves. Levy refers to Hannah Arendt’s argument in The Human Condition, in which a familiar figure appears:

The stature of the Homeric Achilles can be understood only if one sees him as “the doer of great deeds and the speaker of great words”…Thought was secondary to speech, but speech and action were considered to be coeval and coequal, of the same rank and the same kind; and this originally meant not only that most political action, in so far as it remains outside the sphere of violence, is indeed transacted in words, but more fundamentally that finding the right words at the right moment, quite apart from the information or communication they may convey, is action.

Levy then lists many of the obvious ways in which Trump’s words have had tangible effects—the erosion of America’s stature abroad, the undermining of trust in law enforcement and the civil service, the growth of tribalism and xenophobia, and the redefinition of what it means to be a Republican. (As Levy notes of Trump’s relationship to his supporters: “He doesn’t speak for them; how many of them had a view about ‘the deep state’ two years ago? He speaks to them, and it matters.”) Trump routinely undercuts the very notion of truth, in what seems like the ultimate example of the power of speech over the world of fact. And Levy’s conclusion deserves to be read whenever we need to be reminded of how this presidency differs from all the others that have come before:

The alleged realism of those who want to ignore words will often point to some past president whose lofty rhetoric obscured ugly policies. Whether those presidents are named “Reagan and George W. Bush” or “JFK and Barack Obama” varies in the obvious way, but the deflationary accounts are similar; there are blunders, crimes, abuses, and atrocities enough to find in the record of every American president. But all those presidents put forward a public rhetorical face that was better than their worst acts. This inevitably drives political opponents crazy: they despise the hypocrisy and the halo that good speeches put on undeserving heads. I’ve had that reaction to, well, every previous president in my living memory, at one time or another. But there’s something important and valuable in the fact that they felt the need to talk about loftier ideals than they actually governed by. They kept the public aspirations of American political culture pointed toward Reagan’s “shining city on a hill.”

He concludes of all of our previous presidents: “In words, even if not in deeds, they championed a free and fair liberal democratic order, the protection of civil liberties, openness toward the world, rejection of racism at home, and defiance against tyranny abroad. And their words were part of the process of persuading each generation of Americans that those were constitutively American ideals.” America, in short, is a story that Americans tell one another—and the world—about themselves, and when we change the assumptions behind this narrative, it has profound implications in practice. We treat others according to the roles that we’ve imagined for ourselves, or, more insidiously, that our society has imagined for us. Those roles are often restrictive, but they can also be liberating, both for good and for bad. (Levy perceptively notes that the only federal employees who don’t feel devalued these days are immigration and border agents.) And Levy sounds a warning that we would all do well to remember:

“Ignore the tweets, ignore the language, ignore the words” is advice that affects a kind of sophistication: don’t get distracted by the circus, keep your eye on what’s going on behind the curtain. This is faux pragmatism, ignoring what is being communicated to other countries, to actors within the state, and to tens of millions of fellow citizens. It ignores how all those actors will respond to the speech, and how norms, institutions, and the environment for policy and coercion will be changed by those responses. Policy is a lagging indicator; ideas and the speech that expresses them pave the way.

“Trump has spent a year on the campaign trail and a year in office telling us where he intends to take us,” Levy concludes. And we’re all part of this story now. But we should be even more worried if the words ever stop. As Arendt wrote more than half a century ago: “Only sheer violence is mute.”

My ten great books #10: Foucault’s Pendulum

Foucault's Pendulum

When a novel has been a part of your life for over twenty years, your feelings for it tend to trace the same ups and downs as those of any other friendship. An initial burst of passionate enthusiasm is followed by a long period of comfortable familiarity; you gradually start to take it for granted; and you even find your emotions beginning to cool. Faced with the same unchanging text for so long, you begin to see its flaws as well as its virtues, and if its shortcomings seem similar to your own, you can even start to resent it a little, or to question what you ever saw in it. Few books have inspired as great a range of responses in me as Foucault’s Pendulum, which in many ways is the novel that had the greatest influence on the kind of fiction I’ve attempted for most of my career. I read it at what feels in retrospect like an absurdly young age: I was thirteen, more comfortable around books than around people, and I was drawn to Umberto Eco as an exemplar of the temperament that I hoped would come from a life spent in the company of ideas. “It is a tale of books, not of everyday worries,” Eco says in the prologue to The Name of the Rose, and every line he writes is suffused with a love of history, language, art, and philosophy. Foucault’s Pendulum takes the same tendency to an even higher level: it’s a novel that often seems to be about nothing but books, with characters who exist primarily as a vehicle for long, witty conversations, crammed with esoteric lore, and a bare sliver of a thriller plot to hold it all together. For a young man who wanted to know something about everything, it was enormously attractive, and it set me off on an intellectual foxhunt that has lasted for over two decades.

Much later, as I began to write fiction of my own, I came to see how dangerous an influence this was, and I found myself agreeing with Tom Wolfe, who famously called Eco “a very good example of a writer who leads dozens of young writers into a literary cul-de-sac.” After I’d gotten my early Eco pastiches out of my system, I put the book away for a long time—although not before having read it to tatters—and I started to wonder how my writing life would have been different if I’d been sucked in by the example of, say, John Fowles or John Updike. It’s only within the last few years, after I finally wrote and published my own homage to this book’s peculiar magic, that I’ve felt free to enjoy and appreciate it on its own terms, as an odd, inimitable byway in the history of literature that just happened to play a central role in my own life. (If I’d encountered it a few years later, I wonder if I’d even have been able to finish it—I’ve never been able to get through any of Eco’s later novels.) In its final measure, Foucault’s Pendulum is one of the best of all literary entertainments, a spirited tour of some of the oddest corners of the Western mind. It’s the most genial and welcoming of encyclopedic novels, as ironic as Gravity’s Rainbow when it comes to the limits of interpretation, but too charmed by texts and libraries for its lessons to hold any sting. In the course of his research, Eco reportedly read something like a thousand works of occult literature, winnowing out and saving the best parts, and the result is a book that vibrates with the joys of the musty and obscure. And it ultimately changed me for the better. I no longer want to be Umberto Eco. But I’m very glad that Eco did.

Written by nevalalee

May 19, 2017 at 9:00 am
