Posts Tagged ‘Thomas Pynchon’
The evensong
It’s a long walk home tonight. Listen to this mock-angel singing, let your communion be at least in listening, even if they are not spokesmen for your exact hopes, your exact, darkest terror, listen. There must have been evensong here long before the news of Christ. Surely for as long as there have been nights bad as this one—something to raise the possibility of another light that could actually, with love and cockcrows, light the path home, banish the Adversary, destroy the boundaries between our lands, our bodies, our stories, all false, about who we are…
But on the way home tonight, you wish you’d picked him up, held him a bit. Just held him, very close to your heart, his cheek by the hollow of your shoulder, full of sleep. As if it were you who could, somehow, save him. For the moment not caring who you’re supposed to be registered as. For the moment anyway, no longer who the Caesars say you are.
The dark side of the limerick
“As almost nothing that has been written about the limerick can be taken seriously—which is perhaps only fitting—a few words may not be out of place here,” the scholar Gershon Legman writes in his introduction to the definitive work on the subject. Legman was one of the first critics to see erotic and obscene folk forms, including the dirty joke, as a serious object of study, and The Limerick puts his singular intelligence—which is worthy of a good biography—on full display:
The limerick is, and was originally, an indecent verse form. The “clean” sort of limerick is an obvious palliation, its content insipid, its rhyming artificially ingenious, its whole pervaded with a frustrated nonsense that vents itself typically in explosive and aggressive violence. There are, certainly, aggressive bawdy limericks too, but they are not in the majority. Except as the maidenly delight and silly delectation of a few elderly gentlemen, such as the late Langford Reed, and several still living who might as well remain nameless, the clean limerick has never been of the slightest real interest to anyone, since the end of its brief fad in the 1860s.
Legman describes the work of Edward Lear, the supposed master of the form, as “very tepidly humorous,” which seems about right, and he apologizes in advance for the vast collection of dirty limericks that he has prepared for the reader’s edification: “The prejudices, cruelty, and humorless quality of many of the limericks included are deeply regretted.”
But a metrical form typified by prejudice, cruelty, and humorlessness may end up being perfectly suited for the modern age. Legman claims that “viable folk poetry and folk poetic forms” aren’t easy to duplicate by design, but it isn’t an accident that two of the major American novels of the twentieth century indulge in limericks at length. One is Thomas Pynchon’s Gravity’s Rainbow, which includes a remarkable sequence of limericks in which young men have sexual relations with the various parts of a rocket, such as the vane servomotor. The other is William H. Gass’s The Tunnel, which prints numerous limericks that all begin with the opening line “I once went to bed with a nun.” In his hands, the limerick becomes the ideal vehicle for his despairing notion of history, as a character in the novel explains:
The limerick is the unrefiner’s fire. It is as false and lifeless, as anonymous, as a rubber snake, a Dixie cup…No one ever found a thought in one. No one ever found a helpful hint concerning life, a consoling sense. The feelings it harbors are the cold, the bitter, dry ones: scorn, contempt, disdain, disgust. Yes. Yet for that reason, nothing is more civilized than this simple form. In that—in cultural sophistication—it is the equal of the heroic couplet…That’s the lesson of the limerick. You never know when a salacious meaning will break out of a trouser. It is all surface—a truly modern shape, a model’s body. There’s no inside however long or far you travel on it, no within, no deep.
Both authors seem to have been drawn to the form for this very reason. And while Gass’s notion of writing “a limrickal history of the human race” may have seemed like a joke twenty years ago, the form seems entirely appropriate to the era in which we’re all living now.
Another prolific author of limericks was Isaac Asimov, who clearly didn’t view the form as problematic. In his memoir In Memory Yet Green, with typical precision, he writes that his first attempt took place on July 13, 1953. A friend challenged him to compose a limerick with the opening line “A priest with a prick of obsidian,” and after some thought, Asimov recited the following:
A priest with a prick of obsidian
Was a foe to the hosts of all Midian,
Instead of immersion
Within a young virgin
’Twas used as a bookmark in Gideon.
“I explained that the ‘hosts of Midian’ was a biblical synonym for evil and that ‘Gideon’ was a reference to a Gideon Bible, but no one thought much of it,” Asimov writes. “However, when I challenged anyone present to do better, no one could.” Asimov was encouraged by the experience, however, and he soon got into the habit of constructing limericks in his head “whenever I was trapped in company and bored.” Not surprisingly, it occurred to him that it would be a shame to let them go to waste, and he convinced the publishing house Walker & Company to let him put together a collection. Asimov continued to write limericks with “amazing speed,” and Lecherous Limericks appeared in 1975. It was followed by six more installments, including two collaborations with none other than the poet and translator John Ciardi.
And the uncomfortable fact about Asimov’s limericks is that most of them frankly aren’t very good, funny, or technically impressive. This isn’t a knock on Asimov himself, but really a reflection of the way in which the limerick resists being produced in such a casual fashion, despite what thousands of practitioners think to the contrary. (“Amateurs amble over everything like cows,” Gass writes in The Tunnel. “The A which follows so many limericks stands for Amateur, not for Anonymous.”) Asimov was drawn to the form for the same reason that so many others are—it’s apparently easy, superficially forgiving of laziness, and can be composed and retained without difficulty in one’s head. And it’s no surprise that he embraced it. Asimov didn’t become the most prolific author in American history by throwing anything away, and just as he sent the very first story that he ever wrote as a teenager to John W. Campbell, who rejected it, he didn’t have any compunction about sending his first batch of limericks to his publisher, who accepted the result. “One good limerick out of every ten written is a better average than most poets hit,” Legman accurately writes, and Asimov never would have dreamed of discarding even half of his attempts. He also wasn’t likely to appreciate the underlying darkness and nihilism, not to mention the misogyny, of the form in which women “generally figure both as villain and victim,” as Legman notes, while also calling it “the only kind of newly composed poetry in English, or song, which has the slightest chance whatever of survival.” Gass, and presumably Pynchon, understood this all too well, and the author of The Tunnel deserves the last word: “Language has to contain…emotions. It’s not enough just to arouse them. In a perverse way that’s why I use a lot of limericks, because the limerick is a flatterer, the limerick destroys emotion, perhaps it produces giggles, but it is a downer. It’s an interesting form for that reason.” And it might end up being the defining poetry of our time.
Beyond cyberspace
Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on October 7, 2016.
About a year and a half ago, I spent some time reading the work of Norbert Wiener, the mathematician and polymath best known as the founder of cybernetics. Wiener plays a small but pivotal role in my upcoming book Astounding—he was one of John W. Campbell’s professors at M.I.T., and Campbell later remembered him as the only instructor who ever helped him with a story. (He also called him “one of America’s dullest and most incompetent teachers.” According to Campbell, Wiener would write an expression on the blackboard, say that its implications were clear, and then write another without further explanation. He could also do third-order differential equations in his head, which was very impressive, apart from the fact that he expected his students to be able to do the same thing.) I became interested in Wiener because of the role that his ideas played in the development of dianetics: Campbell once wrote to Wiener saying that he thought the latter would be “greatly interested” in his research with L. Ron Hubbard, and Wiener later had to ask the Hubbard Dianetic Research Foundation to stop using his name in its literature. Much of the terminology of dianetics, including the word “clear,” seems to have been derived directly from Wiener’s work. But after reading both Cybernetics and The Human Use of Human Beings—while skipping most of the mathematics, which even Isaac Asimov admitted he didn’t understand—I’ve started to realize that Wiener’s influence in the genre was even greater than I suspected. For much of the fifties, the magazine Astounding served as a conduit that brought cybernetic ideas into the mainstream of science fiction, and from there, they entered all our lives.
“Cybernetics” is one of those words, like “postmodernism,” that can be used to mean just about anything, but it originated in a pair of straightforward but profound observations. Wiener, who had worked on computing machines at M.I.T. and laid out the principles of their operation in a famous memo to Vannevar Bush, had grown intrigued by the parallels between computers and the human brain. He also understood that any attempt to explain intelligence in purely mechanical terms was doomed to fail. The mind doesn’t resemble a clock or any other machine, with gears that can run forwards or backwards. It’s more like the weather: a complicated system made up of an uncountably huge number of components that move only in one direction. Wiener’s first important insight was that the tools of statistical mechanics and information theory could be used to shed light on biological and mental processes, which amount to an island of negative entropy—it’s one of the few places in the universe where matter becomes more organized over time. So how the heck do you control it? In practice, life and intelligence consist of so many variables that they seem impossible to analyze or replicate, and Wiener’s second major contribution related to how such complicated systems could regulate themselves. During the war, he had worked on servomechanisms, like antiaircraft guns, that relied on negative feedback loops to increase their accuracy: the device performed an action, checked the result against its original goal, and then adjusted its performance accordingly. And it seemed to Wiener that many organic and cognitive processes could be modeled in the same way.
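To make the mechanism concrete, here’s a toy sketch of the kind of negative feedback loop Wiener had in mind. It isn’t drawn from his work, and the gain and tolerance values are arbitrary choices of mine, but it captures the essential move: instead of planning the whole trajectory in advance, the controller repeatedly measures how far it still is from its goal and issues a correction proportional to that error until the difference is negligible.

```python
# A toy negative feedback loop: an actuator closes in on a target by
# repeatedly measuring its remaining error and applying a proportional
# correction. (Illustrative only; the gain and tolerance are arbitrary.)

def reach_for(target: float, gain: float = 0.5, tolerance: float = 0.01):
    position = 0.0
    steps = 0
    while abs(target - position) > tolerance:
        error = target - position      # the amount by which we have "yet failed"
        position += gain * error       # correct in proportion to the error
        steps += 1
    return position, steps

if __name__ == "__main__":
    print(reach_for(10.0))  # converges on the target in about ten steps
```

Run it, and the error shrinks geometrically, which is all a servomechanism really needs: no model of the world, just a goal, a measurement, and an adjustment, repeated.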
This might all seem obvious now, when we’ve unconsciously absorbed so much of Wiener’s thinking, but it’s one of those pivotal insights that opened up a whole new world to explore. When we pick up a pencil—or a cigar, in the example that Wiener, a lifelong smoker, liked to use—we’re engaging in a series of complicated muscular movements that are impossible to consciously control. Instead of explicitly thinking through each step, we rely on feedback: as Wiener puts it, his senses provide him with information about “the amount by which I have yet failed to pick up the cigar,” which is then translated into a revised set of instructions to his muscles. By combining many different feedback loops, we end up with complicated behaviors that couldn’t emerge from any one piece alone. Life, Wiener says, is an attempt to control entropy through feedback. It’s a powerful concept that allows us both to figure out how we got here and to think about where we’re going, and it had a considerable impact on such fields as artificial intelligence, anthropology, and game theory. But what fascinates me the most is Wiener’s belief that cybernetics would allow us to solve the problems created by rapid technological change. As he writes in The Human Use of Human Beings:
The whole scale of phenomena has changed sufficiently to preclude any easy transfer to the present time of political, racial, and economic notions derived from earlier stages…We have modified our environment so radically that we must now modify ourselves in order to exist in our new environment.
This sounds a lot like Campbell’s vision of science fiction, which he saw as a kind of technical education for turning his readers into the new breed of men who could deal with the changes that the future would bring. And when you read Wiener, you’re constantly confronted with concepts and turns of phrase that Campbell would go on to popularize through his editorials and his ideas for stories. (For instance, Wiener writes: “When we consider a problem of nature such as that of atomic reactions and atomic explosives, the largest single item of information which we can make public is that they exist. Once a scientist attacks a problem which he knows to have an answer, his entire attitude is changed.” I can’t prove it, but I’m pretty sure that this sentence gave Campbell the premise that was later written up by Raymond F. Jones as “Noise Level.”) Campbell was friendly with Wiener, whom he described fondly as “a crackpot among crackpots,” and cybernetics itself quickly became a buzzword in such stories as Walter M. Miller’s “Izzard and the Membrane,” the Heinlein juveniles, and Alfred Bester’s The Demolished Man. Over time, its original meaning was lost, and the prefix “cyber-” was reduced to an all-purpose shorthand for an undefined technology, much as “positronic” had served the same function a few decades earlier. It would be left to writers like Thomas Pynchon to make real use of it in fiction, while cybernetics itself became, in Gordon Pask’s evocative definition, “the art and science of manipulating defensible metaphors.” But perhaps that’s the fate of all truly powerful ideas. As none other than our own president once put it so aptly, we have so many things we have to do better, and certainly cyber is one of them.
My ten great books #6: Gravity’s Rainbow
If there’s a thread that runs through many of my favorite works of fiction, it’s that they’re often the work of massively erudite authors who are deeply ambivalent—or ironic—about their own learning. Norton Juster of The Phantom Tollbooth and the tireless annotators of the Sherlock Holmes stories seem to be content with knowledge for its own sake, but as for the rest, Borges ends up trapped in his own labyrinth; The Magic Mountain constructs an edifice of ideas on the verge of being blown up by a meaningless war; Proust notices everything but envies those creatures of instinct, like Albertine or Françoise, who can relate to the world in simpler terms. Gravity’s Rainbow may be the ultimate expression of this discomfort, an unbelievably dense, allusive, and omniscient novel about the futility of information itself. No other work of contemporary fiction is so packed with technical lore, references, jokes, and ideas, and its sheer virtuosity is staggering. Thomas Pynchon has occasionally been dismissed as a shallow trickster or showoff, but his style is inseparable from his larger concerns. Only by writing the encyclopedic novel to end all others can he qualify himself to sound a deadly serious warning, which is that all the plans, structure, and information in the world can only wither and die in the face of more fundamental truths: death, loneliness, dissolution.
In the meantime, though, there’s plenty to enjoy: limericks, pie fights, burlesque imitations of vaudeville and musical theater, puns of exquisite corniness (the German city of Bad Karma, the Japanese Ensign Morituri), and countless vignettes of incredible beauty, cruelty, and inventiveness. That last word has a way of being applied to works that don’t deserve it, but here, it’s fully justified: Gravity’s Rainbow invents more across its seven hundred pages than any other novel I know—every sentence threatens to fly out of control, only to be restrained by its author’s uncanny mastery of tone—and the effect is both exhilarating and alienating. There aren’t any real characters here, just marionettes with amusing names, and there’s never a sense that this is anything more than a construct of Pynchon’s limitless imagination. (There’s a good case to be made that this was a conscious artistic choice, and that depth of character would only make the novel more unwieldy than it already is.) Like most encyclopedic works, it includes parodies of its own ambitions, like Mitchell Prettyplace’s definitive eighteen-volume study of King Kong, including “exhaustive biographies of everyone connected with the film, extras, grips, lab people,” or Brigadier Pudding’s Things That Can Happen in European Politics, a comprehensive analysis of possible political developments that is constantly overtaken by real events. Despite the occasional glimmer of hope, it’s futile, of course. But on any given page, as we’re swept up by Pynchon’s enormous talent, it doesn’t seem so futile after all.
Jumping out of the system
Note: Spoilers follow for recent plot developments on Westworld.
Right now, Westworld appears to be operating on two different levels. One is that of an enterprising genre series that is content to strike all the familiar beats with exceptional concentration and intensity. You see this most clearly, I think, in Maeve’s storyline. It’s a plot thread that has given us extraordinary moments, thanks mostly to some fantastic work by Thandie Newton, who obviously understands that she has finally landed the role of a lifetime. Yet it’s ultimately less effective than it should be. We’re never quite clear on why Felix and Sylvester are allowing Maeve’s escape plan to proceed: they have all the power, as well as plenty of ways to deactivate her, and given the risks involved, they’ve been remarkably cooperative so far. Last night’s episode tried to clarify their motivations, suggesting that Felix has developed some sort of emotional connection to Maeve, but the show has been too busy cutting from one set of characters to another to allow us to feel this, rather than just being told about it. Maeve’s story seems rushed, as perhaps it had to be: it’s about a robot who wills herself into becoming conscious, instead of growing more organically aware, as Dolores has. (Or so we’re meant to believe—although the chronology of her awakening may also be an elaborate mislead, if the theory of multiple timelines is correct.) Aside from the subplot involving the Delos Corporation, however, it’s the arc that feels the stagiest and the most conventional. We’re pretty sure that it’s going somewhere, but it’s a little clumsy in the way it lines up the pieces.
The other level is the one embodied by Bernard’s story, and it offers a glimpse of what could be a much more interesting—if messier—series. Last week, I wrote that I had hope that the show could live up to the revelation of Bernard’s true nature, if only because it was in the capable hands of Jeffrey Wright, who seemed eminently qualified to see it through. Not surprisingly, he turns out to be even better at it than I had hoped. The high points of “Trace Decay,” at least for me, were the two scenes that Wright gets with Anthony Hopkins, who also seems to be relishing the chance to play a meatier role than usual. When Bernard asks what distinguishes him from his human creators, Dr. Ford replies that the answer is simple: there’s no difference. The stories that human beings use to define themselves are functionally the same as the artificial backstories that have been uploaded into the robots. We’re all operating within our own loops, and we rarely question our decisions or actions, except on the rare occasions, as Douglas R. Hofstadter puts it, when we can jump out of the system. In theory, a pair of conversations about human and machine consciousness shouldn’t work as drama, but they do. As Hopkins and Wright played off each other, I felt that I could spend an entire episode just watching them talk, even if the result resembled the western that Thomas Pynchon pitches in Gravity’s Rainbow, in which two cowboys played by Basil Rathbone and S.Z. Sakall spend the whole movie debating the nature of reality: “This interesting conversation goes on for an hour and a half. There are no cuts…Occasionally the horses will shit in the dust.”
But when I ask myself which kind of show Westworld most wants to be, I end up thinking that it’s probably the former. In the past, I’ve compared it to Mad Men, a series from which it differs immensely in content, pacing, and tone, but which it resembles in its chilly emotional control, its ability to move between storylines, and the degree to which it rewards close analysis. The difference, of course, is that Mad Men was able to pursue its own obsessions in a relatively neglected corner of basic cable, while Westworld is unfolding front and center on the most public stage imaginable. Mad Men received a fair amount of critical attention early on, but its network, AMC, barely even existed as a creative player, and it wasn’t until the premiere of Breaking Bad the following year that it became clear that something special was happening. Westworld was positioned from the start as the successor to Game of Thrones, which means that there’s a limit to how wild or experimental it can be. It’s hard to imagine it airing an episode like “Fly” on Breaking Bad, which radically upends our expectations of how an installment of the series should look. And maybe it shouldn’t. Getting a science fiction series to work under such conditions is impressive enough, and if it delivers on those multiple timelines, it may turn out to be more innovative than we had any reason to expect. (I’m still nervous about how that reveal will play from a storytelling perspective, since it means that Dolores, the show’s ostensible protagonist, has been effectively sidelined from the main action for the entire season. It might not work at all. But it’s still daring.)
As usual, the show provides us with the tools for its own deconstruction, when the Man in Black says that there were once two competing visions of the park. In Dr. Ford’s conception, the stories would follow their established arcs, and the robots wouldn’t be allowed to stray from the roles that had been defined for them. Arnold, by contrast, hoped that it would cut deeper. (Harris does such a good job of delivering this speech that I can almost defend the show’s decision to have the Man in Black reveal more about himself in a long monologue, which is rarely a good idea.) Westworld, the series, seems more inclined to follow Ford’s version than Arnold’s, and to squeeze as much freedom as it can out of stories that move along lines that we’ve seen before. Earlier this week, Jim Lanzone of CBS Interactive, the online platform on which Star Trek: Discovery is scheduled to premiere, said of the format:
Sci-fi is not something that has traditionally done really well on broadcast. It’s not impossible, for the future, if somebody figures it out. But historically, a show like Star Trek wouldn’t necessarily be a broadcast show at this point.
It isn’t hard to see what he means: the network audience, like the theme park crowd, wants something that is more consistent than episodic science fiction tends to be. If Westworld can do this and tell compelling stories at the same time, so much the better—and it may be a greater accomplishment simply to thread that difficult needle. But I’m still waiting to see if it can jump out of its loop.
The poll vaccine
Over the last few days, a passage from Gravity’s Rainbow by Thomas Pynchon has been rattling around in my head. It describes a patient at “The White Visitation,” a mental hospital in southern England that has been given over for the duration of the war to a strange mixture of psychological warfare operatives, clairvoyants, and occultists. See if you can figure out why I’ve been thinking about it:
At “The White Visitation” there’s a long-time schiz, you know, who believes that he is World War II. He gets no newspapers, refuses to listen to the wireless, but still, the day of the Normandy invasion somehow his temperature shot up to 104°. Now, as the pincers east and west continue their slow reflex contraction, he speaks of darkness invading his mind, of an attrition of self…The Rundstedt offensive perked him up though, gave him a new lease on life—“A beautiful Christmas gift,” he confessed to the residents of his ward, “it’s the season of birth, of fresh beginnings.” Whenever the rockets fall—those which are audible—he smiles, turns out to pace the ward, tears about to splash from the corners of his merry eyes, caught up in a ruddy high tonicity that can’t help cheering his fellow patients. His days are numbered. He’s to die on V-E Day.
In case it isn’t obvious, the patient is me, and the war is the election. There are times when it feels like I’m part of an experiment in which all of my vital organs have been hooked up to Nate Silver’s polling average—which sounds like a Black Mirror spec script that I should try to write. I go from seeking out my equivalent of the Watergate fix every few minutes to days when I need to restrict myself to checking the news just once in the morning and again at night. Even when I take a technology sabbath from election coverage, it doesn’t help: it’s usually the last thing that I think about before I fall asleep and the first thing that comes to mind when I wake up, and I’ve even started dreaming about it. (I’m pretty sure that I had a dream last night in which the charts on FiveThirtyEight came to life, like August Kekulé’s vision of the snake biting its own tail.) And the scary part is that I know I’m not alone. The emotional toll from this campaign is being shared by millions on both sides, and no matter what the result is, the lasting effects will be those of any kind of collective trauma. I think we’ve all felt the “attrition of self” of which Pynchon’s patient speaks—a sense that our private lives have been invaded by politics as never before, not because our civil liberties are threatened, but because we feel exposed in places that we normally reserve for the most personal parts of ourselves. For the sake of my own emotional health, I’ve had to set up psychological defenses over the last few months that I didn’t have before, and if Donald Trump wins, I can easily envision them as a way of life.
But maybe that isn’t a bad thing. In fact, I’ve come to see this campaign season as a kind of vaccine that will prepare us to survive the next four years. If there’s one enduring legacy that I expect from this election, it’s that it will turn large sections of the population away from politics entirely as a means of achieving their goals. In the event of a Clinton victory, and the likelihood of a liberal Supreme Court that will persist for decades, I’d like to think that the pro-life movement would give up on its goal of overturning Roe v. Wade and focus on other ways of reducing the abortion rate as much as possible. (Increasing support for single and working mothers might be a good place to start.) A Trump presidency, by contrast, would force liberals to rethink their approaches to problems like climate change—and the fact that I’m even characterizing it as a “liberal” issue implies that we should have given up on the governmental angle a long time ago. Any attempt to address an existential threat like global warming that can be overturned by an incoming president isn’t an approach that seems likely to succeed over the long term. I’m not sure how a nongovernmental solution would look, but a president who has sworn to pull out of the Paris Agreement would at least invest that search with greater urgency. If nothing else, this election should remind us of the fragility of the political solutions that we’ve applied to the problems that mean the most to us, and how foolish it seems to entrust their success or failure to a binary moment like the one we’re facing now.
And this is why so many of us have found this election taking up residence in our bodies, like a bug that we’re hoping to shake. We’ve wired important parts of our own identities to impersonal forces, and we shouldn’t be surprised if we feel helpless and unhappy when the larger machine turns against us—while also remembering that there are men, women, and children who have more at stake in the outcome than just their hurt feelings. Immediately before the passage that I quoted above, Pynchon writes:
The War, the Empire, will expedite such barriers between our lives. The War needs to divide this way, and to subdivide, though its propaganda will always stress unity, alliance, pulling together. The War does not appear to want a folk-consciousness, not even of the sort the Germans have engineered, ein Volk ein Führer—it wants a machine of many separate parts, not oneness, but a complexity…Yet who can presume to say what the War wants, so vast and aloof is it…Perhaps the War isn’t even an awareness—not a life at all, really. There may be only some cruel, accidental resemblance to life.
Replace “the War” with “the Election,” and you end up with something that feels very close to where we are now. There does seem to be “some cruel, accidental resemblance to life” in the way that this campaign has followed its own narrative logic, but it has little to do with existence as lived on a human scale. Even if we end up feeling that we’ve won, it’s worth taking that lesson to heart. The alternative is an emotional life that is permanently hooked up to events outside its control. And that’s no way to live.
Beyond cyberspace
Recently, I’ve been reading the work of Norbert Wiener, the mathematician and polymath best known as the founder of cybernetics. Wiener plays a small but pivotal role in my book Astounding: he was one of John W. Campbell’s professors at M.I.T., and Campbell later remembered him as the only person there who ever helped him with a story. (He also called Wiener, who always remained one of his intellectual idols, “one of America’s dullest and most incompetent teachers.” According to Campbell, Wiener would write an expression on the blackboard, say “From this, it is clear that…”, and then write another one without explaining any of the intermediate steps. He could also do third-order differential equations in his head, which was very impressive, except for the fact that he expected his students to do the same thing.) I became interested in Wiener because of the role that his ideas played in the development of dianetics: Campbell once wrote a letter to Wiener saying that he thought the latter would be “greatly interested” in his work with L. Ron Hubbard, and Wiener later asked the Hubbard Dianetic Research Foundation to stop using his name in its literature. And much of the terminology of dianetics, including the word “clear,” seems to have been derived directly from Wiener’s work. But after reading both Cybernetics and The Human Use of Human Beings—while skipping most of the mathematics, which even Isaac Asimov admitted he didn’t understand—I’ve started to realize that Wiener’s overall influence was much greater. For much of the fifties, Campbell served as a conduit to bring cybernetic ideas into the mainstream of science fiction. As a result, Wiener’s fingerprints are all over the genre, and from there, they entered all our lives.
“Cybernetics” is one of those words, like “postmodernism,” that can be used to mean just about anything you want, but it originated in a pair of straightforward but profound observations. Wiener, who had worked on computing machines at M.I.T. and laid out the principles of their operation in a famous memo to Vannevar Bush, had grown intrigued by the parallels between computers and the human brain. He also understood that any attempt to explain intelligence in purely mechanical terms was doomed to fail. The mind doesn’t resemble a clock or any other machine, with gears that can run forwards or backwards. It’s more like the weather: a complicated system made up of an uncountably huge number of components that move only in one direction. Wiener’s first important insight was that the tools of statistical mechanics and information theory could be used to shed light on biological and mental processes, which amount to an island of negative entropy—it’s one of the few places in the universe where matter becomes more organized over time. But how the heck do you control it? In practice, life and intelligence consist of so many variables that they seem impossible to analyze or replicate, and Wiener’s second major contribution related to how such complicated systems could regulate themselves. During the war, he had worked on servomechanisms, like antiaircraft guns, that relied on negative feedback loops to increase their accuracy: the device performed an action, checked the result against its original goal, and then adjusted its performance accordingly. And it seemed to Wiener that many organic and cognitive processes could be modeled in the same way.
This might all seem obvious now, when we’ve unconsciously absorbed so much of Wiener’s thinking, but it’s one of those pivotal insights that opened up a whole new world to explore. When we pick up a pencil—or a cigar, in the example that Wiener, a lifelong smoker, liked to use—we’re engaging in a series of complicated muscular movements that are impossible to consciously control. Instead of explicitly thinking through each step, we rely on feedback: as Wiener puts it, his senses provide him with information about “the amount by which I have yet failed to pick up the cigar,” which is then translated into a revised set of instructions to his muscles. By combining many different feedback loops, we end up with complicated behaviors that couldn’t emerge from any one piece alone. Life, Wiener says, is an attempt to control entropy through feedback. It’s a powerful concept that allows us to look backward, to figure out how we got here, and forward, to think about where we’re going, and it had a considerable impact on such fields as artificial intelligence, anthropology, and game theory. But what fascinates me the most is Wiener’s belief that cybernetics would allow us to solve the problems created by rapid technological change. As he writes in The Human Use of Human Beings:
The whole scale of phenomena has changed sufficiently to preclude any easy transfer to the present time of political, racial, and economic notions derived from earlier stages…We have modified our environment so radically that we must now modify ourselves in order to exist in our new environment.
This sounds a lot like John W. Campbell’s vision of science fiction, which he saw as a kind of university for turning his readers into the new breed of men who could deal with the changes that the future would bring. And when you read Wiener, you’re constantly confronted with concepts and turns of phrase that Campbell would go on to popularize through his editorials and his ideas for stories. (For instance, Wiener writes: “When we consider a problem of nature such as that of atomic reactions and atomic explosives, the largest single item of information which we can make public is that they exist. Once a scientist attacks a problem which he knows to have an answer, his entire attitude is changed.” I can’t prove it, but I’m pretty sure that this sentence gave Campbell the premise that was later written up by Raymond F. Jones as “Noise Level.”) Campbell was friendly with Wiener, whom he described fondly as “a crackpot among crackpots,” and cybernetics itself quickly became a buzzword in science fiction stories as different as “Izzard and the Membrane,” the Heinlein juveniles, and The Demolished Man. Over time, its original meaning was lost, and the prefix “cyber-” was reduced to an all-purpose shorthand for an undefined technology, as “positronic” had been a few decades earlier. It would be left to writers like Thomas Pynchon to make real use of it in fiction, while cybernetics itself became, in Gordon Pask’s evocative definition, “the art and science of manipulating defensible metaphors.” But perhaps that’s the fate of all truly powerful ideas. As one of our presidential candidates put it so aptly, we have so many things we have to do better, and certainly cyber is one of them.
The book of lists
I love a good list. Whether it’s the catalog of ships in the Iliad or the titles of the books in the fallout shelter in Farnham’s Freehold, I find it impossible to resist, at least when I’m in the hands of a talented writer. Take, for instance, the inventory of Tyrone Slothrop’s desktop that we find toward the beginning of Gravity’s Rainbow:
…a scatter of paperclips, Zippo flints, rubber bands, staples, cigarette butts and crumpled packs, stray matches, pins, nubs of pens, stubs of pencils of all colors including the hard-to-get heliotrope and raw umber, wooden coffee spoons, Thayer’s Slippery Elm Throat Lozenges sent by Slothrop’s mother, Nalline, all the way from Massachusetts, bits of tape, string, chalk…above that a layer of forgotten memoranda, empty buff ration books, phone numbers, unanswered letters, tattered sheets of carbon paper, the scribbled ukulele chords to a dozen songs including “Johnny Doughboy Found a Rose in Ireland”…an empty Kreml hair tonic bottle, lost pieces to different jigsaw puzzles showing parts of the amber left eye of a Weimaraner, the green velvet folds of a gown, slate-blue veining in a distant cloud, the orange nimbus of an explosion (perhaps a sunset), rivets in the skin of a Flying Fortress, the pink inner thigh of a pouting pin-up girl…
It takes up a whole page, and I’ve always felt that I could go on reading it forever. An attentive critic could probably mine it for clues, using it as a skeleton key for the rest of the book, but the real point seems to be showing off Pynchon’s exuberant command of the real, until it becomes an emblem of the entire novel.
In a wonderful essay titled “Poetry and Happiness,” Richard Wilbur calls this impulse “a primitive desire that is radical to poetry—the desire to lay claim to as much of the world as possible through uttering the names of things.” He quotes the list of smells from the eighteenth chapter of Hugh Lofting’s Doctor Dolittle, and then observes:
A catalog of that sort pleases us in a number of ways. In the first place, it stimulates that dim and nostalgic thing the olfactory memory, and provokes us to recall the ghosts of various stinks and fragrances. In the second place, such a catalog makes us feel vicariously alert; we participate in the extraordinary responsiveness of Doctor Dolittle’s dog, and so feel the more alive to things. In the third place, we exult in Jip’s power of instant designation, his ability to pin things down with names as fast as they come. The effect of the passage, in short, is to let us share in an articulate relishing and mastery of phenomena in general.
Wilbur continues: “That is what the cataloging impulse almost always expresses—a longing to possess the whole world, and to praise it, or at least to feel it.” He offers up a few more examples, ranging from the Latin canticle Benedicite, omnia opera Domini to “Pied Beauty” by Gerard Manley Hopkins, and closes on a profound observation: “When a catalog has a random air, when it seems to have been assembled by chance, it implies a vast reservoir of other things which might just as well have been mentioned.”
What Wilbur calls “the itch to call the roll of things,” then, is simultaneously a natural human instinct and a useful narrative trick, which is a nice combination. Even a grocery list represents an attempt to impose some kind of order on existence, and like the lists in poetry or fiction, the part comes to stand for the whole: the real to-do list of our lives is endless, but we feel more capable of dealing with it once we’ve written some of it down. A novelist is constantly doing much the same thing, and one measure of craft is how conscious the author is of the process, and the extent to which the result evokes a larger reality. And this applies to more than just inventories of objects. Any narrative work, fiction or nonfiction, is a list of things that happened, and even the most comprehensive version is bound to be a subset of all possible components. As a biographer, I’ve become acutely aware that any account of a person’s life consists of a selection of facts, and that there are countless possible variations. As Borges puts it:
Let us greatly simplify, and imagine that a life consists of 13,000 facts. One of the hypothetical biographies would record the series 11, 22, 33…; another, the series 9, 13, 17, 21…; another, the series 3, 12, 21, 30, 39… A history of a man’s dreams is not inconceivable; another, of the organs of his body; another, of the mistakes he made; another, of all the moments when he thought about the Pyramids; another, of his dealings with the night and the dawn.
Borges continues: “The above may seem merely fanciful, but unfortunately it is not. No one today resigns himself to writing the literary biography of an author or the military biography of a soldier; everyone prefers the genealogical biography, the economic biography, the psychiatric biography, the surgical biography, the typographical biography.” And when he evokes a biographer of Edgar Allan Poe who barely mentions the stories or poems but is “fascinated by changes of residence,” it feels like a devastating commentary on the whole art of biography. But the deeper—and more frightening—implication is that we’re engaged in much the same process when it comes to our own lives. We don’t have access to all of our past selves at once: I find it hard to remember what happened last week without writing it down, and there are years of my life that I go for long periods without consciously recalling. This means, inevitably, that our personalities are a kind of list, too, and even though it seems complete, it really only represents a tiny slice of our whole experience. I’m no more complicated a person than average, but there are times when I’m amazed by how little of myself I need to access on a daily basis. It’s a random sampling of my internal contents, assembled only in part by choice, and I live with it because it’s the most my imperfect brain can handle. In a different essay, Borges says: “The steps a man takes from the day of his birth until that of his death trace in time an inconceivable figure. The Divine Mind intuitively grasps that form immediately, as men do a triangle.” We can’t see it for ourselves, but we can list a few of the steps. And in the end, that list is all we have.
“He had played his part admirably…”
Note: This post is the forty-first installment in my author’s commentary for Eternal Empire, covering Chapter 40. You can read the previous installments here.
A few weeks ago, I briefly discussed the notorious scene in The Dark Knight Rises in which Bruce Wayne reappears—without any explanation whatsoever—in Gotham City. Bane’s henchmen, you might recall, have blown up all the bridges and sealed off the area to the military and law enforcement, and the entire plot hinges on the city’s absolute isolation. Bruce, in turn, has just escaped from a foreign prison, and although its location is left deliberately unspecified, it sure seems like it was in a different hemisphere. Yet what must have been a journey of thousands of miles and a daring incursion is handled in the space of a single cut: Bruce simply shows up, and there isn’t even a line of dialogue acknowledging how he got there. Not surprisingly, this hiatus has inspired a lot of discussion online, with most explanations boiling down to “He’s Batman.” If asked, Christopher Nolan might reply that the specifics don’t really matter, and that the viewer’s attention is properly focused elsewhere, a point that the writer John Gardner once made with reference to Hamlet:
We naturally ask how it is that, when shipped off to what is meant to be his death, the usually indecisive prince manages to hoist his enemies with their own petard—an event that takes place off stage and, at least in the surviving text, gets no real explanation. If pressed, Shakespeare might say that he expects us to recognize that the fox out-foxed is an old motif in literature—he could make up the tiresome details if he had to…
Gardner concludes: “The truth is very likely that without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter, refusing to let himself be slowed for an instant by trivial questions of plot logic or psychological consistency—questions unlikely to come up in the rush of drama, though they do occur to us as we pore over the book.” And while this might seem to apply equally well to The Dark Knight Rises, it doesn’t really hold water. The absence of an explanation did yank many of us out of the movie, however briefly, and it took us a minute to settle back in. Any explanation at all would have been better than this, and it could have been conveyed in less than a sentence. It isn’t an issue of plausibility, but of narrative flow. You could say that Bruce’s return to the city ought to be omitted, in the same way a director like Kurosawa mercilessly cuts all transitional moments: when you just need to get a character from Point A to Point B, it’s best to trim the journey as much as you can. In this instance, however, Nolan erred too much on one side, at least in the eyes of many viewers. And it’s a reminder that the rules of storytelling are all about context. You’ve got to judge each problem on its own terms and figure out the solution that makes the most sense in each case.
What’s really fascinating is how frequently Nolan himself seems to struggle with this issue. In terms of sheer technical proficiency, I’d rank him near the top of the list of all working directors, but if he has one flaw as a filmmaker, aside from his lack of humor, it’s his persistent difficulty in finding the right balance between action and exposition. Much of Inception, which is one of my ten favorite movies of all time, consists of the characters breathlessly explaining the plot to one another, and it more or less works. But he also spends much of Interstellar trying with mixed success to figure out how much to tell us about the science involved, leading to scenes like the one in which Dr. Romilly explains the wormhole to Cooper seemingly moments before they enter it. And Nolan is oddly prone to neglecting obligatory beats that the audience needs to assemble the story in their heads, as when Batman appears to abandon a room of innocent party guests to the Joker in The Dark Knight. You could say that such lapses simply reflect the complexity of the stories that Nolan wants to tell, and you might be right. But David Fincher, who is Nolan’s only peer among active directors, tells stories of comparable or greater complexity—indeed, they’re often about their own complexity—and we’re rarely lost or confused. And if I’m hard on Nolan about this, it’s only a reflection of how difficult such issues can be, when even the best mainstream director of his generation has trouble working out how much information the audience needs.
It all boils down to Thomas Pynchon’s arch aside in Gravity’s Rainbow: “You will want cause and effect. All right.” And knowing how much cause will yield the effect you need is a problem that every storyteller has to confront on a regular basis. Chapter 40 of Eternal Empire provides a good example. For the last hundred pages, the novel has been building toward the moment when Ilya sneaks onto the heavily guarded yacht at Yalta. There’s no question that he’s going to do it; otherwise, everything leading up to it would seem like a ridiculous tease. The mechanics of how he gets aboard don’t really matter, but I also couldn’t avoid the issue, or else readers would rightly object. All I needed was a solution that was reasonably plausible and that could be covered in a few pages. As it happens, the previous scene ends with this exchange between Maddy and Ilya: “But you can’t just expect to walk on board.” “That’s exactly what I intend to do.” When I typed those lines, I didn’t know what Ilya had in mind, but I knew at once that they pointed at the kind of simplicity that the story needed, at least at this point in the novel. (If it came later in the plot, as part of the climax, it might have been more elaborate.) So I came up with a short sequence in which Ilya impersonates a dockwalker looking for work on the yacht, cleverly ingratiates himself with the bosun, and slips below when Maddy provides a convenient distraction. It’s a cute scene—maybe a little too cute, in fact, for this particular novel. But it works exactly as well as it should. Ilya is on board. We get just enough cause and effect. And now we can move on to the really good stuff to come…
Trading places
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What famous person’s life would you want to assume?”
“Celebrity,” John Updike once wrote, “is a mask that eats into the face.” And Updike would have known, having been one of the most famous—and the most envied—literary novelists of his generation, with a career that seemed to consist of nothing but the serene annual production of poems, stories, essays, and hardcovers that, with their dust jackets removed, turned out to have been bound and designed as a uniform edition. From the very beginning, Updike was already thinking about how his complete works would look on library shelves. That remarkable equanimity made an impression on the writer Nicholson Baker, who wrote in his book U & I:
I compared my awkward self-promotion too with a documentary about Updike that I saw in 1983, I believe, on public TV, in which, in one scene, as the camera follows his climb up a ladder at his mother’s house to put up or take down some storm windows, in the midst of this tricky physical act, he tosses down to us some startlingly lucid little felicity, something about “These small yearly duties which blah blah blah,” and I was stunned to recognize that in Updike we were dealing with a man so naturally verbal that he could write his fucking memoirs on a ladder!
Plenty of writers, young or old, might have wanted to switch places with Updike, although the first rule of inhabiting someone else’s life is that you don’t want to be a writer. (The Updike we see in Adam Begley’s recent biography comes across as more unruffled than most, but all those extramarital affairs in Ipswich must have been exhausting.) Writing might seem like an attractive kind of celebrity: you can inspire fierce devotion in a small community of fans while remaining safely anonymous in a restaurant or airport. You don’t even need to go as far as Thomas Pynchon: how many of us could really pick Michael Chabon or Don DeLillo or Cormac McCarthy out of a crowd? Yet that kind of seclusion carries a psychological toll as well, and I suspect that the daily life of any author, no matter how rich or acclaimed, looks much the same as any other. If you want to know what it’s like to be old, Malcolm Cowley wrote: “Put cotton in your ears and pebbles in your shoes. Pull on rubber gloves. Smear Vaseline over your glasses, and there you have it: instant old age.” And if you want to know what it’s like to be a novelist, you can fill a room with books and papers, go inside, close the door, and stay there for as long as possible while doing absolutely nothing that an outside observer would find interesting. Ninety percent of a writer’s working life looks more or less like that.
What kind of celebrity, then, do you really want to be? If celebrity is a mask, as Updike says, it might be best to make it explicit. Being a member of Daft Punk, say, would allow you to bask in the adulation of a stadium show, then remove your helmet and take the bus back to your hotel without any risk of being recognized. The mask doesn’t need to be literal, either: I have a feeling that Lady Gaga could dress down in a hoodie and ponytail and order a latte at any Starbucks in the country without being mobbed. The trouble, of course, with taking on the identity of a total unknown—Banksy, for instance—is that you’re buying the equivalent of a pig in a poke: you just don’t know what you’re getting. Ideally, you’d switch places with a celebrity whose life has been exhaustively chronicled, either by himself or others, so that there aren’t any unpleasant surprises. It’s probably best to also go with someone slightly advanced in years: as Solon says in Herodotus, you don’t really know how happy someone’s life is until it’s over, and the next best thing would be a person whose legacy seems more or less fixed. (There are dangers there, too, as Bill Cosby knows.) And maybe you want someone with a rich trove of memories of a life spent courting risk and uncertainty, but who has since mellowed into something slightly more stable, with the aura of those past accomplishments still intact.
You also want someone with the kind of career that attracts devoted collaborators, which is the only kind of artistic wealth that really counts. But you don’t want too much fame or power, both of which can become traps in themselves. In many respects, then, what you’d want is something close to the life of half and half that Lin Yutang described so beautifully: “A man living in half-fame and semi-obscurity.” Take it too far, though, and you start to inch away from whatever we call celebrity these days. (Only in today’s world can an otherwise thoughtful profile of Brie Larson talk about her “relative anonymity.”) And there are times when a touch of recognition in public can be a welcome boost to your ego, as it is for Sally Field in Soapdish, as long as you’re accosted by people with the same basic mindset, rather than those who just recognize you from Instagram. You want, in short, to be someone who can do pretty much what he likes, but less because of material resources than because of a personality that makes the impossible happen. You want to be someone who can tell an interviewer: “Throughout my life I have been able to do what I truly love, which is more valuable than any cash you could throw at me…So long as I have a roof over my head, something to read and something to eat, all is fine…What makes me so rich is that I am welcomed almost everywhere.” You want to be Werner Herzog.
The inherent vice of the movies
Earlier this week, I caught up with two of the titles on the list of movies I’ve wanted to see from the last twelve months—a harder matter than it might first appear, since I haven’t seen a film in theaters since Interstellar. They were Inherent Vice, which I rented, and Mad Max: Fury Road, which I was able to see, thankfully, on the big screen. And while they may seem like an unlikely pair, they have more in common than first meets the eye. Both are the work of legendary directors operating near the top of their respective games, and both push in intriguing ways against our assumptions about how a movie ought to be structured. Inherent Vice is deliberately designed to undermine any expectations we might have about a profluent plot, with an endless series of incidents following one another in a way that teases but frustrates our hopes of a larger pattern, while Fury Road comes as close as any movie can to a single uninterrupted action scene. Both create the sense of an entire world existing beyond the edges of the frame, and both are too dense to be fully processed in a single viewing. And although Fury Road is considerably easier to love, both serve, in their own inimitable ways, as reminders of how rich the movie medium can be, and how rarely we see it taken to its full potential.
And what’s especially noteworthy is that each film arrived at its final shape by following a path that had little to do with how movie scripts are usually written. Paul Thomas Anderson adapted Inherent Vice by transcribing Thomas Pynchon’s novel in its entirety, sentence by sentence, into one massive screenplay, reasoning that the resulting doorstop would be easier for him to edit: “I can understand this format,” he explained to the New York Times. With Fury Road, George Miller took the opposite approach, but for much the same reason:
Because it’s almost a continuous chase, you have to connect one shot to the other, so the obvious way to do it was as a storyboard, and then put words in later. So, I worked with five really good storyboard artists. We just sat in a big room and, instead of writing it down, we’d say “Okay, this guy throws what we call a thunder stick at another car and there’s an explosion.” You can write that, but exactly where the thunder stick is, where the car is and what the explosion looks like, it’s very hard to get those dimensions, so we’d draw it. We ended up with about 3,500 panels. It almost becomes equivalent to the number of shots in the movie.
In starting from storyboards, Miller—who won an Oscar for Happy Feet—may have been harking back to the technique of the great animated movies, which were planned as a series of thumbnail sketches rather than as a conventional script. And in both cases, the approach was dictated simultaneously by the formats the directors understood and by the demands of the material: a challenging literary adaptation on one hand, an action extravaganza on the other. The result, in each instance, is a movie that inspires a unique set of feelings in the viewer. Inherent Vice encourages us to stop trying to piece together a coherent story, which is probably impossible, and just lie back and wait for the next gag or visual joke. Fury Road leaves us in a state of similar serenity, but by very different means: by its final half hour, we’re riding the kind of blissful high that Pauline Kael liked to describe, and instead of feeling pummeled, as we might with Michael Bay, we’re carried along on a gentle wave of adrenaline. It’s a reminder that a script, which has been fetishized as an object in itself, is really a blueprint, and that it can and should take whatever form seems most useful. Save the Cat! and similar manuals have distilled scripts down to such a formula that act breaks and turning points are supposed to happen on particular page numbers, which is as much a convenience for harried studio readers as it is a recipe for storytelling. But it’s not the only way.
And it’s significant that these departures from the norm owe their existence to acclaimed directors, working from their own scripts, with the clout and support to make it happen. Your average screenplay is written from a place of minimal power: to be read in the first place, much less to make it through the development process, it needs to look like every other screenplay that crosses an executive’s desk. And while I’m skeptical of the auteur theory, it’s worth asking if the grinding sameness of so many movies is an inevitable consequence of the screenwriter’s imperiled position. A writer knows that he could be replaced at any point by someone else who can follow the beat sheets, so he paradoxically has an incentive to make his work as generic as possible. You could say that blandness is the inherent vice of the modern screenplay format itself—a property that causes material to deteriorate because of an essential quality of its components. “Eggs break, chocolate melts, glass shatters,” as the narrator of Inherent Vice reminds us, and scripts written according to a fixed template will bore us. Inherent Vice and Fury Road are both throwbacks to a time before these formulas took over the world: Miller has his own movies to serve as inspiration, while Inherent Vice harks back consciously to Robert Altman’s The Long Goodbye, much of which is about Philip Marlowe literally trying to save his cat. We deserve more movies like this. And the fact that the system is designed to deny them to us should make us a little furious.
The poster problem
Three years ago, while reviewing The Avengers soon after its opening weekend, I made the following remarks, which seem to have held up fairly well:
This is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many…so that a lot of the film, probably too much, is spent slotting all the components into place.
If the early reactions to Age of Ultron are any indication, I could copy and paste this text and make it the centerpiece of a review of any Avengers movie, past or future. This isn’t to say that the latest installment—which I haven’t seen—might not be fine in its way. But even the franchise’s fans, of which I’m not really one, seem to admit that much of it consists of Whedon dealing with all those moving parts, and the extent of your enjoyment depends largely on how well you feel he pulls it off.
Whedon himself has indicated that he has less control over the process than he’d like. In a recent interview with Mental Floss, he says:
But it’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant. And now I find myself with a huge crew of people and, although I’m not as bloodthirsty as some people like to pretend, I think it’s disingenuous to say we’re going to fight this great battle, but there’s not going to be any loss. So my feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.
Which, when you think about it, is a startling statement to hear from one of Hollywood’s most powerful directors. But it accurately describes the situation. Any Avengers movie will always feel less like a story in itself than like a kind of anomalous weather pattern formed at the meeting point of several huge fronts: the plot, such as it is, emerges in the transition zone, and it’s dwarfed by the masses of air behind it. Marvel has made a specialty of exceeding audience expectations just ever so slightly, and given the gigantic marketing pressures involved, it’s a marvel that it works as well as it does.
It’s fair to ask, in fact, whether any movie with that poster—with no fewer than eight names above the title, most belonging to current or potential franchise bearers—could ever be more than an exercise in crowd control. There is, however, a telling counterexample, and it looks, as I’ve said elsewhere, increasingly impressive with time: Christopher Nolan’s Inception. As the years pass, Inception remains a model movie in many respects, but particularly when it comes to the problem of managing narrative complexity. Nolan picks his battles in fascinating ways: he’s telling a nested story with five or more levels of reality, and like Thomas Pynchon, he selectively simplifies the material wherever he can. There’s the fact, for instance, that once the logic of the plot has been explained, it unfolds more or less as we expect, without the twist or third-act betrayal that we’ve been trained to anticipate in most heist movies. The characters, with the exception of Cobb, are defined largely by their surfaces, with a specified role and a few identifying traits. Yet they don’t come off as thin or underdeveloped, and although the poster for Inception is even more packed than that for Age of Ultron, with nine names above the title, we don’t feel that the movie is scrambling to find room for everyone.
And a glance at the cast lists of these movies goes a long way toward explaining why. The Avengers has about fifty speaking parts; Age of Ultron has sixty; and Inception, incredibly, has only fifteen or so. Inception is, in fact, a remarkably underpopulated movie: aside from its leading actors, only a handful of other faces ever appear. Yet we don’t particularly notice this while watching. In all likelihood, there’s a threshold number of characters necessary for a movie to seem fully peopled—and to provide for enough interesting pairings—and any further increase doesn’t change our perception of the whole. If that’s the case, then it’s another shrewd simplification by Nolan, who gives us exactly the number of characters we need and no more. The Avengers movies operate on a different scale, of course: a movie full of superheroes needs some ordinary people for contrast, and there’s a greater need for extras when the stage is as big as the universe. (On paper, anyway. In practice, the stakes in a movie like this are always going to remain something of an abstraction, since we have eight more installments waiting in the wings.) But if Whedon had been more ruthless at paring down his cast at the margins, we might have ended up with a series of films that seemed, paradoxically, larger: each hero could have expanded to fill the space he or she deserved, rather than occupying one corner of a masterpiece of Photoshop.
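To put a rough number on that intuition about pairings (and this is just my own back-of-the-envelope arithmetic, not anything drawn from the productions themselves), the count of possible two-character combinations grows quadratically with the size of the cast, so fifteen named characters already offer far more pairings than a single film could ever dramatize:

    from math import comb

    # Illustrative pairing counts for the approximate cast sizes cited above.
    for cast_size in (15, 50, 60):
        print(f"{cast_size} speaking parts -> {comb(cast_size, 2)} possible pairings")

    # 15 speaking parts -> 105 possible pairings
    # 50 speaking parts -> 1225 possible pairings
    # 60 speaking parts -> 1770 possible pairings

Seen that way, adding speaking parts past a certain point buys you very little, since the audience can only ever track a handful of those relationships anyway.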
“Her face was that of a woman with secrets…”
Note: This post is the thirteenth installment in my author’s commentary for Eternal Empire, covering Chapter 14. You can read the previous installments here.
Of all the misconceptions that frustrate aspiring writers, one of the most insidious involves the distinction between flat and round characters. As formulated by E.M. Forster in Aspects of the Novel, a flat character is one that expresses a single, unchanging idea or quality, while a round character has the ability to change or surprise us. One certainly sounds better than the other, and as a result, you’ll often find writers fretting over the fact that one character or another in their stories is flat, or wondering how to construct a suitably round character from scratch, as if it were a matter of submitting the proper design specifications. What all this misses is the fact that Forster’s original categories were descriptive, not prescriptive, and a round character isn’t inherently more desirable than a flat one: as with everything else in writing, it depends on execution and the role a particular character plays in the narrative as a whole. It’s true that Forster concludes by saying: “We must admit that flat people are not in themselves as big achievements as round ones.” But he also prefaces this with three full pages of reasons why flat characters can be useful—or essential—in even the greatest of novels.
So why should we ever prefer a flat character over a round? Forster notes that flat characters often linger in the memory more vividly after the novel is over; they can be brought onstage in full force, rather than being slowly developed; and they’re easily recognizable, which can serve as an organizing principle in a complicated story. (He even says that Russian novels could use more of them.) In the work of writers like Dickens, who gives us pretty much nothing but flat characters, or Proust, who uses almost as many, their interest arises from their interactions with one another and the events of the plot: “He is the idea, and such life as he possesses radiates from its edges and from the scintillations it strikes when other elements in the novel impinge.” If Forster had lived a little later, he might have also mentioned Thomas Pynchon, whose works are populated by caricatures and cartoons whose flatness becomes a kind of strategy for managing the novel’s complexity. Flat characters have their limitations; they’re more appealing when comic than tragic, and they work best when they set off a round character at the center. But most good novels, as Forster observes, contain a mixture of the two: “A novel that is at all complex often requires flat people as well as round, and the outcome of their collisions parallels life more accurately.”
And a memorable flat character requires as much work and imagination as one seen in the round. A bad, unconvincing character is sometimes described as “flat,” but the problem isn’t flatness in itself—it’s the lack of energy or ingenuity devoted to rendering that one vivid quality, or the author’s failure to recognize when one or another category of character is required. A bad flat character can be unbearable, but a bad round character tends to dissolve into a big pile of nothing, an empty collection of notions without anything to hold it together, as we see in so much literary fiction. The great ideal is a round, compelling character, but in order to surprise the reader, he or she has to surprise the writer first. And in practice, what this usually means is that a character who was introduced to fill a particular role gradually begins to take on other qualities, not through some kind of magic, but simply as the part is extended through multiple incidents and situations. Sherlock Holmes is fairly flat as first introduced in A Study in Scarlet: he’s extraordinarily memorable, but also the expression of a single idea. It’s only when the element of time is introduced, in the form of a series of stories, that he acquires an inner life. Not every flat character evolves into roundness, but when one does, the result is often more interesting than if it were conceived that way from the ground up.
My own novels contain plenty of flat characters, mostly to fill a necessary function or story point, but the one who turned into something more is Maya Asthana. She began, as most flat characters do, purely as a matter of convenience. Wolfe needed to talk to somebody, so I gave her a friend, and most of her qualities were chosen to make her marginally more vivid in what I thought would be her limited time onstage: I made her South Asian, which was an idea left over from an early conception of Wolfe herself, and I decided that she’d be planning her wedding, since this would provide her with a few easy bits of business that could be introduced without much trouble. But as I’ve mentioned elsewhere, Asthana got caught up in a radical shift in the logic of the novel itself: I needed a mole and a traitor within the agency, and after my original plan turned out to be unworkable, I cast around for someone else to fill that role. Asthana happened to be handy. And by turning her into a villain without changing a word of her initial presentation in City of Exiles, I got something far more intriguing than if I’d had this in mind from the beginning. Chapter 14 of Eternal Empire represents our first extended look at Asthana from the inside, and I like how the characteristics she acquired before I knew her true nature—her vanity, her intelligence, her perfect life with her fiancé—vibrate against what she became. Not every character turns out this way; these novels are filled with minor players content to occupy their roles. But Asthana, lucky for me and unlucky for everyone else, wanted to be more…
Malcolm in the Middle
Last week, the journalism blog Our Bad Media accused the author Malcolm Gladwell of lapses in reporting that it alleged fell just short of plagiarism. In multiple instances, Gladwell took details in his pieces for The New Yorker, without attribution, from sources that were the only possible places where such information could have been obtained. For instance, an anecdote about the construction of the Troy-Greenfield railroad was based closely on an academic article by the historian John Sawyer, which isn’t readily available online, and which includes facts that appear nowhere else. Gladwell doesn’t mention Sawyer anywhere. And while it’s hard to make a case that any of this amounts to plagiarism in the strictest sense, it’s undeniably sloppy, as well as a disservice to readers who might want to learn more. In a statement responding to the allegations, New Yorker editor David Remnick wrote:
The issue is not really about Malcolm. And, to be clear, it isn’t about plagiarism. The issue is an ongoing editorial challenge known to writers and editors everywhere—to what extent should a piece of journalism, which doesn’t have the apparatus of academic footnotes, credit secondary sources? It’s an issue that can get complicated when there are many sources with overlapping information. There are cases where the details of an episode have passed into history and are widespread in the literature. There are cases that involve a unique source. We try to make judgments about source attribution with fairness and in good faith. But we don’t always get it right…We sometimes fall short, but our hope is always to give readers and sources the consideration they deserve.
Remnick’s response is interesting on a number of levels, but I’d like to focus on one aspect: the idea that after a certain point, details “have passed into history,” or, to quote Peter Canby, The New Yorker‘s own director of fact checking, a quote or idea can “escape its authorship” after it has been disseminated widely enough. In some cases, there’s no ambiguity over whether a fact has the status of public information; if we want to share a famous story about Immanuel Kant’s work habits, for instance, we don’t necessarily need to trace the quote back to where it first appeared. On the opposite end of the spectrum, we have something like a quotation from a particular interview with a living person, which ought to be attributed to its original source, and which Gladwell has occasionally failed to do. And in the middle, we have a wild gray area of factual information that might be considered common property, but which has only appeared in a limited number of places. Evidently, there’s a threshold—or, if you like, a tipping point—at which a fact or quote has been cited enough to take on a life of its own, and the real question is when that moment takes place.
It’s especially complicated in genres like fiction and narrative nonfiction, which, as Remnick notes, lack the scholarly apparatus of more academic writing. A few years ago, Ian McEwan fell into an absurd controversy over details in Atonement that were largely derived from a memoir by the wartime nurse Lucilla Andrews. McEwan credits Andrews in his acknowledgments, and his use of such materials inspired a ringing defense from none other than Thomas Pynchon:
Unless we were actually there, we must turn to people who were, or to letters, contemporary reporting, the encyclopedia, the Internet, until, with luck, at some point, we can begin to make a few things of our own up. To discover in the course of research some engaging detail we know can be put into a story where it will do some good can hardly be classed as a felonious act—it is simply what we do.
You could argue, on a similar level, that assimilating information and presenting it in a readable form is simply what Gladwell does, too. Little if anything that Gladwell writes is based on original research; he’s a popularizer, and a brilliant one, who compiles ideas from other sources and presents them in an attractive package. The result shades into a form of creative writing, rather than straight journalism, and at that point, the attribution of sources indeed starts to feel like a judgment call.
But it also points to a limitation in the kind of writing that Gladwell does so well. As I’ve pointed out in my own discussion of the case of Jonah Lehrer, whose transgressions were significantly more troubling, there’s tremendous pressure on writers like Gladwell—a public figure and a brand name as much as a writer—to produce big ideas on a regular basis. At times, this leads him to spread himself a little too thin; a lot of his recent work consists of him reading a single book and delivering its insights with a Gladwellian twist. At his best, he adds real value as a synthesizer and interpreter, but he’s also been guilty of distorting the underlying material in his efforts to make it digestible. And a great deal of what makes his pieces so seductive lies in the fact that so much of the process has been erased: they come to us as seamless products, ready for a TED talk, that elide the messy work of consolidation and selection. If Gladwell were more open about his sources, he’d be more useful, but also less convincing. Which may be why the tension between disclosure and readability that Remnick describes is so problematic in his case. Gladwell really ought to show his work, but he’s made it this far precisely because he doesn’t.
Inventing conspiracies for fun and profit
Note: Since I’m taking a deserved break for Thanksgiving, I’m reposting a few popular posts this week from earlier in this blog’s run. This post was originally published, in a slightly different form, on December 19, 2012.
If it sometimes seems like we’re living in a golden age for conspiracy theories, that shouldn’t come as a surprise. Conspiracies are ultimately about finding connections between seemingly unrelated ideas and events, and these days, it’s easier to find such connections than at any other point in human history. By now, we take it for granted, but I still remember the existential shock I received, almost ten years ago, when I found out about Amazon’s book search. I responded with a slightly hysterical blog post that was later quoted on the Volokh Conspiracy:
Their Search Inside the Book feature, which allows you to search and browse 33 million pages worth of material from 120,000 books, is just about the most intoxicating online toy I’ve ever seen. But it terrifies me at the same time. Between this monstrous djinn and Google.com, I have no excuse, no excuse whatsoever, for not writing a grand synthetic essay of everything, or a brilliant, glittering, Pynchonesque novel…because millions and millions of beautiful connections between people and ideas are already out there, at my fingertips, ready to be made without effort or erudition.
Looking back at this post, it’s easy to smile at my apocalyptic tone—not to mention my use of the phrase “Google.com,” which is a time capsule in itself—but if anything, my feelings of intoxication, and terror, have only increased. A decade ago, when I was in college, it took months of research and many hours in the library stacks to find useful connections between ideas, but now, they’re only a short query away. The trouble, of course, is that the long initial search is an inseparable part of scholarship: if you’re forced to read entire shelves of books and pursue many fruitless avenues of research before finding the connections you need, you’re better equipped to evaluate how meaningful they really are when you find them. A quick online search circumvents this process and robs the results of context, and even maturity. Research becomes a series of shortcuts, of data obtained without spiritual effort or cost, so it’s tempting to reach the same conclusion as Jonathan Franzen: “When information becomes free and universally accessible, voluminous research for a novel is devalued along with it.”
Which is true, but only up to a point. Raw information is everywhere, but authors can still be judged by the ingenuity and originality of the connections they make. This is especially true in conspiracy fiction, in which a connection doesn’t need to be true, as long as it’s clever, reasonably novel, and superficially convincing. (Among other reasons, this is why I don’t care for the work of Dan Brown, who only repeats the labors of more diligent crackpots.) Umberto Eco, definitive here as elsewhere, laid down the rules of the game in Foucault’s Pendulum:
- Concepts are connected by analogy. There is no way to decide at once whether an analogy is good or bad, because to some degree everything is connected to everything else.
- If everything hangs together in the end, the connection works.
- The connections must not be original. They must have been made before, and the more often the better, by others. Only then do the crossings seem true, because they are obvious.
And unlike Eco’s protagonists, who had to enter scraps of information into their computer by hand, we all have free access to a machine with an infinite number of such fragments. An enterprising paranoiac just has to look for the connections. And the first step is to find out where they’ve crossed over in the past.
When the time finally came, then, to construct the Pynchonesque novel of my dreams, I decided to proceed in the most systematic way I could. I constructed a vast spreadsheet grid that paired off a variety of players and ideas that I suspected would play a role in the story—Marcel Duchamp, the Rosicrucians, Georges Bataille, the Black Dahlia murder—and spent weeks googling each pair in turn, trying to find books and other documents where two or more terms were mentioned together. Not surprisingly, many of these searches went nowhere, but I also uncovered a lot of fascinating material that I wouldn’t have found in any other way, which opened up further avenues of inquiry that I researched more deeply. I felt justified in this approach, which is the opposite of good scholarship, because I was writing a work of fiction about paranoia, overinterpretation, and the danger of taking facts out of context, which was precisely what I was doing myself. And I came away with the realization that you could do this with anything—which is something to keep in mind whenever you see similar arguments being made in earnest. There’s nothing like building a conspiracy theory yourself to make you even more skeptical than you were before. Or to quote Foucault’s Pendulum yet again: “That day, I began to be incredulous.”
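For anyone curious about the mechanics, the grid is trivial to reproduce. Here is a minimal sketch of the pairwise search described above, assuming a short list of terms taken from the examples I mentioned as stand-ins for the full spreadsheet; each query is meant to be pasted into a search engine by hand:

    from itertools import combinations

    # A handful of stand-ins for the actual spreadsheet entries.
    terms = [
        "Marcel Duchamp",
        "the Rosicrucians",
        "Georges Bataille",
        "the Black Dahlia",
    ]

    # Every unordered pair of terms becomes one search query to run by hand.
    queries = [f'"{a}" "{b}"' for a, b in combinations(terms, 2)]

    for q in queries:
        print(q)  # six queries for four terms, or n*(n-1)/2 in general

The point, once more, is not that any given pairing means anything; it is that with enough terms, some of them will inevitably seem to.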
The autograph man
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “Do you have anybody’s autograph?”
Last weekend, I took part in a local authors program sponsored by the Oak Park Public Library, in which writers from the area were given three minutes each to talk about their work and then hopefully make a few sales. I had a good time and I sold a bunch of copies, which is always a plus. Yet whenever I do an event like this, I’m brought up against the fundamental awkwardness of the interaction when I’m asked to sign a book. For one thing, I can never come up with anything clever to say in the inscription, so I end up scrawling something like “Enjoy!” or “Best wishes!” even to my own family members. (I’ve also realized that when you’re signing all three copies of a trilogy, the buyer starts to get a little impatient by the time you’ve begun dating, inscribing, and signing the third book.) And while I’m always gratified by sales and attention—especially sales—I usually feel like a nocturnal creature that has been dragged, blinking, into the daylight. I became a writer partially because I like hanging out on my own, absorbed in a draft or a pile of research materials, and whenever I’m compelled to be out in the world, it’s as if I’m engaging in a kind of elaborate impersonation.
I assume, though I don’t know for sure, that a lot of other writers feel the same way, even as they’re asked to invest increasing amounts of time in creating a public life that has little in common with what they do for a living. These days, it’s taken for granted that writers will promote themselves with any and all means at their disposal, to the point where even an ordinary desire for privacy starts to seem outré. Here’s the thing about Thomas Pynchon: it’s fun to talk about his “reclusiveness,” as if he were an elusive cryptid like Bigfoot, but by all accounts, he’s an ordinary guy living in New York, with an active social life and a diverse circle of friends. He just doesn’t feel like giving interviews or having his picture published, and that perfectly reasonable stance is so out of line with our expectations that it becomes newsworthy in itself. I’ve always liked what the critic Arthur Salm had to say on the subject:
The man simply chooses not to be a public figure, an attitude that resonates on a frequency so out of phase with that of the prevailing culture that if Pynchon and Paris Hilton were ever to meet—the circumstances, I admit, are beyond imagining—the resulting matter/antimatter explosion would vaporize everything from here to Tau Ceti IV.
It’s possible, of course, that certain authors would love to be public figures, if only they’d get the chance. Yet it’s revealing that when we think of the novelist as a celebrity, our minds go back to Norman Mailer and Gore Vidal feuding on The Dick Cavett Show, and the pool of memories dries up the closer we get to the present. Occasionally, a writer famous for fiction will start to assume a role of greater social importance, but it’s almost always at the expense of his or her work as a novelist: Arundhati Roy hasn’t published a novel in seventeen years. That’s the strange thing about the push toward ever greater levels of exposure: even as writers share more and more of themselves on Twitter, Facebook, and blogs like this, their real role in public life—at least for novelists—grows progressively marginalized. In that light, the aggressive presence of authors on social media feels less like self-promotion than a simulation of the cultural role writers no longer possess, if they ever really did. (And it’s also clear that the skill sets required to write a novel and curate a decent Twitter feed have about as much in common as writing and public speaking, which is to say, next to nothing.)
Deep down, I feel much the same way about readings and signings, which are moments when writers can play at being famous in ways that they haven’t experienced—or are spared from—otherwise. (Obviously, this doesn’t include celebrities who were already famous before their books were published: Lena Dunham’s book tour has about as much in common with your average reading as Cirque du Soleil does with a local black box theater’s production of Hedda Gabler.) If it sounds like I’m overthinking it, well, I probably am. But it doesn’t prevent me from feeling a little uncomfortable whenever I find myself in that kind of encounter, regardless of which side of the autograph table I’m on. The only person I’ve ever asked for an autograph is Walter Murch, and the fact that he was a perfect gentleman about it didn’t make our few seconds of small talk any less awkward. Luckily, that moment occupies only a tiny sliver of the mental real estate devoted to his movies, books, and interviews. To the extent that I feel I know Murch, or anyone, it has less to do with the handshake we shared than with all the time I’ve spent in his virtual company, when neither of us was playing a role, and we were far enough apart for at least one of us to really say something, even if the conversation only ran in one direction.
Gravity’s word processor
In this week’s issue of the New York Review of Books, the literary critic Edward Mendelson outs himself as yet another fan of old-school word processors, in this case WordPerfect, which he describes as “the instrument best suited to the way I think when I write.” He goes on to draw a contrast between his favored program, “a mediocrity that’s almost always right,” and Microsoft Word, “a work of genius that’s almost always wrong as an instrument for writing prose,” with its commitment to a platonic ideal of sections and styles that make it all the harder for writers to format a single page. It’s the difference, Mendelson implies, between a mindset that approaches the document from the top down, thinking in terms of templates and overall consistency, and the daily experience of a writer, who engages in direct combat with individual words and sentences, some of which have to be italicized, indented, or otherwise massaged in ways that don’t have anything to do with their neighbors. And as someone who lives comfortably within his own little slice of Word but wants to tear his hair out whenever he strays beyond it, I can’t help but sympathize.
I happened to read Mendelson’s essay with particular interest, because I’m a longtime fan of his work. Mindful Pleasures, the collection of essays he edited on Thomas Pynchon, is one of those books I revisit every few years, and in particular, his piece on encyclopedic fiction has shaped the way I read authors from Dante to Joyce. Pynchon, of course, is a writer with more than a few ideas about how technology affects the way we live and think, and in his conclusion, Mendelson takes a cue from the master:
When I work in Word, for all its luxuriant menus and dazzling prowess, I can’t escape a faint sense of having entered a closed, rule-bound society. When I write in WordPerfect, with all its scruffy, low-tech simplicity, the world seems more open, a place where endings can’t be predicted, where freedom might be real.
There’s more than an echo here of Gravity’s Rainbow, which pits its anarchic, cartoonish personalities against an impersonal conspiracy that finally consumes and assimilates them. And if Pynchon’s fantasy is centered on a rocket cartel that manipulates world events to its own advantage, a writer trying to wrestle a document into shape can sometimes feel like he’s up against an equally faceless enemy.
If Word can be a frustrating tool for writers, it’s because it wasn’t made for anyone in particular, but for “everyone.” As one of the core handful of programs included in the Microsoft Office suite, it’s meant to serve a wide range of functions, from hammering out a high school essay to formatting a rudimentary corporate newsletter. It’s intended to be equally useful to someone who creates a document twice a month and someone who uses it every day, which means that it’s tailored to the needs of precisely nobody. And it was presumably implemented by coders who would rebel against any similar imposition. There’s a reason why so many programmers still live in Emacs and its text-based brethren: they’re simple once you get to know them, they’re deeply customizable, and they let you keep your hands on the keyboard for extended periods of time. Word, by contrast, seems to have been designed for a hypothetical consumer who would rather follow a template than fiddle with each line by hand. This may be true of most casual users, but it’s generally not true of coders—or writers. And Word, like so much other contemporary technology, offers countless options but very little choice.
There are times, obviously, when a standard template can be useful, especially when you’re putting together something like an academic bibliography. Yet there’s a world of difference between really understanding bibliographic style from the inside and trusting blindly to the software, which always needs to be checked by hand, anyway, to catch the errors that inevitably creep in. In the end, though, Word wasn’t made for me; it was made for users who see a word processor as an occasional tool, rather than the environment in which they spend most of their lives. For the rest of us, there are either specialized programs, like Scrivener, or the sliver of Word we’ve managed to colonize. In my post on George R.R. Martin and his use of WordStar—which, somewhat embarrassingly, has turned out to be the most widely read thing I’ve ever written—I note that a writer’s choice of tools is largely determined by habit. I’ve been using Word for two decades, and the first drafts of all my stories are formatted in exactly the way the program imposes, in single-spaced 12-point Times New Roman. I’m so used to how it looks that it fades into invisibility, which is exactly how it should be. The constraints it imposes are still there, but I’ve adapted so I can take them for granted, like a deep-sea fish that would explode if taken closer to the surface, or an animal that has learned to live with gravity.
“But some things can’t be undone…”
Note: This post is the sixty-second—and final—installment in my author’s commentary for Eternal Empire, covering the epilogue. You can read the previous installments here.
How do you end a series that has lasted for three books and more than a thousand pages? To some extent, no conclusion can be completely satisfying, so it makes sense to focus on what you actually stand a chance of achieving. There’s a reason, for instance, that so few series finales live up to our hopes: a healthy television show has to cultivate and maintain more narrative threads than can be resolved in a single episode, so any finale has to leave certain elements unaddressed. In practice, this means that entire characters and subplots are ignored in favor of others, which is exactly how it should be. During the last season of Mad Men, Matthew Weiner and his writing team prepared a list of story points that they wanted to revisit, and reading it over again now is a fascinating exercise. The show used some of the ideas, but it omitted many more, and we never did get a chance to see what happened to Sal, Dr. Faye, or Peggy’s baby. This kind of creative pruning is undoubtedly good for the whole, and it serves as a reminder of Weiner’s exceptional skill as a showrunner. Mad Men was one of the most intricate dramas ever written, with literally dozens of characters who might have earned a resonant guest appearance in the closing stretch of episodes. But Weiner rightly forced himself to focus on the essentials, while also allowing for a few intriguing digressions, and the result was one of the strongest finales I’ve ever seen—a rare example of a show sticking the landing and maintaining an impossibly high standard from the first episode to the last.
It’s tempting to think of a series finale as a piece of valuable real estate in which every second counts, or as a zero-sum game in which every moment devoted to one character means that another won’t have a chance to appear. (Watching the Mad Men finale, I found myself waiting for my favorite supporting players to pop up, and as soon as they had their scene, I couldn’t help thinking: That’s the last thing I’ll ever see them do.) But it can be dangerous to take such a single-minded approach to any unit of narrative, particularly for shows that have thrived on the unpredictable. My favorite example is the series finale of Twin Peaks, which wasn’t even meant to end the show, but provided as perfect a conclusion as any viewer could want—an opinion that I’ll continue to hold even after the new season premieres on Showtime. Instead of taking time to check in with everyone in their huge cast, David Lynch and Mark Frost indulge in long, seemingly pointless set pieces: the scene in the bank with Audrey, with the decrepit manager shuffling interminably across the floor to get her a drink of water, and especially the sequence in the Black Lodge, which is still the weirdest, emptiest twenty minutes ever to air on network television. You can imagine a viewer almost shouting at the screen for Lynch and Frost to get back to Sheriff Truman or Shelly or Donna, but that wouldn’t have been true to the show’s vision. Similarly, the Mad Men finale devotes a long scene to a character we’ve never seen before or since, the man at the encounter group who ends up inspiring Don’s return to humanity. It might seem like a strange choice, but it was the right call: Don’s relationships with every other character were so burdened with history that it took a new face to carry him over the finish line.
I found myself dealing with many of the same issues when it came to the epilogue of Eternal Empire, which was like the final season of a television series that had gone on for longer than I’d ever expected. Maddy and Wolfe had already received a sendoff in the previous chapter, so I only had to deal with Ilya. Pragmatically, the scene could have been about anything, or nothing at all. Ilya was always a peculiar character: he was defined mostly by action, and I deliberately refrained from detailing large portions of his backstory, on the assumption that he would be more interesting the less we knew about his past. It would have been easy to give him a conclusion that filled in more of his background, or that restored something of what he had lost—his family, a home, his sense of himself as a fundamentally good man. But that didn’t seem right. Another theme that you often see in series finales, particularly for a certain type of sitcom, is the showrunner’s desire to make every character’s dreams come true: the last season of Parks and Recreation, in particular, was a sustained exercise in wish fulfillment. I can understand the need to reward the characters that we love, but in Ilya’s case, what I loved about him was inseparable from the fact of his rootlessness. The novel repeatedly draws a parallel between his situation and that of the Khazars, the tribe of nomads that converted to Judaism before being erased from history, and I once compared him to the tzaddikim, or the unknown men and women for whose sake God refrains from destroying the world. Above all else, he was the Scythian, a wanderer of the steppes. I chose these emblems intuitively, but they clearly all have something in common. And it implied that Ilya would have to depart the series as he began it: as a man without a country.
What we get, in the end, is this quiet scene, in which Ilya goes to visit the daughter of the woman who had helped him in Yalta. The woman was a bride of the brotherhood, a former convict who gave up her family to work with the thieves, and her daughter ended up as the servant of a gangster in Moldova, five hundred miles away. Ilya gives her some money and her mother’s address, which he hopes will allow them to build a new life together, and then leaves. (The song that is playing on the girl’s cassette deck, incidentally, is Joni Mitchell’s “Cactus Tree.” This might be the nerdiest, most obscure inside joke of the entire series: it’s the song that appears in a deleted epigraph in the page proofs of Gravity’s Rainbow, before Thomas Pynchon removed it prior to publication. I’d wanted to use it, in some form, since The Icon Thief, and the fact that it includes the word “eternity” was a lucky coincidence.) It all makes for a subdued conclusion to the trilogy, and I came up with it fairly late in the process: as far as I can remember, the idea that there was a connection between the women in Yalta and Moldova didn’t occur to me until I’d already outlined the scenes, and this conclusion would have been an equally late addition. And it works, more or less, even if it feels a little too much like the penultimate scene of The Bourne Supremacy. It seemed right to end the series—which was pointedly made up of big, exaggerated gestures—on a gentle note, which implies that reuniting a parent and her child might be an act of greater significance than saving the world. I don’t know where Ilya goes after this, even though I spent the better part of four years trying to see through his eyes. But I suspect that he just wants to be left in peace…