Posts Tagged ‘Leo Tolstoy’
Lord Rowton…says that he once asked Disraeli what was the most remarkable, the most self-sustained and powerful sentence he knew. Dizzy paused for a moment, and then said, “Sufficient unto the day is the evil thereof.”
—Augustus J.C. Hare, The Story of My Life
Disraeli was a politician and a novelist, which is an unusual combination, and he knew his business. Politics and writing have less to do with each other than a lot of authors might like to believe, and the fact that you can create a compelling world on paper doesn’t mean that you can do the same thing in real life. (One of the hidden themes of Astounding is that the skills that many science fiction writers acquired in organizing ideas on the page turned out to be notably inadequate when it came to getting anything done during World War II.) Yet both disciplines can be equally daunting and infuriating to novices, in large part because they both involve enormously complicated projects—often requiring years of effort—that need to be approached one day at a time. A single day’s work is rarely very satisfying in itself, and you have to cling to the belief that countless invisible actions and compromises will somehow result in something real. It doesn’t always happen, and even if it does, you may never get credit or praise. The ability to deal with the everyday tedium of politics or writing is what separates professionals from amateurs. And in both cases, the greatest accomplishments are usually achieved by freaks who can combine an overarching vision with a finicky obsession with minute particulars. As Eugène-Melchior de Vogüé, who was both a diplomat and literary critic, said of Tolstoy, it requires “a queer combination of the brain of an English chemist with the soul of an Indian Buddhist.”
And if you go into either field without the necessary degree of patience, the results can be unfortunate. If you’re a writer who can’t subordinate yourself to the routine of writing on a daily basis, the most probable outcome is that you’ll never finish your novel. In politics, you end up with something very much like what we’ve all observed over the last few weeks. Regardless of what you might think about the presidential refugee order, its rollout was clearly botched, thanks mostly to a president and staff that want to skip over all the boring parts of governing and get right to the good stuff. And it’s tempting to draw a contrast between the incumbent, who achieved his greatest success on reality television, and his predecessor, a detail-oriented introvert who once thought about becoming a novelist. (I’m also struck, yet again, by the analogy to L. Ron Hubbard. He spent most of his career fantasizing about a life of adventure, but when he finally got into the Navy, he made a series of stupid mistakes—including attacking two nonexistent submarines off the coast of Oregon—that ultimately caused him to be stripped of his command. The pattern repeated itself so many times that it hints at a fundamental aspect of his personality. He was too impatient to deal with the tedious reality of life during wartime, which failed to live up to the version he had dreamed of himself. And while I don’t want to push this too far, it’s hard not to notice the difference between Hubbard, who cranked out his fiction without much regard for quality, and Heinlein, a far more disciplined writer who was able to consciously tame his own natural impatience into a productive role at the Philadelphia Navy Yard.)
Which brings us back to the sentence that impressed Disraeli. It’s easy to interpret it as an admonition not to think about the future, which isn’t quite right. We can start by observing that it comes at the end of what The Five Gospels notes is possibly “the longest connected discourse that can be directly attributed to Jesus.” It’s the one that asks us to consider the birds of the air and the lilies of the field, which, for a lot of us, prompts an immediate flashback to The Life of Brian. (“Consider the lilies?” “Uh, well, the birds, then.” “What birds?” “Any birds.” “Why?” “Well, have they got jobs?”) But whether or not you agree with the argument, it’s worth noticing that the advice to focus on the evils of each day comes only after an extended attempt at defining a larger set of values—what matters, what doesn’t, and what, if anything, you can change by worrying. You’re only in a position to figure out how best to spend your time after you’ve considered the big questions. As the physician William Osler put it:
[My ideal is] to do the day’s work well and not to bother about tomorrow. You may say that is not a satisfactory ideal. It is; and there is not one which the student can carry with him into practice with greater effect. To it more than anything else I owe whatever success I have had—to this power of settling down to the day’s work and trying to do it well to the best of my ability, and letting the future take care of itself.
This has important implications for both writers and politicians, as well as for progressives who wonder how they’ll be able to get through the next twenty-four hours, much less the next four years. When you’re working on any important project, even the most ambitious agenda comes down to what you’re going to do right now. In On Directing Film, David Mamet expresses it rather differently:
Now, you don’t eat a whole turkey, right? You take off the drumstick and you take a bite of the drumstick. Okay. Eventually you get the whole turkey done. It’ll probably get dry before you do, unless you have an incredibly good refrigerator and a very small turkey, but that is outside the scope of this lecture.
A lot of frustration in art, politics, and life in general comes from attempting to swallow the turkey in one bite. Jesus, I think, was aware of the susceptibility of his followers to grandiose but meaningless gestures, which is why he offered up the advice, so easy to remember and so hard to follow, to simultaneously focus on the given day while keeping the kingdom of heaven in mind. Nearly every piece of practical wisdom in any field is about maintaining that double awareness. Fortunately, it goes in both directions: small acts of discipline aid us in grasping the whole, and awareness of the whole tells us what to do in the moment. As R.H. Blyth says of Zen: “That is all religion is: eat when you are hungry, sleep when you are tired.” And don’t try to eat the entire turkey at once.
“The history of the world is but the biography of great men,” Thomas Carlyle once wrote, and although this statement was criticized almost at once, it accurately captures the way many of us continue to think about historical events, both large and small. There’s something inherently appealing about the idea that certain exceptional personalities—Alexander the Great, Julius Caesar, Napoleon—can seize and turn the temper of their time, and we see it today in attempts to explain, say, the personal computing revolution through the life of someone like Steve Jobs. The alternate view, which was expressed forcefully by Herbert Spencer, is that history is the outcome of impersonal social and economic forces, in which a single man or woman can do little more than catalyze trends that are already there. If Napoleon had never lived, the theory goes, someone very much like him would have taken his place. It’s safe to say that any reasonable view of history has to take both theories into account: Napoleon was extraordinary in ways that can’t be fully explained by his environment, even if he was inseparably a part of it. But it’s also worth remembering that much of our fascination with such individuals arises from our craving for narrative structures, which demand a clear hero or villain. (The major exception, interestingly, is science fiction, in which the “protagonist” is often humanity as a whole. And the transition from the hard science fiction of the golden age to messianic stories like Dune, in which the great man reasserts himself with a vengeance, is a critical turning point in the genre’s development.)
You can see a similar divide in storytelling, too. One school of thought implicitly assumes that a story is a delivery system for great scenes, with the rest of the plot serving as a scaffold to enable a handful of awesome moments. Another approach sees a narrative as a series of small, carefully chosen details designed to create an emotional effect greater than the sum of its parts. When it comes to the former strategy, it’s hard to think of a better example than Game of Thrones, a television series that often seems to be marking time between high points: it can test a viewer’s patience, but to the extent that it works, it’s because it constantly promises a big payoff around the corner, and we can expect two or three transcendent set pieces per season. Mad Men took the opposite tack: it was made up of countless tiny but riveting choices that gained power from their cumulative impact. Like the theories of history I mentioned above, neither type of storytelling is necessarily correct or complete in itself, and you’ll find plenty of exceptions, even in works that seem to fall clearly into one category or the other. It certainly doesn’t mean that one kind of story is “better” than the other. But it provides a useful way to structure our thinking, especially when we consider how subtly one theory shades into the other in practice. The director Howard Hawks famously said that a good movie consisted of three great scenes and no bad scenes, which seems like a vote for the Game of Thrones model. Yet a great scene doesn’t exist in isolation, and the closer we look at stories that work, the more important those nonexistent “bad scenes” start to become.
I got to thinking about this last week, shortly after I completed the series about my alternative movie canon. Looking back at those posts, I noticed that I singled out three of these movies—The Night of the Hunter, The Limey, and Down with Love—for the sake of one memorable scene. But these scenes also depend in tangible ways on their surrounding material. The river sequence in The Night of the Hunter comes out of nowhere, but it’s also the culmination of a language of dreams that the rest of the movie has established. Terence Stamp’s unseen revenge in The Limey works only because we’ve been prepared for it by a slow buildup that lasts for more than twenty minutes. And Renée Zellweger’s confessional speech in Down with Love is striking largely because of how different it is from the movie around it: the rest of the film is relentlessly active, colorful, and noisy, and her long, unbroken take stands out for how emphatically it presses the pause button. None of the scenes would play as well out of context, and it’s easy to imagine a version of each movie in which they didn’t work at all. We remember them, but only because of the less showy creative decisions that have already been made. And at a time when movies seem more obsessed than ever with “trailer moments” that can be spliced into a highlight reel, it’s important to honor the kind of unobtrusive craft required to make a movie with no bad scenes. (A plot that consists of nothing but high points can be exhausting, and a good story both delivers on the obvious payoffs and maintains our interest in the scenes when nothing much seems to be happening.)
Not surprisingly, writers have spent a lot of time thinking about these issues, and it’s noteworthy that one of the most instructive examples comes from Leo Tolstoy. War and Peace is nothing less than an extended criticism of the great man theory of history: Tolstoy brings Napoleon onto the scene expressly to emphasize how insignificant he actually is, and the novel concludes with a lengthy epilogue in which the author lays out his objections to how history is normally understood. History, he argues, is a pattern that emerges from countless unobservable human actions, like the sum of infinitesimals in calculus, and because we can’t see the components in isolation, we have to content ourselves with figuring out the laws of their behavior in the aggregate. But of course, this also describes Tolstoy’s strategy as a writer: we remember the big set pieces in War and Peace and Anna Karenina, but they emerge from the diligent, seemingly impersonal collation of thousands of tiny details, recorded with what seems like a minimum of authorial interference. (As Victor Shklovsky writes: “[Tolstoy] describes the object as if he were seeing it for the first time, an event as if it were happening for the first time.”) And the awesome moments in his novels gain their power from the fact that they arise, as if by historical inevitability, from the details that came before them. Anna Karenina was still alive at the end of the first draft, and it took her author a long time to reconcile himself to the tragic climax toward which his story was driving him. Tolstoy had good reason to believe that great scenes, like great men, are the product of invisible forces. But it took a great writer to see this.
After we see an object several times, we begin to recognize it. The object is in front of us and we know about it, but we do not see it—hence we cannot say anything significant about it…Tolstoy makes the familiar seem strange by not naming the familiar object. He describes the object as if he were seeing it for the first time, an event as if it were happening for the first time. In describing something he avoids the accepted names of its parts and instead names corresponding parts of other objects…
Tolstoy described the dogmas and rituals he attacked as if they were unfamiliar, substituting everyday meanings for the customarily religious meanings of the words common in church ritual. Many persons were painfully wounded; they considered it blasphemy to present as strange and monstrous what they accepted as sacred. Their reaction was due chiefly to the technique through which Tolstoy perceived and reported his environment. And after turning to what he had long avoided, Tolstoy found that his perceptions had unsettled his faith.
The technique of defamiliarization is not Tolstoy’s alone. I cited Tolstoy because his work is generally known.
In old age, I accept unhappy endings in Shakespearean tragedy, Flaubert, and Tolstoy, but back away from them in lesser works. Desdemona, Cordelia, Emma Bovary, and Anna Karenina are slain by their creators, and we are compelled to absorb the greatness of the loss. Perhaps it trains us to withstand better the terrible deaths of friends, family, and lovers, and to contemplate more stoically our own dissolution. But I increasingly avoid most movies with unhappy endings, since few among them aesthetically earn the suffering they attempt to inflict upon us.
I’m starting to feel the same way. For most of my life, I’ve never shied away from works of art with unhappy endings: in movies, the list begins with Vertigo, the greatest of all sucker punches ever inflicted on an audience, and includes films as different as The Red Shoes, The Third Man, and Dancer in the Dark. When I’m given a choice between ambiguous interpretations, as in Inception, I’m often inclined to go with the darker reading. But as time goes on, I’ve found that I prefer happy endings, both from a purely technical standpoint and as a matter of personal taste.
Which isn’t to say that unhappy endings can’t work. Yesterday, I cited Bruno Bettelheim on the subject of fairy tales, which invariably end on an unambiguously happy note to encourage children to absorb their implicit lessons about life. As adults, our artistic needs are more complicated, if not entirely dissimilar. An unhappy ending of the sort that we find in the myth of Oedipus or Madame Bovary is psychological training of a different sort, preparing us, as Bloom notes, for the tragic losses that we all eventually experience. Just as scary movies acquaint us with feelings of terror that we’d rarely feel under ordinary circumstances, great works of art serve as a kind of exercise room for the emotions, expanding our capacity to feel in ways that would never happen if we only drew on the material of our everyday lives. If the happy endings in fairy tales prepare and encourage children to venture outside the safe confines of family into the wider world, unhappy endings in adult fiction do the opposite: they turn our attention inward, forcing us to scrutinize aspects of ourselves that we’ve been trained to avoid as we focus on our respectable adult responsibilities.
In order for this to work, though, that unhappiness has to be authentically earned, and the number of works that pull it off is vanishingly small. Endings, whether happy or unhappy, are very hard, and a lot of writers, including myself, are often unsure if they’ve found the right way to end a story. But given that uncertainty, it’s wisest, when you don’t know the answer, to err on the positive side, and to ignore the voice that insists that an unhappy ending is somehow more realistic and uncompromising. In fact, a bleak, unearned ending is just as false to the way the world works as an undeserved happy one, and at greater cost to the reader. A sentimental happy ending may leave us unsatisfied with the author’s work, but that’s nothing compared to our sense of being cheated by a dark conclusion that arises from cynicism or creative exhaustion. Simply as a matter of craft, stories work best when they’re about the restoration of order, and one that ends with the characters dead or destroyed by failure technically meets that requirement. But for most writers, I’d argue that being able to restore a positive order to the tangle of complications they’ve created is a sign of greater artistic maturity.
And while it’s nice to believe that a happy or unhappy ending should flow naturally from the events that came before, a casual look at the history of literature indicates that this isn’t the case. Anna Karenina survived in Tolstoy’s first draft. Until its final act, Romeo and Juliet isn’t so different in tone from many of Shakespeare’s comedies, and if the ending had been changed to happily reunite the two lovers, it’s likely that we’d have trouble imagining it in any other way—although it’s equally likely that we’d file it permanently among his minor plays. On the opposite end of the spectrum, The Winter’s Tale is saved from becoming a tragedy only by the most arbitrary, unconvincing, and deeply moving of authorial contrivances. In practice, the nature of an ending is determined less by the inexorable logic of the plot than by the author’s intuition when the time comes to bring the story to a close, and as we’ve seen, it can often go either way. A writer has no choice but to check his gut to see what feels right, and I don’t think it’s too much to say that the burden lies with the unhappy ending to prove that it belongs there. Any halfway competent writer can herd his characters into the nearest available chasm. But when in doubt, get them out.
There’s an unspoken assumption among many readers and critics that a good author should base his work entirely on personal experience, either derived from his own life or those of people he knows, and that it’s a sign of weakness to be overly dependent on research. If it’s clear that a writer has relied heavily on secondary sources to tell a story, or, worse, if the nature of those sources is readily detectable, it’s sometimes treated as a sort of lapse, even as an embarrassment. It’s generally agreed, for instance, that Tolstoy’s material on the Freemasons in War and Peace was based on his reading, not on firsthand information: he wasn’t a Mason himself, and other Masons wouldn’t be likely to share any details with him directly, so the scenes depicting Pierre’s initiation—which are believed to be fundamentally accurate—were derived from a handful of books. I’ve read critics who treat this as an objective flaw in an otherwise unimpeachable masterpiece, as if the knowledge that Tolstoy had to do a bit of research undermines our impression of him as an omniscient sage of the human world. And this flies in the face of the fact that all of War and Peace is a monumental work of research and construction, since it contains so much that Tolstoy never could have witnessed himself.
And this applies as much, if not more so, to contemporary authors. Ian McEwan, for example, based large sections of Atonement on the memoirs of Lucilla Andrews, who served as a nurse during the London blitz. McEwan wasn’t shy about giving credit to Andrews—he mentions her in his acknowledgments—but when a few readers pointed out how certain details in his novel seemed to be taken directly from her work, there was a mild outcry, with some even calling it a form of plagiarism. I doubt that anyone would have raised the issue if McEwan had conducted interviews with Andrews directly, but the revelation that parts of his story were transparently indebted to another book made some readers uncomfortable. The plagiarism charge was ridiculous, of course, as none other than Thomas Pynchon, a monster of research himself, made clear in an open letter to his publisher:
Unless we were actually there, we must turn to people who were, or to letters, contemporary reporting, the encyclopedia, the Internet, until, with luck, at some point, we can begin to make a few things of our own up. To discover in the course of research some engaging detail we know can be put into a story where it will do some good can hardly be classed as a felonious act—it is simply what we do.
Pynchon’s assessment of research as a kind of period of consolidation until “we can begin to make a few things of our own up” is absolutely correct, and library research is part of nearly every ambitious novelist’s bag of tricks. Research, as I’ve noted elsewhere, is less about factual accuracy than about providing the material for dreams, a gathering of “engaging details” that can furnish and feather the fictional nest we’ve created. (That last phrase is Anthony Lane’s, discussing Gustave Flaubert’s own voluminous research for Salammbo.) That’s true of literary as well as popular fiction: Saul Bellow had never been to Africa when he wrote Henderson the Rain King, but he was able to draw on travel accounts, textbooks, his own experience as a student of anthropology, and above all his own peerless imagination to create a remarkably convincing story, as even Norman Mailer admitted: “I don’t know if any other American writer has done Africa so well.” And it’s particularly indispensable for a novelist working in a field like suspense, where so much of the narrative necessarily deals with aspects of human life—murder, crime, conspiracy—that few writers have the luxury or desire to experience directly.
This was particularly true of City of Exiles, which I knew from the start would include long sequences set in the British prison system. I didn’t have any expectation of spending much time there myself, so I was forced to fall back on a handful of useful secondary sources: the memoirs of Charles Bronson, best known these days as the subject of a movie starring Tom Hardy, and especially the diaries of the suspense novelist Jeffrey Archer, who was sent to prison for perjury. We first see the result in Chapter 8, in which Powell and Wolfe pay a visit to Belmarsh to see the imprisoned gangster Vasylenko. Most of the details here, like the corridor that changes from lavender to green to blue as you enter a secure area, or the description of the interview room, walled with glass on all four sides like a fish tank, were taken from Archer’s book, and I draw on it repeatedly for all of the prison material that follows. I’m not sure if admitting this counts as a breach in the contract between an author and his readers—a suspense novelist, after all, is often expected to know something about everything—but I don’t see any harm in acknowledging my sources. Without their help, I wouldn’t have been able to write this novel at all. And we’re going to be spending a lot of time behind bars…
Note: To celebrate the third anniversary of this blog, I’ll be spending the week reposting some of my favorite pieces from early in its run. This post originally appeared, in a somewhat different form, on June 6, 2011.
Being an agnostic means all things are possible, even God, even the Holy Trinity. This world is so strange that anything may happen, or may not happen. Being an agnostic makes me live in a larger, a more fantastic kind of world, almost uncanny. It makes me more tolerant.
Of all religious or philosophical convictions, agnosticism, at first glance, is the least interesting to defend. Like political moderates, agnostics get it from both sides, most of all from committed atheists, who tend to regard permanent agnosticism, in the words of Richard Dawkins, as “fence-sitting, intellectual cowardice.” And yet many of my heroes, from Montaigne to Robert Anton Wilson, have identified themselves with agnosticism as a way of life. (Wilson, in particular, called himself an agnostic mystic, which is what you get when an atheist takes a lot of psychedelic drugs.) And while a defense of the philosophical aspects of agnosticism is beyond the scope of this blog—for that, I can direct you to Thomas Huxley, or even to a recent posting by NPR’s Adam Frank, whose position is not far removed from my own—I think I can talk, very tentatively, about its pragmatic benefits, at least from a writer’s point of view.
I started thinking about this again after reading a blog post by Bookslut’s Jessa Crispin, who relates that she was recently talking about the mystical inclinations of W.B. Yeats when a self-proclaimed atheist piped up: “I always get sad for Yeats for his occult beliefs.” As Crispin discusses at length, such a statement is massively condescending, and also weirdly uninsightful. Say what you will about Yeats’s interest in occultism, but there’s no doubt that he found it spectacularly useful. It provided him with symbolic material and a means of engaging the unseen world that most poets are eventually called to explore. The result was a body of work of permanent importance, and one that wouldn’t exist, at least not in its present form, if his life had assumed a different shape. Was it irrational? Sure. But Wallace Stevens aside, strictly rational behavior rarely produces good poets.
I’ve probably said this before, but I’ll say it again: the life of any writer—and certainly that of a poet—is so difficult, so impractical on a cosmic scale, that there’s often a perverse kind of pragmatism in the details. A writer’s existence may look messy from the outside, but that mess is usually the result of an attempt to pick out what is useful from life and reject the rest, governed by one urgent question: Can I use this? If a writer didn’t take his tools wherever he found them, he wouldn’t survive, at least not as an artist. Which is why any kind of ideology, religious or otherwise, can be hard for a writer to maintain. Writers, especially novelists, tend to be dabblers, not so much out of dilettantism—although that can be a factor as well—as from an endless, obsessive gleaning, a rummaging in the world’s attic for useful material, in both art and life. And this process of feathering one’s nest tends to inform a writer’s work as well. What Christopher Hitchens says of Ian McEwan is true of many novelists:
I think that he did, at one stage in his life, dabble a bit in what’s loosely called “New Age,” but in the end it was the rigorous side that won out, and his novels are almost always patrolling some difficult frontier between the speculative and the unseen and the ways in which material reality reimposes itself.
Agnosticism is also useful for another reason, as Borges points out above: tolerance. A novelist needs to write with empathy about people very different from himself, and to vicariously live all kinds of lives, which is harder to do through the lens of an intractable philosophy. We read Dante and Tolstoy despite, not because of, their ideological convictions, and much of the fire of great art comes from the tension between those convictions and the artist’s reluctant understanding of the world. For a writer, dogma is, or should be, the enemy—including dogma about agnosticism itself. In the abstract, it can seem clinical, but in practice, it’s untidy and makeshift, like the rest of a writer’s life. It’s useful only when it exposes itself to a lot of influences and generates a lot of ideas, most unworkable, but some worthy of being pursued. Like democracy, it’s a compromise solution, the best of a bad lot. It doesn’t work all that well, but for a writer, at least for me, it comes closer to working than anything else.
Note: To celebrate the third anniversary of this blog, I’ll be spending the week reposting some of my favorite pieces from early in its run. This post originally appeared, in a somewhat different form, on December 17, 2010.
As the New York Times recently pointed out, Google’s new online book database, which allows users to chart the evolving frequency of words and short phrases over 5.2 million digitized volumes, is a wonderful toy. You can look at the increasing frequency of George Carlin’s seven dirty words, for example—not surprisingly, they’ve all become a lot more common over the past few decades—or chart the depressing ascent of the word “alright.” Most seductively of all, perhaps, you can see at a glance how literary reputations have risen or fallen over time.
Take the five in the graph above, for instance. It’s hard not to see that, for all the talk of the death of Freud, he’s doing surprisingly well, and even passed Shakespeare in the mid-’70s (around the same time, perhaps not coincidentally, as Woody Allen’s creative peak). Goethe experienced a rapid fall in popularity in the mid-’30s, though he had recovered nicely by the end of World War II. Tolstoy, by contrast, saw a modest spike sometime around the Big Three conference in Tehran, and a drop as soon as the Soviet Union detonated its first atomic bomb. And Kafka, while less popular during the satisfied ’50s, saw a sudden surge in the paranoid decades thereafter:
Obviously, it’s possible to see patterns anywhere, and I’m not claiming that these graphs reflect real historical cause and effect. But it’s fun to think about. Even more fun is to look at the relative popularity of five leading American novelists of the last half of the twentieth century:
The most interesting graph is that for Norman Mailer, who experiences a huge ascent up to 1970, when his stature as a cultural icon was at its peak (just after his run for mayor of New York). Eventually, though, his graph—like those of Gore Vidal, John Updike, Philip Roth, and Saul Bellow—follows the trajectory that we’d suspect for that of an established, serious author: a long, gradual rise followed by a period of stability, as the author enters the official canon. Compare this to a graph of four best-selling novelists of the 1970s:
For Harold Robbins, Jacqueline Susann, Irving Wallace, and Arthur Hailey—and if you don’t recognize their names, ask your parents—we see a rapid rise in popularity followed by an equally rapid decline, which is what we might expect for authors who were once hugely popular but had no lasting value. And it’ll be interesting to see what this graph will look like in fifty years for, say, Stephenie Meyer or Dan Brown, and in which category someone like Jonathan Franzen or J.K. Rowling will appear. Only time, and Google, will tell.