In his book A New Theory of Urban Design, which was published thirty years ago, the architect Christopher Alexander opens with a consideration of the basic problem confronting all city planners. He draws an analogy between the process of urban design and that of creating a work of art or studying a biological organism, but he also points out their fundamental differences:
With a city, we don’t have the luxury of either of these cases. We don’t have the luxury of a single artist whose unconscious process will produce wholeness spontaneously, without having to understand it—there are simply too many people involved. And we don’t have the luxury of the patient biologist, who may still have to wait a few more decades to overcome his ignorance.
What happens in the city, happens to us. If the process fails to produce wholeness, we suffer right away. So, somehow, we must overcome our ignorance, and learn to understand the city as a product of a huge network of processes, and learn just what features might make the cooperation of these processes produce a whole.
And wherever he writes “city,” you can replace it with any complicated system—a nation, a government, an environmental crisis—that seems too daunting for any individual to affect alone, and in the face of which it’s easy to despair over our own helplessness, especially, as Alexander notes, when it’s happening to us.
Alexander continues: “We must therefore learn to understand the laws which produce wholeness in the city. Since thousands of people must cooperate to produce even a small part of a city, wholeness in the city will only be created to the extent that we can make these laws explicit, and can then introduce them, openly, explicitly, into the normal process of urban development.” We can pause here to note that this is as good an explanation as any of why rules play a role in all forms of human activity. It’s easy to fetishize or dismiss the rules to the point where we overlook why they exist in the first place, but you could say that they emerge whenever we’re dealing with a process that is too complicated for us to wing it. Some degree of improvisation enters into much of what we do, and in many cases—when we’re performing a small task for the first time with minimal stakes—it’s fine to make it up as we go along. The larger, more important, or more complex the task, however, the more useful it becomes to have a few guidelines on which we can fall back whenever our intuition or conscience fails us. Rules are nice because they mean that we don’t constantly have to reason from first principles whenever we’re faced with a choice. They often need to be amended, supplemented, or repealed, and we should never stop interrogating them, but they’re unavoidable. Every time we discard a rule, we implicitly replace it with another. And it can be hard to strike the right balance between a reasonable skepticism of the existing rules and an understanding of why they’re pragmatically good to have around.
Before we can develop a set of rules for any endeavor, however, it helps to formulate what Alexander calls “a single, overriding rule” that governs the rest. It’s worth quoting him at length here, because the challenge of figuring out a rule for urban design is much the same as that for any meaningful project that involves a lot of stakeholders:
The growth of a town is made up of many processes—processes of construction of new buildings, architectural competitions, developers trying to make a living, people building additions to their houses, gardening, industrial production, the activities of the department of public works, street cleaning and maintenance…But these many activities are confusing and hard to integrate, because they are not only different in their concrete aspects—they are also guided by entirely different motives…One might say that this hodgepodge is highly democratic, and that it is precisely this hodgepodge which most beautifully reflects the richness and multiplicity of human aspirations.
But the trouble is that within this view, there is no sense of balance, no reasonable way of deciding how much weight to give the different aims within the hodgepodge…For this reason, we propose to begin entirely differently. We propose to imagine a single process…one which works at many levels, in many different ways…but still essentially a single process, in virtue of the fact that it has a single goal.
And Alexander arrives at a single, overriding rule that is so memorable that I seem to think about it all the time: “Every increment of construction must be made in such a way as to heal the city.”
But it isn’t hard to understand why this rule isn’t more widely known. It’s difficult to imagine invoking it at a city planning meeting, and it has a mystical ring to it that I suspect makes many people uncomfortable. Yet this is less a shortcoming in the rule itself than a reflection of the kind of language that we need to develop an intuition about which other rules to follow. Alexander argues that most of us have “a rather good intuitive sense” of what this rule means, and he points out: “It is, therefore, a very useful kind of inner voice, which forces people to pay attention to the balance between different goals, and to put things together in a balanced fashion.” The italics are mine. Human beings have trouble keeping all of their own rules in their heads at once, much less those that apply to others, so our best bet is to develop an inner voice that will guide us when we don’t have ready access to the rules for a specific situation. (As David Mamet says of writing: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”) Most belief systems amount to an attempt to cultivate that voice, and if Alexander’s advice has a religious overtone, it’s because we tend to associate such admonitions with the contexts in which they’ve historically arisen. “Love your enemies” is one example. “Desire is suffering” is another. Such precepts naturally give rise to other rules, which lead in turn to others, and one of the shared dangers in city planning and religion is the failure to remember the underlying purpose when faced with a mass of regulations. Ideally, they serve as a system of best practices, but they often have no greater goal than to perpetuate themselves. And as Alexander points out, it isn’t until you’ve taken the time to articulate the one rule that governs the rest that you can begin to tell the difference.
Lord Rowton…says that he once asked Disraeli what was the most remarkable, the most self-sustained and powerful sentence he knew. Dizzy paused for a moment, and then said, “Sufficient unto the day is the evil thereof.”
—Augustus J.C. Hare, The Story of My Life
Disraeli was a politician and a novelist, which is an unusual combination, and he knew his business. Politics and writing have less to do with each other than a lot of authors might like to believe, and the fact that you can create a compelling world on paper doesn’t mean that you can do the same thing in real life. (One of the hidden themes of Astounding is that the skills that many science fiction writers acquired in organizing ideas on the page turned out to be notably inadequate when it came to getting anything done during World War II.) Yet both disciplines can be equally daunting and infuriating to novices, in large part because they both involve enormously complicated projects—often requiring years of effort—that need to be approached one day at a time. A single day’s work is rarely very satisfying in itself, and you have to cling to the belief that countless invisible actions and compromises will somehow result in something real. It doesn’t always happen, and even if it does, you may never get credit or praise. The ability to deal with the everyday tedium of politics or writing is what separates professionals from amateurs. And in both cases, the greatest accomplishments are usually achieved by freaks who can combine an overarching vision with a finicky obsession with minute particulars. As Eugène-Melchior de Vogüé, who was both a diplomat and literary critic, said of Tolstoy, it requires “a queer combination of the brain of an English chemist with the soul of an Indian Buddhist.”
And if you go into either field without the necessary degree of patience, the results can be unfortunate. If you’re a writer who can’t subordinate yourself to the routine of writing on a daily basis, the most probable outcome is that you’ll never finish your novel. In politics, you end up with something very much like what we’ve all observed over the last few weeks. Regardless of what you might think about the presidential refugee order, its rollout was clearly botched, thanks mostly to a president and staff that want to skip over all the boring parts of governing and get right to the good stuff. And it’s tempting to draw a contrast between the incumbent, who achieved his greatest success on reality television, and his predecessor, a detail-oriented introvert who once thought about becoming a novelist. (I’m also struck, yet again, by the analogy to L. Ron Hubbard. He spent most of his career fantasizing about a life of adventure, but when he finally got into the Navy, he made a series of stupid mistakes—including attacking two nonexistent submarines off the coast of Oregon—that ultimately caused him to be stripped of his command. The pattern repeated itself so many times that it hints at a fundamental aspect of his personality. He was too impatient to deal with the tedious reality of life during wartime, which failed to live up to the version he had dreamed of himself. And while I don’t want to push this too far, it’s hard not to notice the difference between Hubbard, who cranked out his fiction without much regard for quality, and Heinlein, a far more disciplined writer who was able to consciously tame his own natural impatience into a productive role at the Philadelphia Navy Yard.)
Which brings us back to the sentence that impressed Disraeli. It’s easy to interpret it as an admonition not to think about the future, which isn’t quite right. We can start by observing that it comes at the end of what The Five Gospels notes is possibly “the longest connected discourse that can be directly attributed to Jesus.” It’s the one that asks us to consider the birds of the air and the lilies of the field, which, for a lot of us, prompts an immediate flashback to The Life of Brian. (“Consider the lilies?” “Uh, well, the birds, then.” “What birds?” “Any birds.” “Why?” “Well, have they got jobs?”) But whether or not you agree with the argument, it’s worth noticing that the advice to focus on the evils of each day comes only after an extended attempt at defining a larger set of values—what matters, what doesn’t, and what, if anything, you can change by worrying. You’re only in a position to figure out how best to spend your time after you’ve considered the big questions. As the physician William Osler put it:
[My ideal is] to do the day’s work well and not to bother about tomorrow. You may say that is not a satisfactory ideal. It is; and there is not one which the student can carry with him into practice with greater effect. To it more than anything else I owe whatever success I have had—to this power of settling down to the day’s work and trying to do it well to the best of my ability, and letting the future take care of itself.
This has important implications for both writers and politicians, as well as for progressives who wonder how they’ll be able to get through the next twenty-four hours, much less the next four years. When you’re working on any important project, even the most ambitious agenda comes down to what you’re going to do right now. In On Directing Film, David Mamet expresses it rather differently:
Now, you don’t eat a whole turkey, right? You take off the drumstick and you take a bite of the drumstick. Okay. Eventually you get the whole turkey done. It’ll probably get dry before you do, unless you have an incredibly good refrigerator and a very small turkey, but that is outside the scope of this lecture.
A lot of frustration in art, politics, and life in general comes from attempting to swallow the turkey in one bite. Jesus, I think, was aware of the susceptibility of his followers to grandiose but meaningless gestures, which is why he offered up the advice, so easy to remember and so hard to follow, to simultaneously focus on the given day while keeping the kingdom of heaven in mind. Nearly every piece of practical wisdom in any field is about maintaining that double awareness. Fortunately, it goes in both directions: small acts of discipline aid us in grasping the whole, and awareness of the whole tells us what to do in the moment. As R.H. Blyth says of Zen: “That is all religion is: eat when you are hungry, sleep when you are tired.” And don’t try to eat the entire turkey at once.
Forty years ago, the cinematographer Garrett Brown invented the Steadicam. It was a stabilizer attached to a harness that allowed a camera operator, on foot or riding in a vehicle, to shoot the kind of smooth footage that had previously only been possible using a dolly. Before long, it had revolutionized the way in which both movies and television were shot, and not always in the most obvious ways. When we think of the Steadicam, we’re likely to remember virtuoso extended takes like the Copacabana sequence in Goodfellas, but it can also be a valuable tool even when we aren’t supposed to notice it. As the legendary cinematographer Robert Elswit said recently to the New York Times:
“To me, it’s not a specialty item,” he said. “It’s usually there all the time.” The results, he added, are sometimes “not even necessarily recognizable as a Steadicam shot. You just use it to get something done in a simple way.”
Like digital video, the Steadicam has had a leveling influence on the movies. Scenes that might have been too expensive, complicated, or time-consuming to set up in the conventional manner can be done on the fly, which has opened up possibilities both for innovative stylists and for filmmakers who are struggling to get their stories made at all.
Not surprisingly, there are skeptics. In On Directing Film, which I think is the best book on storytelling I’ve ever read, David Mamet argues that it’s a mistake to think of a movie as a documentary record of what the protagonist does, and he continues:
The Steadicam (a hand-held camera), like many another technological miracle, has done injury; it has injured American movies, because it makes it so easy to follow the protagonist around, one no longer has to think, “What is the shot?” or “Where should I put the camera?” One thinks, instead, “I can shoot the whole thing in the morning.”
This conflicts with Mamet’s approach to structuring a plot, which hinges on dividing each scene into individual beats that can be expressed in purely visual terms. It’s a method that emerges naturally from the discipline of selecting shots and cutting them together, and it’s the kind of hard work that we’re often tempted to avoid. As Mamet adds in a footnote: “The Steadicam is no more capable of aiding in the creation of a good movie than the computer is in the writing of a good novel—both are labor-saving devices, which simplify and so make more attractive the mindless aspects of creative endeavor.” The casual use of the Steadicam seduces directors into conceiving of the action in terms of “little plays,” rather than in fundamental narrative units, and it removes some of the necessity of disciplined thinking beforehand.
But it isn’t until toward the end of the book that Mamet delivers his most ringing condemnation of what the Steadicam represents:
“Wouldn’t it be nice,” one might say, “if we could get this hall here, really around the corner from that door there; or to get that door here to really be the door that opens on the staircase to that door there? So we could just move the camera from one to the next?”
It took me a great deal of effort and still takes me a great deal and will continue to take me a great deal of effort to answer the question thusly: no, not only is it not important to have those objects literally contiguous; it is important to fight against this desire, because fighting it reinforces an understanding of the essential nature of film, which is that it is made of disparate shots, cut together. It’s a door, it’s a hall, it’s a blah-blah. Put the camera “there” and photograph, as simply as possible, that object. If we don’t understand that we both can and must cut the shots together, we are sneakily falling victim to the mistaken theory of the Steadicam.
This might all sound grumpy and abstract, but it isn’t. Take Birdman. You might well love Birdman—plenty of viewers evidently did—but I think it provides a devastating confirmation of Mamet’s point. By playing as a single, seemingly continuous shot, it robs itself of the ability to tell the story with cuts, and it inadvertently serves as an advertisement of how most good movies come together in the editing room. It’s an audacious experiment that never needs to be tried again. And it wouldn’t exist at all if it weren’t for the Steadicam.
But the Steadicam can also be a thing of beauty. I don’t want to discourage its use by filmmakers for whom it means the difference between making a movie under budget and never making it at all, as long as they don’t forget to think hard about all of the constituent parts of the story. There’s also a place for the bravura long take, especially when it depends on our awareness of the unfaked passage of time, as in the opening of Touch of Evil—a long take, made without benefit of a Steadicam, that runs the risk of looking less astonishing today because technology has made this sort of thing so much easier. And there’s even room for the occasional long take that exists only to wow us. De Palma has a fantastic one in Raising Cain, which I watched again recently, that deserves to be ranked among the greats. At its best, it can make the filmmaker’s audacity inseparable from the emotional core of the scene, as David Thomson observes of Goodfellas: “The terrific, serpentine, Steadicam tracking shot by which Henry Hill and his girl enter the Copacabana by the back exit is not just his attempt to impress her but Scorsese’s urge to stagger us and himself with bravura cinema.” The best example of all is The Shining, with its tracking shots of Danny pedaling his Big Wheel down the deserted corridors of the Overlook. It’s showy, but it also expresses the movie’s basic horror, as Danny is inexorably drawn to the revelation of his father’s true nature. (And it’s worth noting that much of its effectiveness is due to the sound design, with the alternation of the wheels against the carpet and floor, which is one of those artistic insights that never grows dated.) The Steadicam is a tool like any other, which means that it can be misused. It can be wonderful, too. But it requires a steady hand behind the camera.
When you’re working on any long writing project, whether it’s fiction or nonfiction, you’re eventually forced to deal with the problem of information management. In contrast to what a lot of readers might imagine, most writing—at least for someone like me—doesn’t consist of waiting for inspiration to strike while you’re staring at a blank page. A lot of the work and hard thinking has taken place prior to the physical act of writing a first draft, and more will come later, during the revision process. The rough draft becomes a kind of bottleneck through which ideas have to pass to get from one step to the next, and the challenge is less about coming up with good stuff in the moment than about ordering the material that you already have. If you’ve spent three months thinking about a project and six weeks in the actual writing, which isn’t an unreasonable proportion, you’re faced with the task of mapping one collection of thoughts onto another. The first set is amorphous, disorganized, and accumulated over a long stretch of time; the other needs to be set down in some orderly fashion, in a shorter period, and without forgetting anything important. As a result, many of the tools that writers develop to keep their thoughts straight are really designed to enable a lossless transfer of data in the transition between the chaos of conception and the more linear writing stage.
Over time, I’ve come up with various tricks to keep this information under control. The trouble, as with so much else in life, is that the approaches that work well when you’re first starting out don’t always hold up when you graduate to more complicated projects. Early on, for instance, I used hundreds of index cards to plot out my novels, supplemented with handwritten notes and mind maps, on the belief—which I still hold—that the tactile qualities of pen on paper would generate ideas in themselves. Later, as the individual pieces became too numerous to manage, I switched to keeping track of it all in a series of text files. Without thinking too much about it, I began to use TextEdit, the default text editor that comes packaged with my MacBook. And somewhat to my surprise, I’ve realized that I use it more than any other piece of software. For the actual manuscript, I still use Microsoft Word, but for almost everything else, I turn to TextEdit without hesitation. Why? It opens instantaneously when I click on its icon, as opposed to the ten seconds or so that Word takes to boot up, which makes it ideal as a notepad for jotting down quick thoughts. Even for longer writing sessions, its lack of bells and whistles appeals to me for much the same reason that WordStar still attracts loyalists like George R.R. Martin. There’s nothing between me and the words. In fact, I’m typing the first draft of this blog post on it right now.
But the most interesting use I’ve made of TextEdit is as a kind of filing system for notes. Nearly all of the information I’ve assembled for Astounding, for example, currently lives in one of four text files. One contains general biographical information about John W. Campbell and my other subjects; another consists of notes gleaned from going through 12,000 pages of his correspondence; another holds similar thoughts from reading through four decades of back issues of Astounding, Unknown, and Analog; and the last houses my notes on the hundreds of science fiction stories and novels that I’m reading or rereading for this project. My notes on back issues and stories are arranged chronologically, so I can scroll down and see patterns at a glance, but with the others, I don’t bother with any kind of order: I just type the notes in the first available spot, and I don’t really care where they end up. The result is a set of huge files—the one for biographical details alone is 60,000 words long. But it doesn’t really matter how big it is, because it’s searchable. If I’m looking for a particular piece of information, I just enter a query, either in the search box within TextEdit itself or through Spotlight, which searches the entire hard drive at once. It’s very fast and generally reliable, as long as I know what to look for, and assuming that I was smart enough to peg my notes to some obvious search term in the first place. It’s as if I’ve created a small, highly specialized slice of the Internet that only returns results that have previously passed through my brain.
Needless to say, there are limitations to this approach. My ability to find anything is predicated on my capacity to remember that it exists in the first place, which is harder than it sounds, given the thousands of discrete facts that I need to keep straight for a project like this. Every few months or so, I’ll sit down and read through my notes in their entirety, which takes several days, just to refresh my memory about what I’ve got. It isn’t a perfect solution, but it’s arguably better than trying to do the same with handwritten notes. There’s also a real loss when it comes to the physical manipulation of ideas: I still do mind maps and write down ideas in my notebook whenever possible, and I can even do a rough version of shuffling the pieces in TextEdit by copying and pasting chunks of text until they fall into an order that makes sense. (Much of this, I imagine, would be possible in programs like Scrivener, but I prefer my more flexible approach.) A lot of it also depends on how much I can keep organized in my own head. It’s impossible to imagine writing a whole book at once in this fashion, but as David Mamet once said, you eat a turkey one bite at a time. I’m familiar enough with my own attention span to know how much I can handle—usually the equivalent of three chapters or so—at any given moment. So far, it seems to be working pretty well. Taking notes, as I’ve said elsewhere, amounts to a message that you send from the past to the future. And while I still miss my cards sometimes, I’ve found that it’s easier to just text myself.
Last week, I mentioned what I’ve come to see as the most valuable piece of writing wisdom I know, which is David Mamet’s advice in Some Freaks “to go one achievable step at a time.” You don’t try to do everything at once, which is probably impossible anyway. Instead, there are days in which you do “careful” jobs that are the artistic equivalent of housekeeping—research, making outlines of physical actions, working out the logic of the plot—and others in which you perform “inventive” tasks that rely on intuition. This seems like common sense: it’s hard enough to be clever or imaginative as it is, without factoring in the switching costs associated with moving from one frame of mind to another. The writer Colin Wilson believed that the best ideas emerge when your left and right hemispheres are moving at the same rate, which tends to occur in moments of either reverie or high excitement. This is based on an outdated model of how the brain works, but the phenomenon it describes is familiar enough, and it’s just a small step from there to acknowledging that neither ecstatic nor dreamlike mental states are particularly suited for methodical work. When you’re laying the foundations for future creative activity, you usually end up somewhere in the middle, in a state of mind that is focused but not heightened, less responsive to connections than to units, and concerned more with thoroughness than with inspiration. It’s an important stage, but it’s also the last place where you’d expect real insights to appear.
Clearly, a writer should strive to work with, rather than against, this natural division of labor. It’s also easy to agree with Mamet’s advice that it’s best to tackle one kind of thinking per day. (Mental switching costs of any kind are usually minimized when you’ve had a good night’s sleep in the meantime.) The real question is how to figure out what sort of work you should be doing at any given moment, and, crucially, whether it’s possible to predict this in advance. Any writer can tell you that there’s an enormous difference between getting up in the morning without any idea of what you’re doing that day, which is the mark of an amateur, and having a concrete plan—which is why professional authors use such tools as outlines and calendars. Ideally, it would be nice to know when you woke up whether it was going to be a “careful” day or an “inventive” day, which would allow you to prepare yourself accordingly. Sometimes the organic life cycle of a writing project supplies the answer: depending on where you are in the process, you engage in varying proportions of careful or inventive thought. But every stage requires some degree of both. As Mamet implies, you’ll often alternate between them, although not as neatly as in his hypothetical example. And while it might seem pointless to allocate time for inspiration, which appears according to no fixed schedule, you can certainly create the conditions in which it’s more likely to appear. But how do you know when?
I’ve come up with a simple test to answer this question: I ask myself how much time I expect to spend sitting down. Usually, before a day begins, I have a pretty good sense of how much sitting or standing I’ll be doing, and that’s really all I need to make informed decisions about how to use my time. There are some kinds of creative work that demand sustained concentration at a desk or in a seated position. This includes most of the “careful” tasks that Mamet describes, but also certain forms of intuitive, nonlinear thinking, like making a mind map. By contrast, there are other sorts of work that not only don’t require you to be at your desk, but are actively stifled by it: daydreaming, brooding over problems, trying to sketch out large blocks of the action. You often do a better job of it when you’re out taking a walk, or on the bus, in the bath, or in bed. When scheduling creative work, then, you should start by figuring out what your body is likely to be doing that day, and then use this to plan what to do with your mind. Your brain has no choice but to tag along with your body when it’s running errands or standing in line at the bank, but if you structure your time appropriately, those moments won’t go to waste. And it’s often such external factors, rather than the internal logic of where you should be in the process, that determine what you should be doing.
At first glance, this doesn’t seem that much different from the stock advice that you should utilize whatever time you have available, whether you’re washing the dishes or taking a shower. But I think it’s a bit more nuanced than this, and that it’s more about matching the work to be done to the kind of time you have. If you try to think systematically and carefully while taking a walk in the park, you’ll feel frustrated when your mind wanders to other subjects. Conversely, if you try to daydream at your desk, not only are you likely to feel boxed in by your surroundings, but you’re also wasting valuable time that would be better spent on work that only requires the Napoleonic virtues of thoroughness and patience. Inspiration can’t be forced, and you don’t know in advance if you’re better off being careful or inventive on any given day—but the amount of time that you’ll be seated provides an important clue. (You can also reverse the process, and arrange to be seated as little as possible on days when you hope to get some inventive thinking done. For most of us, unfortunately, this isn’t entirely under our control, which makes it all the more sensible to take advantage of such moments when they present themselves.) And it doesn’t need to be planned beforehand. If you’re at work on a problem and you’re not sure what kind of thinking you should be doing, you can look at yourself and ask: Am I sitting down right now? And that’s all the information you need.
It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.
Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)
Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:
As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.
It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?
The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”
For what does it profit a man to gain the whole world and forfeit his soul?
Whether or not you’re a believer, you eventually end up with your own idea of who Jesus might have been. I like to think of him as the ultimate pragmatist. If you accept his central premise—that the kingdom of heaven, whatever it is, is something that is happening right now—then his ethical system, as impossible as it might seem for most of us to follow, becomes easier to understand. It’s about eliminating distractions, focusing on what really counts, and removing sources of temptation before they have a chance to divert us from the true goal. Poverty, as Michael Grant puts it in Jesus: An Historian’s Review of the Gospels, is a practical solution to a concrete problem: “Excessive wealth might be a positive disadvantage, since its too lavish enjoyment could distract its possessors from the overriding vital matter at hand.” And as Grant observes elsewhere:
Certainly, “blessed are the meek”…but that is because “they shall inherit the earth.” Since nothing less than this is at stake, a contentious spirit is wholly out of place, for it will only distract attention and energy from the preeminent task. It is not even worth hating your enemies…In the urgent circumstances, Jesus believed, it was a sheer waste of time. Love them instead, just as much as you love everyone else; pray for those who persecute you, turn the other cheek. For why not avoid hostilities and embroilments which, beside the infinitely larger issue, are ultimately irrelevant and distracting?
“Love your enemies,” in other words, is nothing but sensible advice. Which doesn’t make it any easier to do for real, rather than merely paying it lip service, when it strikes us as inconvenient.
Take the case of Donald Trump. It’s fair to say that I feel less love toward Trump than I do toward any other American public figure of my lifetime. At my best, I just want to go back to the days when I could safely ignore him; at my worst, I want him to suffer some kind of humiliating, career-ending comeuppance, although I’m well aware that real life rarely affords such satisfactions. (If anything, it’s more likely to give us the opposite.) I’m also uncomfortably conscious that this is exactly the kind of reaction that he wants to evoke from me. It’s a victory. No matter what happens in this election, Trump has added perceptibly to the world’s stockpile of hate, resentment, and alienation. Hating him and what he stands for is easy; what isn’t so easy is trying to respond in ways that don’t merely feed into the cycle of hatred. The answer—and I wish it were different—is right there in front of us. We’re told to love our enemies. Jesus, the pragmatic philosopher, knew that there wasn’t time for anything else. But when I think about doing the same with Trump, I feel a bit like Meg Murry in A Wrinkle in Time, when she realizes that love is the only weapon that will work against IT, the hideous brain that rules the planet of Camazotz:
If she could give love to IT perhaps it would shrivel up and die, for she was sure that IT could not withstand love. But she, in all her weakness and foolishness and baseness and nothingness, was incapable of loving IT. Perhaps it was not too much to ask of her, but she could not do it.
The italics, as always, are mine. It isn’t too much to ask. But it’s one thing to acknowledge this, and quite another to grant that we’re obliged to do it for someone like Donald Trump.
So here’s my best shot. Trump grew up wanting nothing more than to please his own demanding father. Early in his career, he was just one real estate developer among many. He ended up concluding that the only values worth pursuing were the acquisition of money and power, abstracted from any possible benefit except as a way of keeping score. What’s worse, he received plenty of validation that his assumptions were correct. He’s never had any reason to grow or change. Instead, as we all do, he’s become more like himself as he’s aged, while categorizing the human beings around him as sources of income, enemies, or potential enablers. Behind his bluster, he’s deeply insecure, as we all are. He refuses to take responsibility for his actions, he can’t admit a mistake, and he blames everyone but himself when things go wrong. (When he says that the first debate was “rigged” because someone tampered with his mike and the moderator was against him, I’m reminded of what David Mamet says in On Directing Film: “Two reasons are equal to no reasons—it’s like saying: ‘I was late because the bus drivers are on strike and my aunt fell downstairs.’”) He seems unhappy. It’s hard to imagine him taking pleasure in reading a book, preparing a meal, or really anything aside from trolling the electorate and putting his name on buildings and planes. He appears to have no affection for anyone or anything, except perhaps his own children. And he’s the creation of forces that even he can’t control. He’s succeeded beyond his wildest expectations, but only by becoming the full-time monster that was only there in flashes before. Trump uses the system, but it also uses him. He has transformed himself into exactly what he hopes people want him to be, and he’s condemned to do it forever. And when the end comes—”As it must to all men,” the newsreel narrator reminds us in Citizen Kane—he’ll have to ask himself whether it was worth it.
I know that this comes perilously close to what the onlookers say after seeing Marge Simpson’s nude portrait of Mr. Burns: “He’s bad, but he’ll die. So I like it.” But it’s the best I can do. I can’t love Trump, but I can sort of forgive him, and pity him, for becoming what he was told to be, and for abandoning what makes us human and valuable—empathy, compassion, humility—in favor of an identity assembled from who we are at our worst. In a way, I’m even grateful to him, for much the same reason that George Saunders expressed in The New Yorker: “Although, to me, Trump seems the very opposite of a guardian angel, I thank him for this: I’ve never before imagined America as fragile, as an experiment that could, within my very lifetime, fail. But I imagine it that way now.” If Trump didn’t exist, it would have been necessary to invent him. He’s a better cautionary tale than any I could have imagined, because he won the trappings of success at a spiritual cost that isn’t tragic so much as deeply sad. He’s like Charles Foster Kane, without any of the qualities that make Kane so misleadingly attractive. When I think of the abyss of his ego, which draws like a battery on the love of his supporters and flails helplessly in every other situation, it feels like the logical extension of a career spent in the pursuit of wealth and celebrity divorced from any other consideration beyond himself. Like all mortals, Trump had exactly one chance to live a meaningful life, with greater resources than most of us ever get, and this is what he did with it. The closest I can come to loving him is the acknowledgment that I might have done the same, if I had been born into his circumstances and incentives. He’s not so different from me, or from what I fear I might have become in his shoes. And if I love Trump, in some weird way, it’s because I’m thankful I’m not him.