Posts Tagged ‘David Mamet’
If you wanted to construct the most prolific writer who ever lived, working from first principles, what features would you include? (We’ll assume, for the purposes of this discussion, that he’s a man.) Obviously, he would need to be capable of turning out clean, publishable prose at a fast pace and with a minimum of revision. He would be contented—even happy—within the physical conditions of writing itself, which requires working indoors at a desk alone for hours on end. Ideally, he would operate within a genre, either fiction or nonfiction, that lent itself to producing pages fairly quickly, but with enough variety to prevent burnout, since he’d need to maintain a steady clip every day for years. His most productive period would coincide with an era that gave him steady demand for his work, and he would have a boundless energy that was diverted early on toward the goal of producing more books. If you were particularly clever, you’d even introduce a psychological wrinkle: the act of writing would become his greatest source of satisfaction, as well as an emotional refuge, so that he would end up taking more pleasure in it than almost anything else in life. Finally, you’d provide him with cooperative publishers and an enthusiastic, although not overwhelming, readership, granting him a livelihood that was comfortable but not quite lavish enough to be distracting. Wind him up, let him run unimpeded for three or four decades, and how many books would you get? In the case of Isaac Asimov, the total comes to something like five hundred. Even if it isn’t quite enough to make him the most productive writer of all time, it certainly places him somewhere in the top ten. And it’s a career that followed all but axiomatically from the characteristics that I’ve listed above.
Let’s take these points one at a time. Asimov, like all successful pulp writers, learned how to crank out decent work on deadline, usually limiting himself to a first draft and a clean copy, with very little revision that wasn’t to editorial order. (And he wasn’t alone here. The pulps were an unforgiving school, and they quickly culled authors who weren’t able to write a sentence well enough the first time.) From a young age, Asimov was also drawn to enclosed, windowless spaces, like the kitchen at the back of his father’s candy store, and he had a persistent daydream about running a newsstand in the subway, where he could put up the shutter and read magazines in peace. After he began to write for a living, he was equally content to work in his attic office for up to ten hours a day. Yet it wasn’t fiction that accounted for the bulk of his output—which is a common misconception about his career—but a specific kind of nonfiction. Asimov was a prolific fiction writer, but no more so than many of his contemporaries. It was in nonfiction for general readers that he really shone, initially with such scientific popularizations as The Chemicals of Life and Inside the Atom. At first, his work drew on his academic and professional background in chemistry and biochemistry, but before long, he found that he was equally adept at explaining concepts from the other sciences, as well as such unrelated fields as history and literature. His usual method was to work straight from reference books, dictionaries, and encyclopedias, translating and organizing their concepts for a lay audience. As he once joked to Martin Gardner: “You mean you’re in the same racket I am? You just read books by the professors and rewrite them?”
This kind of writing is harder than it sounds. Asimov noted, correctly, that he added considerable value in arranging and presenting the material, and he was better at it than just about anyone else. (A faculty member at Boston University once observed him at work and exclaimed: “Why, you’re just copying the dictionary!” Asimov, annoyed, handed the dictionary to him and said: “Here. The dictionary is yours. Now go write the book.”) But it also lent itself admirably to turning out a lot of pages in a short period of time. Unlike fiction, it didn’t require him to come up with original ideas from scratch. As soon as he had enough projects in the hopper, he could switch between them freely to avoid becoming bored by any one subject. He could write treatments of the same topic for different audiences and cannibalize unsold material for other venues. In the years after Sputnik, there was plenty of demand for what he had to offer, and he had a ready market for short articles that could be collected into books. And since these were popular treatments of existing information, he could do all of the work from the comfort of his own office. Asimov hated to fly, and he actively avoided assignments that would require him to travel or do research away from home. Before long, his productivity became a selling point in itself, and when his wife told him that life was passing him by, Asimov responded: “If I do manage to publish a hundred books, and if I then die, my last words are likely to be, ‘Only a hundred!’” Writing became a place of security, both from life’s small crises and as an escape from an unhappy marriage, and it was also his greatest source of pleasure. When his daughter asked him what he would do if he had to choose between her and writing, Asimov said: “Why, I would choose you, dear.” He added, however: “But I hesitated—and she noticed that, too.”
Asimov was a complicated man—certainly more so than in the version of himself that he presented to the public—and he can’t be reduced to a neat set of factors. He wasn’t a robot. But those five hundred books represent an achievement so overwhelming that it cries out for explanation, and it wouldn’t exist if certain variables, both external and internal, hadn’t happened to align. In terms of his ability and ambition, Asimov was the equal of Campbell, Heinlein, or Hubbard, but in place of their public entanglements, he channeled his talents in a safer direction, where they grew to gargantuan proportions that only hint at how monstrous that energy and passion really were. (He was also considerably younger than the others, as well as more naturally cautious, and I’d like to believe that he drew a negative lesson from their example.) The result, remarkably, made him the most beloved writer of them all. It was a cultural position, outside the world of science fiction, that was due almost entirely to his nonfiction. He never had a bestseller until late in his career, but the volume and quality of his overall output were enough to make him famous. Asimov was the Mule, the unassuming superman of the Foundation series, but he conquered a world from his typewriter. He won the game. And when I think of how his talent, productivity, and love of enclosed spaces combined to produce a fortress made of books, I think of what David Mamet once said to The Paris Review. When asked to explain why he wrote, Mamet replied: “I’ve got to do it anyway. Like beavers, you know. They chop, they eat wood, because if they don’t, their teeth grow too long and they die. And they hate the sound of running water. Drives them crazy. So, if you put those two ideas together, they are going to build dams.”
You had your whole life to prepare for this moment. Why aren’t you ready?
—David Mamet, Spartan
When you’re raising a toddler who can’t wait to exercise her little legs, it can be hard to teach her to stop when you say so. If you’re anything like me, you find yourself shouting “Stop!” when she gets within fifteen feet of the curb, even if there aren’t any cars for miles. The trouble is that you end up repeating yourself so often that any particular instance doesn’t carry any weight. (I’ve since learned that I get a faster response when I say “Freeze,” which is what her coach says to her at gym class.) About a year ago, when my daughter consistently refused to listen to me, I tried to explain why it mattered. There wasn’t any danger now, but if there were, there wouldn’t be any time to talk about it, so she had to get used to doing what I said—which is the same logic, I gather, that underlies much of basic training. In a lot of ways, it’s the best reason why we should try to teach our kids to obey at all. Nine out of ten times, it doesn’t really make a difference, but the tenth time, or the hundredth, it might. This obviously applies to issues of safety, but also to social behavior. I tell Beatrix, truthfully, that she can’t make me like her any less, but that may not be true of everyone, so she might as well practice being nice to me. I provide a rationale whenever I can, but I also try to make the case that she needs to do what I say immediately, and that we can discuss the reasoning later. It doesn’t always work, and like every parent, I often find myself laying down arbitrary rules. But as I’ve said to Beatrix more than once: “Someday it might be important.”
And for whatever reason, the notion has stuck with me. We spend most of our lives preparing for a future test or trial, and we don’t know in advance what it will be. Thomas Henry Huxley once said:
Perhaps the most valuable result of all education is the ability to make yourself do the thing you have to do, when it ought to be done, whether you like it or not; it is the first lesson that ought to be learned; and, however early a man’s training begins, it is probably the last lesson that he learns thoroughly.
He’s right, of course. But it’s even better to do the thing you have to do before it ought to be done. Education itself is a kind of guess about what we think will be useful down the line, and it’s almost never valuable in the moment. (If it is, it isn’t education, but on-the-job training, which is a very different concept.) In many cases, it never becomes applicable at all. It’s often been said that a liberal education is more about learning how to think than about mastering a particular body of information, which is true enough. But it’s also a justification, imposed retroactively, for the fact that we have little idea what a particular human being will need to know. This is even true for fields outside the liberal arts, which is how we get such dubious screening methods as the whiteboard interview, which is a sort of ritual performance that has nothing in common with how coding actually works. If we knew what we needed, we’d test for it. But we don’t.
As a result, much of life comes down to a series of judgment calls about how best to prepare for whatever might be coming. You could even say that this is why most of us prefer to work for money, which can be stockpiled and exchanged for future needs that we can’t predict. Money is useful because it partially absolves us of having to foresee everything. A surprising number of issues can be resolved by throwing money at the problem, and if you’ve ever thought about stocking a survival retreat, even as a daydream, you know how difficult it can be to anticipate your needs for even a year in the future. But it’s also a choice that we make constantly when it comes to the information we acquire. Some of this material we can safely outsource, and there’s no particular reason to stock our brains with facts, like how to get the length of a Python string, that we can always look up when necessary. As Indiana Jones’s dad once said, I write it down so I don’t have to remember it. (You’ll occasionally hear arguments in favor of rote memorization as an educational tool, but its value seems to lie mostly in giving students something to do while they mature in other ways, and there are probably better uses of that time.) But some forms of knowledge need to be internalized, and it can be hard to know how best to allocate our limited energies. I was going to say that it never hurts to learn how to write, but you probably shouldn’t trust me. Anyone who gives you advice in print presumably thinks that writing is important, and maybe we should pay more attention to those who don’t write down what they have to teach us.
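The Python fact mentioned above really is a one-liner, which is exactly why it can be safely outsourced to a reference rather than memorized; a minimal illustration:

```python
# The sort of fact you can always look up when necessary:
# Python's built-in len() returns the number of characters in a string.
title = "Foundation"
print(len(title))  # prints 10
```
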
And a lot of it comes down to whose advice you’re willing to take. When it came to choosing a college major, I depended on a piece of advice that seems pretty shaky in retrospect. More recently, I spent a month doing CrossFit, mostly because a studio had opened a block away from my house, and its pitch comes down to the idea that someday it might be important. As its official description states:
Overall, the aim of CrossFit is to forge a broad, general and inclusive fitness supported by measurable, observable and repeatable results. The program prepares trainees for any physical contingency—not only for the unknown but for the unknowable, too. Our specialty is not specializing.
The premise of CrossFit—which, incidentally, is obsessed with whiteboards—is that you’re subjecting yourself to pain in the present to avoid a moment of regret later on, when you’re stuck, say, in a burning car. I respect that, but I also quit after a few weeks, after deciding that its expected value wasn’t high enough to justify it. Maybe I’ll be sorry later. But risk, by definition, is predictable in the aggregate and utterly unforeseeable for any one individual, and it rarely takes the form for which we’ve been practicing. Some of our hunches on the subject are better than others, and it makes sense to prepare for risk in a way that enhances the present. (As I’ve pointed out before, the consolation prize for failing to become an astronaut is a really good job.) But you never know. And when I tell my daughter that this might all be important one day, I’m really talking to myself.
In his book A New Theory of Urban Design, which was published thirty years ago, the architect Christopher Alexander opens with a consideration of the basic problem confronting all city planners. He draws an analogy between the process of urban design and that of creating a work of art or studying a biological organism, but he also points out their fundamental differences:
With a city, we don’t have the luxury of either of these cases. We don’t have the luxury of a single artist whose unconscious process will produce wholeness spontaneously, without having to understand it—there are simply too many people involved. And we don’t have the luxury of the patient biologist, who may still have to wait a few more decades to overcome his ignorance.
What happens in the city, happens to us. If the process fails to produce wholeness, we suffer right away. So, somehow, we must overcome our ignorance, and learn to understand the city as a product of a huge network of processes, and learn just what features might make the cooperation of these processes produce a whole.
And wherever he writes “city,” you can replace it with any complicated system—a nation, a government, an environmental crisis—that seems too daunting for any individual to affect on his or her own, and toward which it’s easy to despair over our own helplessness, especially, as Alexander notes, when it’s happening to us.
Alexander continues: “We must therefore learn to understand the laws which produce wholeness in the city. Since thousands of people must cooperate to produce even a small part of a city, wholeness in the city will only be created to the extent that we can make these laws explicit, and can then introduce them, openly, explicitly, into the normal process of urban development.” We can pause here to note that this is as good an explanation as any of why rules play a role in all forms of human activity. It’s easy to fetishize or dismiss the rules to the point where we overlook why they exist in the first place, but you could say that they emerge whenever we’re dealing with a process that is too complicated for us to wing it. Some degree of improvisation enters into much of what we do, and in many cases—when we’re performing a small task for the first time with minimal stakes—it’s fine to make it up as we go along. The larger, more important, or more complex the task, however, the more useful it becomes to have a few guidelines on which we can fall back whenever our intuition or conscience fails us. Rules are nice because they mean that we don’t constantly have to reason from first principles whenever we’re faced with a choice. They often need to be amended, supplemented, or repealed, and we should never stop interrogating them, but they’re unavoidable. Every time we discard a rule, we implicitly replace it with another. And it can be hard to strike the right balance between a reasonable skepticism of the existing rules and an understanding of why they’re pragmatically good to have around.
Before we can develop a set of rules for any endeavor, however, it helps to formulate what Alexander calls “a single, overriding rule” that governs the rest. It’s worth quoting him at length here, because the challenge of figuring out a rule for urban design is much the same as that for any meaningful project that involves a lot of stakeholders:
The growth of a town is made up of many processes—processes of construction of new buildings, architectural competitions, developers trying to make a living, people building additions to their houses, gardening, industrial production, the activities of the department of public works, street cleaning and maintenance…But these many activities are confusing and hard to integrate, because they are not only different in their concrete aspects—they are also guided by entirely different motives…One might say that this hodgepodge is highly democratic, and that it is precisely this hodgepodge which most beautifully reflects the richness and multiplicity of human aspirations.
But the trouble is that within this view, there is no sense of balance, no reasonable way of deciding how much weight to give the different aims within the hodgepodge…For this reason, we propose to begin entirely differently. We propose to imagine a single process…one which works at many levels, in many different ways…but still essentially a single process, in virtue of the fact that it has a single goal.
And Alexander arrives at a single, overriding rule that is so memorable that I seem to think about it all the time: “Every increment of construction must be made in such a way as to heal the city.”
But it isn’t hard to understand why this rule isn’t more widely known. It’s difficult to imagine invoking it at a city planning meeting, and it has a mystical ring to it that I suspect makes many people uncomfortable. Yet this is less a shortcoming in the rule itself than a reflection of the kind of language we need if we want to develop an intuition about which other rules to follow. Alexander argues that most of us have “a rather good intuitive sense” of what this rule means, and he points out: “It is, therefore, a very useful kind of inner voice, which forces people to pay attention to the balance between different goals, and to put things together in a balanced fashion.” The italics are mine. Human beings have trouble keeping all of their own rules in their heads at once, much less those that apply to others, so our best bet is to develop an inner voice that will guide us when we don’t have ready access to the rules for a specific situation. (As David Mamet says of writing: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”) Most belief systems amount to an attempt to cultivate that voice, and if Alexander’s advice has a religious overtone, it’s because we tend to associate such admonitions with the contexts in which they’ve historically arisen. “Love your enemies” is one example. “Desire is suffering” is another. Such precepts naturally give rise to other rules, which lead in turn to others, and one of the shared dangers in city planning and religion is the failure to remember the underlying purpose when faced with a mass of regulations. Ideally, they serve as a system of best practices, but they often have no greater goal than to perpetuate themselves. And as Alexander points out, it isn’t until you’ve taken the time to articulate the one rule that governs the rest that you can begin to tell the difference.
Lord Rowton…says that he once asked Disraeli what was the most remarkable, the most self-sustained and powerful sentence he knew. Dizzy paused for a moment, and then said, “Sufficient unto the day is the evil thereof.”
—Augustus J.C. Hare, The Story of My Life
Disraeli was a politician and a novelist, which is an unusual combination, and he knew his business. Politics and writing have less to do with each other than a lot of authors might like to believe, and the fact that you can create a compelling world on paper doesn’t mean that you can do the same thing in real life. (One of the hidden themes of Astounding is that the skills that many science fiction writers acquired in organizing ideas on the page turned out to be notably inadequate when it came to getting anything done during World War II.) Yet both disciplines can be equally daunting and infuriating to novices, in large part because they both involve enormously complicated projects—often requiring years of effort—that need to be approached one day at a time. A single day’s work is rarely very satisfying in itself, and you have to cling to the belief that countless invisible actions and compromises will somehow result in something real. It doesn’t always happen, and even if it does, you may never get credit or praise. The ability to deal with the everyday tedium of politics or writing is what separates professionals from amateurs. And in both cases, the greatest accomplishments are usually achieved by freaks who can combine an overarching vision with a finicky obsession with minute particulars. As Eugène-Melchior de Vogüé, who was both a diplomat and literary critic, said of Tolstoy, it requires “a queer combination of the brain of an English chemist with the soul of an Indian Buddhist.”
And if you go into either field without the necessary degree of patience, the results can be unfortunate. If you’re a writer who can’t subordinate yourself to the routine of writing on a daily basis, the most probable outcome is that you’ll never finish your novel. In politics, you end up with something very much like what we’ve all observed over the last few weeks. Regardless of what you might think about the presidential refugee order, its rollout was clearly botched, thanks mostly to a president and staff that want to skip over all the boring parts of governing and get right to the good stuff. And it’s tempting to draw a contrast between the incumbent, who achieved his greatest success on reality television, and his predecessor, a detail-oriented introvert who once thought about becoming a novelist. (I’m also struck, yet again, by the analogy to L. Ron Hubbard. He spent most of his career fantasizing about a life of adventure, but when he finally got into the Navy, he made a series of stupid mistakes—including attacking two nonexistent submarines off the coast of Oregon—that ultimately caused him to be stripped of his command. The pattern repeated itself so many times that it hints at a fundamental aspect of his personality. He was too impatient to deal with the tedious reality of life during wartime, which failed to live up to the version he had dreamed of himself. And while I don’t want to push this too far, it’s hard not to notice the difference between Hubbard, who cranked out his fiction without much regard for quality, and Heinlein, a far more disciplined writer who was able to consciously tame his own natural impatience into a productive role at the Philadelphia Navy Yard.)
Which brings us back to the sentence that impressed Disraeli. It’s easy to interpret it as an admonition not to think about the future, which isn’t quite right. We can start by observing that it comes at the end of what The Five Gospels notes is possibly “the longest connected discourse that can be directly attributed to Jesus.” It’s the one that asks us to consider the birds of the air and the lilies of the field, which, for a lot of us, prompts an immediate flashback to The Life of Brian. (“Consider the lilies?” “Uh, well, the birds, then.” “What birds?” “Any birds.” “Why?” “Well, have they got jobs?”) But whether or not you agree with the argument, it’s worth noticing that the advice to focus on the evils of each day comes only after an extended attempt at defining a larger set of values—what matters, what doesn’t, and what, if anything, you can change by worrying. You’re only in a position to figure out how best to spend your time after you’ve considered the big questions. As the physician William Osler put it:
[My ideal is] to do the day’s work well and not to bother about tomorrow. You may say that is not a satisfactory ideal. It is; and there is not one which the student can carry with him into practice with greater effect. To it more than anything else I owe whatever success I have had—to this power of settling down to the day’s work and trying to do it well to the best of my ability, and letting the future take care of itself.
This has important implications for both writers and politicians, as well as for progressives who wonder how they’ll be able to get through the next twenty-four hours, much less the next four years. When you’re working on any important project, even the most ambitious agenda comes down to what you’re going to do right now. In On Directing Film, David Mamet expresses it rather differently:
Now, you don’t eat a whole turkey, right? You take off the drumstick and you take a bite of the drumstick. Okay. Eventually you get the whole turkey done. It’ll probably get dry before you do, unless you have an incredibly good refrigerator and a very small turkey, but that is outside the scope of this lecture.
A lot of frustration in art, politics, and life in general comes from attempting to swallow the turkey in one bite. Jesus, I think, was aware of the susceptibility of his followers to grandiose but meaningless gestures, which is why he offered up the advice, so easy to remember and so hard to follow, to simultaneously focus on the given day while keeping the kingdom of heaven in mind. Nearly every piece of practical wisdom in any field is about maintaining that double awareness. Fortunately, it goes in both directions: small acts of discipline aid us in grasping the whole, and awareness of the whole tells us what to do in the moment. As R.H. Blyth says of Zen: “That is all religion is: eat when you are hungry, sleep when you are tired.” And don’t try to eat the entire turkey at once.
Forty years ago, the cinematographer Garrett Brown invented the Steadicam. It was a stabilizer attached to a harness that allowed a camera operator, walking on foot or riding in a vehicle, to shoot the kind of smooth footage that had previously only been possible using a dolly. Before long, it had revolutionized the way in which both movies and television were shot, and not always in the most obvious ways. When we think of the Steadicam, we’re likely to remember virtuoso extended takes like the Copacabana sequence in Goodfellas, but it can also be a valuable tool even when we aren’t supposed to notice it. As the legendary Robert Elswit said recently to the New York Times:
“To me, it’s not a specialty item,” he said. “It’s usually there all the time.” The results, he added, are sometimes “not even necessarily recognizable as a Steadicam shot. You just use it to get something done in a simple way.”
Like digital video, the Steadicam has had a leveling influence on the movies. Scenes that might have been too expensive, complicated, or time-consuming to set up in the conventional manner can be done on the fly, which has opened up possibilities both for innovative stylists and for filmmakers who are struggling to get their stories made at all.
Not surprisingly, there are skeptics. In On Directing Film, which I think is the best book on storytelling I’ve ever read, David Mamet argues that it’s a mistake to think of a movie as a documentary record of what the protagonist does, and he continues:
The Steadicam (a hand-held camera), like many another technological miracle, has done injury; it has injured American movies, because it makes it so easy to follow the protagonist around, one no longer has to think, “What is the shot?” or “Where should I put the camera?” One thinks, instead, “I can shoot the whole thing in the morning.”
This conflicts with Mamet’s approach to structuring a plot, which hinges on dividing each scene into individual beats that can be expressed in purely visual terms. It’s a method that emerges naturally from the discipline of selecting shots and cutting them together, and it’s the kind of hard work that we’re often tempted to avoid. As Mamet adds in a footnote: “The Steadicam is no more capable of aiding in the creation of a good movie than the computer is in the writing of a good novel—both are labor-saving devices, which simplify and so make more attractive the mindless aspects of creative endeavor.” The casual use of the Steadicam seduces directors into conceiving of the action in terms of “little plays,” rather than in fundamental narrative units, and it removes some of the necessity of disciplined thinking beforehand.
But it isn’t until toward the end of the book that Mamet delivers his most ringing condemnation of what the Steadicam represents:
“Wouldn’t it be nice,” one might say, “if we could get this hall here, really around the corner from that door there; or to get that door here to really be the door that opens on the staircase to that door there? So we could just move the camera from one to the next?”
It took me a great deal of effort and still takes me a great deal and will continue to take me a great deal of effort to answer the question thusly: no, not only is it not important to have those objects literally contiguous; it is important to fight against this desire, because fighting it reinforces an understanding of the essential nature of film, which is that it is made of disparate shots, cut together. It’s a door, it’s a hall, it’s a blah-blah. Put the camera “there” and photograph, as simply as possible, that object. If we don’t understand that we both can and must cut the shots together, we are sneakily falling victim to the mistaken theory of the Steadicam.
This might all sound grumpy and abstract, but it isn’t. Take Birdman. You might well love Birdman—plenty of viewers evidently did—but I think it provides a devastating confirmation of Mamet’s point. By playing as a single, seemingly continuous shot, it robs itself of the ability to tell the story with cuts, and it inadvertently serves as an advertisement of how most good movies come together in the editing room. It’s an audacious experiment that never needs to be tried again. And it wouldn’t exist at all if it weren’t for the Steadicam.
But the Steadicam can also be a thing of beauty. I don’t want to discourage its use by filmmakers for whom it means the difference between making a movie under budget and never making it at all, as long as they don’t forget to think hard about all of the constituent parts of the story. There’s also a place for the bravura long take, especially when it depends on our awareness of the unfaked passage of time, as in the opening of Touch of Evil—a long take, made without benefit of a Steadicam, that runs the risk of looking less astonishing today because technology has made this sort of thing so much easier. And there’s even room for the occasional long take that exists only to wow us. De Palma has a fantastic one in Raising Cain, which I watched again recently, that deserves to be ranked among the greats. At its best, it can make the filmmaker’s audacity inseparable from the emotional core of the scene, as David Thomson observes of Goodfellas: “The terrific, serpentine, Steadicam tracking shot by which Henry Hill and his girl enter the Copacabana by the back exit is not just his attempt to impress her but Scorsese’s urge to stagger us and himself with bravura cinema.” The best example of all is The Shining, with its tracking shots of Danny pedaling his Big Wheel down the deserted corridors of the Overlook. It’s showy, but it also expresses the movie’s basic horror, as Danny is inexorably drawn to the revelation of his father’s true nature. (And it’s worth noting that much of its effectiveness is due to the sound design, with the alternation of the wheels against the carpet and floor, which is one of those artistic insights that never grows dated.) The Steadicam is a tool like any other, which means that it can be misused. It can be wonderful, too. But it requires a steady hand behind the camera.
When you’re working on any long writing project, whether it’s fiction or nonfiction, you’re eventually forced to deal with the problem of information management. In contrast to what a lot of readers might imagine, most writing—at least for someone like me—doesn’t consist of waiting for inspiration to strike while you’re staring at a blank page. A lot of the work and hard thinking has taken place prior to the physical act of writing a first draft, and more will come later, during the revision process. The rough draft becomes a kind of bottleneck through which ideas have to pass to get from one step to the next, and the challenge is less about coming up with good stuff in the moment than about ordering the material that you already have. If you’ve spent three months thinking about a project and six weeks in the actual writing, which isn’t an unreasonable proportion, you’re faced with the task of mapping one collection of thoughts onto another. The first set is amorphous, disorganized, and accumulated over a long stretch of time; the other needs to be set down in some orderly fashion, in a shorter period, and without forgetting anything important. As a result, many of the tools that writers develop to keep their thoughts straight are really designed to enable a lossless transfer of data in the transition between the chaos of conception and the more linear writing stage.
Over time, I’ve come up with various tricks to keep this information under control. The trouble, as with so much else in life, is that the approaches that work well when you’re first starting out don’t always hold up when you graduate to more complicated projects. Early on, for instance, I used hundreds of index cards to plot out my novels, supplemented with handwritten notes and mind maps, on the belief—which I still hold—that the tactile qualities of pen on paper would generate ideas in themselves. Later, as the individual pieces became too numerous to manage, I switched to keeping track of it all in a series of text files. Without thinking too much about it, I began to use TextEdit, the default text editor that comes packaged with my MacBook. And somewhat to my surprise, I’ve realized that I use it more than any other piece of software. For the actual manuscript, I still use Microsoft Word, but for almost everything else, I turn to TextEdit without hesitation. Why? It opens instantaneously when I click on its icon, as opposed to the ten seconds or so that Word takes to boot up, which makes it ideal as a notepad for jotting down quick thoughts. Even for longer writing sessions, its lack of bells and whistles appeals to me for much the same reason that WordStar still attracts loyalists like George R.R. Martin. There’s nothing between me and the words. In fact, I’m typing the first draft of this blog post on it right now.
But the most interesting use I’ve made of TextEdit is as a kind of filing system for notes. Nearly all of the information I’ve assembled for Astounding, for example, currently lives in one of four text files. One contains general biographical information about John W. Campbell and my other subjects; another consists of notes gleaned from going through 12,000 pages of his correspondence; another holds similar thoughts from reading through four decades of back issues of Astounding, Unknown, and Analog; and the last houses my notes on the hundreds of science fiction stories and novels that I’m reading or rereading for this project. My notes on back issues and stories are arranged chronologically, so I can scroll down and see patterns at a glance, but with the others, I don’t bother with any kind of order: I just type the notes in the first available spot, and I don’t really care where they end up. The result is a set of huge files—the one for biographical details alone is 60,000 words long. But it doesn’t really matter how big it is, because it’s searchable. If I’m looking for a particular piece of information, I just enter a query, either in the search box within TextEdit itself or through Spotlight, which searches the entire hard drive at once. It’s very fast and generally reliable, as long as I know what to look for, and assuming that I was smart enough to peg my notes to some obvious search term in the first place. It’s as if I’ve created a small, highly specialized slice of the Internet that only returns results that have previously passed through my brain.
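For anyone curious about what this system actually amounts to under the hood, it’s nothing more exotic than a case-insensitive substring search across a few plain text files. Here’s a minimal sketch in Python of what TextEdit’s find box or Spotlight is doing on my behalf—the filenames and note contents below are hypothetical stand-ins, not my actual files:

```python
from pathlib import Path

def search_notes(query, note_dir):
    """Case-insensitive search across every .txt file in note_dir.

    Returns (filename, line_number, line) tuples for each matching
    line, roughly what a desktop search tool would surface.
    """
    query = query.lower()
    hits = []
    for path in sorted(Path(note_dir).glob("*.txt")):
        for num, line in enumerate(path.read_text().splitlines(), start=1):
            if query in line.lower():
                hits.append((path.name, num, line.strip()))
    return hits

if __name__ == "__main__":
    # Hypothetical demo: build a tiny notes directory and query it.
    import tempfile
    tmp = Path(tempfile.mkdtemp())
    (tmp / "biography.txt").write_text("Campbell edits Astounding from 1937.\n")
    (tmp / "letters.txt").write_text("Letter to Heinlein, 1941.\nNote on Astounding covers.\n")
    for name, num, line in search_notes("astounding", tmp):
        print(f"{name}:{num}: {line}")
```

The point of the sketch is how little machinery is involved: no database, no tagging scheme, just flat files and a query string—which is exactly why the whole thing stands or falls on remembering a good search term.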
Needless to say, there are limitations to this approach. My ability to find anything is predicated on my capacity to remember that it exists in the first place, which is harder than it sounds, given the thousands of discrete facts that I need to keep straight for a project like this. Every few months or so, I’ll sit down and read through my notes in their entirety, which takes several days, just to refresh my memory about what I’ve got. It isn’t a perfect solution, but it’s arguably better than trying to do the same with handwritten notes. There’s also a real loss when it comes to the physical manipulation of ideas: I still do mind maps and write down ideas in my notebook whenever possible, and I can even do a rough version of shuffling the pieces in TextEdit by copying and pasting chunks of text until they fall into an order that makes sense. (Much of this, I imagine, would be possible in programs like Scrivener, but I prefer my more flexible approach.) A lot of it also depends on how much I can keep organized in my own head. It’s impossible to imagine writing a whole book at once in this fashion, but as David Mamet once said, you eat a turkey one bite at a time. I’m familiar enough with my own attention span to know how much I can handle—usually the equivalent of three chapters or so—at any given moment. So far, it seems to be working pretty well. Taking notes, as I’ve said elsewhere, amounts to a message that you send from the past to the future. And while I still miss my cards sometimes, I’ve found that it’s easier to just text myself.
Last week, I mentioned what I’ve come to see as the most valuable piece of writing wisdom I know, which is David Mamet’s advice in Some Freaks “to go one achievable step at a time.” You don’t try to do everything at once, which is probably impossible anyway. Instead, there are days in which you do “careful” jobs that are the artistic equivalent of housekeeping—research, making outlines of physical actions, working out the logic of the plot—and others in which you perform “inventive” tasks that rely on intuition. This seems like common sense: it’s hard enough to be clever or imaginative as it is, without factoring in the switching costs associated with moving from one frame of mind to another. The writer Colin Wilson believed that the best ideas emerge when your left and right hemispheres are moving at the same rate, which tends to occur in moments of either reverie or high excitement. This is based on an outdated model of how the brain works, but the phenomenon it describes is familiar enough, and it’s just a small step from there to acknowledging that neither ecstatic nor dreamlike mental states are particularly suited for methodical work. When you’re laying the foundations for future creative activity, you usually end up somewhere in the middle, in a state of mind that is focused but not heightened, less responsive to connections than to units, and concerned more with thoroughness than with inspiration. It’s an important stage, but it’s also the last place where you’d expect real insights to appear.
Clearly, a writer should strive to work with, rather than against, this natural division of labor. It’s also easy to agree with Mamet’s advice that it’s best to tackle one kind of thinking per day. (Mental switching costs of any kind are usually minimized when you’ve had a good night’s sleep in the meantime.) The real question is how to figure out what sort of work you should be doing at any given moment, and, crucially, whether it’s possible to predict this in advance. Any writer can tell you that there’s an enormous difference between getting up in the morning without any idea of what you’re doing that day, which is the mark of an amateur, and having a concrete plan—which is why professional authors use such tools as outlines and calendars. Ideally, it would be nice to know when you woke up whether it was going to be a “careful” day or an “inventive” day, which would allow you to prepare yourself accordingly. Sometimes the organic life cycle of a writing project supplies the answer: depending on where you are in the process, you engage in varying proportions of careful or inventive thought. But every stage requires some degree of both. As Mamet implies, you’ll often alternate between them, although not as neatly as in his hypothetical example. And while it might seem pointless to allocate time for inspiration, which appears according to no fixed schedule, you can certainly create the conditions in which it’s more likely to appear. But how do you know when?
I’ve come up with a simple test to answer this question: I ask myself how much time I expect to spend sitting down. Usually, before a day begins, I have a pretty good sense of how much sitting or standing I’ll be doing, and that’s really all I need to make informed decisions about how to use my time. There are some kinds of creative work that demand sustained concentration at a desk or in a seated position. This includes most of the “careful” tasks that Mamet describes, but also certain forms of intuitive, nonlinear thinking, like making a mind map. By contrast, there are other sorts of work that not only don’t require you to be at your desk, but are actively stifled by it: daydreaming, brooding over problems, trying to sketch out large blocks of the action. You often do a better job of it when you’re out taking a walk, or on the bus, in the bath, or in bed. When scheduling creative work, then, you should start by figuring out what your body is likely to be doing that day, and then use this to plan what to do with your mind. Your brain has no choice but to tag along with your body when it’s running errands or standing in line at the bank, but if you structure your time appropriately, those moments won’t go to waste. And it’s often such external factors, rather than the internal logic of where you should be in the process, that determine what you should be doing.
At first glance, this doesn’t seem that much different from the stock advice that you should utilize whatever time you have available, whether you’re washing the dishes or taking a shower. But I think it’s a bit more nuanced than this, and that it’s more about matching the work to be done to the kind of time you have. If you try to think systematically and carefully while taking a walk in the park, you’ll feel frustrated when your mind wanders to other subjects. Conversely, if you try to daydream at your desk, not only are you likely to feel boxed in by your surroundings, but you’re also wasting valuable time that would be better spent on work that only requires the Napoleonic virtues of thoroughness and patience. Inspiration can’t be forced, and you don’t know in advance if you’re better off being careful or inventive on any given day—but the amount of time that you’ll be seated provides an important clue. (You can also reverse the process, and arrange to be seated as little as possible on days when you hope to get some inventive thinking done. For most of us, unfortunately, this isn’t entirely under our control, which makes it all the more sensible to take advantage of such moments when they present themselves.) And it doesn’t need to be planned beforehand. If you’re at work on a problem and you’re not sure what kind of thinking you should be doing, you can look at yourself and ask: Am I sitting down right now? And that’s all the information you need.