Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


The unfinished lives


Yesterday, the New York Times published a long profile of Donald Knuth, the legendary author of The Art of Computer Programming. Knuth is eighty now, and the article by Siobhan Roberts offers an evocative look at an intellectual giant in twilight:

Dr. Knuth usually dresses like the youthful geek he was when he embarked on this odyssey: long-sleeved T-shirt under a short-sleeved T-shirt, with jeans, at least at this time of year…Dr. Knuth lives in Stanford, and allowed for a Sunday visitor. That he spared an entire day was exceptional—usually his availability is “modulo nap time,” a sacred daily ritual from 1 p.m. to 4 p.m. He started early, at Palo Alto’s First Lutheran Church, where he delivered a Sunday school lesson to a standing-room-only crowd.

This year marks the fiftieth anniversary of the publication of the first volume of Knuth’s most famous work, which is still incomplete. Knuth is busy writing the fourth installment, one fascicle at a time, although its most recent piece has been delayed “because he keeps finding more and more irresistible problems that he wants to present.” As Roberts writes: “Dr. Knuth’s exacting standards, literary and otherwise, may explain why his life’s work is nowhere near done. He has a wager with Sergey Brin, the co-founder of Google and a former student…over whether Mr. Brin will finish his Ph.D. before Dr. Knuth concludes his opus…He figures it will take another twenty-five years to finish The Art of Computer Programming, although that time frame has been a constant since about 1980.”

Knuth is a prominent example, although far from the most famous, of a literary and actuarial phenomenon that has grown increasingly familiar—an older author with a projected work of multiple volumes, published one book at a time, that seems increasingly unlikely to ever see completion. On the fiction side, the most noteworthy case has to be that of George R.R. Martin, who has been fielding anxious inquiries from fans for most of the last decade. (In an article that appeared seven long years ago in The New Yorker, Laura Miller quotes Martin, who was only sixty-three at the time: “I’m still getting e-mail from assholes who call me lazy for not finishing the book sooner. They say, ‘You better not pull a Jordan.’”) Robert A. Caro is still laboring over what he hopes will be the final volume of his biography of Lyndon Johnson, and mortality has become an issue not just for him, but for his longtime editor, as we read in Charles McGrath’s classic profile in the Times:

Robert Gottlieb, who signed up Caro to do The Years of Lyndon Johnson when he was editor in chief of Knopf, has continued to edit all of Caro’s books, even after officially leaving the company. Not long ago he said he told Caro: “Let’s look at this situation actuarially. I’m now eighty, and you are seventy-five. The actuarial odds are that if you take however many more years you’re going to take, I’m not going to be here.”

That was six years ago, and both men are still working hard. But sometimes a writer has no choice but to face the inevitable. When asked about the concluding fifth volume of his life of Picasso, with the fourth one still on the way, the biographer John Richardson said candidly: “Listen, I’m ninety-one—I don’t think I have time for that.”

I don’t have the numbers to back this up, but such cases—or at least the public attention that they inspire—seem to be growing more common these days, on account of some combination of lengthening lifespans, increased media coverage of writers at work, and a greater willingness from publishers to agree to multiple volumes in the first place. The subjects of such extended commitments tend to be monumental in themselves, in order to justify the total investment of the writer’s own lifetime, and expanding ambitions are often to blame for blown deadlines. Martin, Caro, and Knuth all increased the prospective number of volumes after their projects were already underway, or as Roberts puts it: “When Dr. Knuth started out, he intended to write a single work. Soon after, computer science underwent its Big Bang, so he reimagined and recast the project in seven volumes.” And this “recasting” seems particularly common in the world of biographies, as the author discovers more material that he can’t bear to cut. The first few volumes may have been produced with relative ease, but as the years pass and anticipation rises, the length of time it takes to write the next installment grows, until it becomes theoretically infinite. Such a radical change of plans, which can involve extending the writing process for decades, or even beyond the author’s natural lifespan, requires an indulgent publisher, university, or other benefactor. (John Richardson’s book has been underwritten by nothing less than the John Richardson Fund for Picasso Research, which reminds me of what Homer Simpson said after being informed that he suffered from Homer Simpson syndrome: “Oh, why me?”) And it may not be an accident that many of the examples that first come to mind are white men, who have the cultural position and privilege to take their time.

It isn’t hard to understand a writer’s reluctance to let go of a subject, the pressures on a book being written in plain sight, or the tempting prospect of working on the same project forever. And the image of such authors confronting their mortality in the face of an unfinished book is often deeply moving. One of the most touching examples is that of Joseph Needham, whose Science and Civilization in China may have undergone the most dramatic expansion of them all, from an intended single volume to twenty-seven and counting. As Kenneth Girdwood Robinson writes in a concluding posthumous volume:

The Duke of Edinburgh, Chancellor of the University of Cambridge, visited The Needham Research Institute, and interested himself in the progress of the project. “And how long will it take to finish it?” he enquired. On being given a rather conservative answer, “At least ten years,” he exclaimed, “Good God, man, Joseph will be dead before you’ve finished,” a very true appreciation of the situation…In his closing years, though his mind remained lucid and his memory astonishing, Needham had great difficulty even in moving from one chair to another, and even more difficulty in speaking and in making himself understood, due to the effect of the medicines he took to control Parkinsonism. But a secretary, working closely with him day by day, could often understand what he had said, and could read what he had written, when others were baffled.

Needham’s decline eventually became impossible to ignore by those who knew him best, as his biographer Simon Winchester writes in The Man Who Loved China: “It was suggested that, for the first time in memory, he take the day off. It was a Friday, after all: he could make it a long weekend. He could charge his batteries for the week ahead. ‘All right,’ he said. ‘I’ll stay at home.’” He died later that day, with his book still unfinished. But it had been a good life.

Into the West


A few months ago, I was on the phone with a trusted adviser to discuss some revisions to Astounding. We were focusing on the prologue, which I had recently rewritten from scratch to make it more accessible to readers who weren’t already fans of science fiction. Among other things, I’d been asked to come up with ways in which the impact of my book’s four subjects was visible in modern pop culture, and after throwing some ideas back and forth, my adviser asked me plaintively: “Couldn’t you just say that without John W. Campbell, we wouldn’t have Game of Thrones?” I was tempted to give in, but I ultimately didn’t—it just felt like too much of a stretch. (Which isn’t to say that the influence isn’t there. When a commenter on his blog asked whether his work had been inspired by the mythographer Joseph Campbell, George R.R. Martin replied: “The Campbell that influenced me was John W., not Joseph.” And that offhand comment was enough of a selling point that I put it in the very first sentence of my book proposal.) Still, I understood the need to frame the story in ways that would resonate with a mainstream readership, and I thought hard about what other reference points I could honestly provide. Star Trek was an easy one, along with such recent movies as Interstellar and The Martian, but the uncomfortable reality is that much of what we call science fiction in film and television has more to do with Star Wars. But I wanted to squeeze in one last example, and I finally settled on this line about Campbell: “For more than three decades, an unparalleled series of visions of the future passed through his tiny office in New York, where he inaugurated the main sequence of science fiction that runs through works from 2001 to Westworld.”

As the book is being set in type, I’m still comfortable with this sentence as it stands, although there are a few obvious qualifications that ought to be made. Westworld, of course, is based on a movie written and directed by Michael Crichton, whose position in the history of the genre is a curious one. As I’ve written elsewhere, Crichton was an unusually enterprising author of paperback thrillers who found himself with an unexpected blockbuster in the form of The Andromeda Strain. It was his sixth novel, and his first in hardcover, and it seems to have benefited enormously from the input of editor Robert Gottlieb, who wrote in his memoir Avid Reader:

The Andromeda Strain was a terrific concept, but it was a mess—sloppily plotted, underwritten, and worst of all, with no characterization whatsoever. [Crichton’s] scientists were beyond generic—they lacked all human specificity; the only thing that distinguished some of them from the others was that some died and some didn’t. I realized right away that with his quick mind, swift embrace of editorial input, and extraordinary work habits he could patch the plot, sharpen the suspense, clarify the science—in fact, do everything necessary except create convincing human beings. (He never did manage to; eventually I concluded that he couldn’t write about people because they just didn’t interest him.) It occurred to me that instead of trying to help him strengthen the human element, we could make a virtue of necessity by stripping it away entirely; by turning The Andromeda Strain from a documentary novel into a fictionalized documentary. Michael was all for it—I think he felt relieved.

The result, to put it mildly, did quite well, and Crichton quickly put its lessons to work. But it’s revealing that the flaws that Gottlieb cites—indifferent plotting, flat writing, and a lack of real characterization—are also typical of even some of the best works of science fiction that came out of Campbell’s circle. Crichton’s great achievement was to focus relentlessly on everything else, especially readability, and it’s fair to say that he did a better job of it than most of the writers who came up through Astounding and Analog. He was left with the reputation of a carpetbagger, and his works may have been too square and fixated on technology to ever be truly fashionable. Yet a lot of that reputation can be traced back to the name on the cover. In his story “Pierre Menard, Author of the Quixote,” Jorge Luis Borges speaks of enriching “the slow and rudimentary act of reading by means of a new technique—the technique of deliberate anachronism and fallacious attribution.” In this case, the technique is genuinely useful. I have a hunch that if The Terminal Man, Congo, and Sphere had been attributed on their first release to Robert A. Heinlein, they would be regarded as minor classics. They’re certainly better than many of the books that Heinlein was actually writing around the same time. And if I’m being honest, I should probably confess that I’d rather read Jurassic Park again than any of Asimov’s novels. (As part of my research for this book, I dutifully made my way through Asimov’s novelization of Fantastic Voyage, which came out just three years before The Andromeda Strain, and his fumbling of that very Crichtonesque premise only reminded me of how good at this sort of thing Crichton really was.) If Crichton had been born thirty years earlier, John W. Campbell would have embraced him like a lost son, and he might well have written a better movie than Destination Moon.

At its best, the television version of Westworld represents an attempt to reconcile Crichton’s gifts for striking premises and suspense with the more introspective mode of the genre to which he secretly belongs. (It’s no accident that Jonathan Nolan had been developing it in parallel with Foundation.) This balance hasn’t always been easy to manage, and last night’s premiere suggests that it can only become more difficult going forward. Westworld has always seemed defined by the pattern of forces that were acting on it—its source material, its speculative and philosophical ambitions, and the pressure of being a flagship drama on HBO. It now also has to deal with the legacy of its own first season, which set a precedent for playing with time, as well as the scrutiny of viewers who figured it out prematurely. The stakes here are established early on, with Bernard awakening on a beach in a sequence that seems like a nod to the best film by Nolan’s brother, and this time around, the parallel timelines are put front and center. Yet the strain occasionally shows. The series is still finding itself, with characters, like Dolores, who seem to be thinking through their story arcs out loud. It’s overly insistent on its violence and nudity, but it’s also cerebral and detached, with little possibility of the kind of real emotional pain that the third season of Twin Peaks was able to inflict. I don’t know if the center will hold. Yet it’s also possible that these challenges were there from the beginning, as the series tried to reconcile Crichton’s tricks with the tradition of science fiction that it clearly honors. I still believe that this show is in the main line of the genre’s development. Its efforts to weave together its disparate influences strike me as worthwhile and important. And I hope that it finds its way home.

Thinking on your feet


The director Elia Kazan, whose credits included A Streetcar Named Desire and On the Waterfront, was proud of his legs. In his memoirs, which the editor Robert Gottlieb calls “the most gripping and revealing book I know about the theater and Hollywood,” Kazan writes of his childhood:

Everything I wanted most I would have to obtain secretly. I learned to conceal my feelings and to work to fulfill them surreptitiously…What I wanted most I’d have to take—quietly and quickly—from others. Not a logical step, but I made it at a leap. I learned to mask my desires, hide my truest feeling; I trained myself to live in deprivation, in silence, never complaining, never begging, in isolation, without expecting kindness or favors or even good luck…I worked waxing floors—forty cents an hour. I worked at a small truck farm across the road—fifty cents an hour. I caddied every afternoon I could at the Wykagyl Country Club, carrying the bags of middle-aged women in long woolen skirts—a dollar a round. I spent nothing. I didn’t take trolleys; I walked. Everywhere. I have strong leg muscles from that time.

The italics are mine, but Kazan emphasized his legs often enough on his own. In an address that he delivered at a retrospective at Wesleyan University in 1973, long after his career had peaked, he told the audience: “Ask me how with all that knowledge and all that wisdom, and all that training and all those capabilities, including the strong legs of a major league outfielder, how did I manage to mess up some of the films I’ve directed so badly?”

As he grew older, Kazan’s feelings about his legs became inseparable from his thoughts on his own physical decline. In an essay titled “The Pleasures of Directing,” which, like the address quoted above, can be found in the excellent book Kazan on Directing, Kazan observes sadly: “They’ve all said it. ‘Directing is a young man’s game.’ And time passing proves them right.” He continues:

What goes first? With an athlete, the legs go first. A director stands all day, even when he’s provided with chairs, jeeps, and limos. He walks over to an actor, stands alongside and talks to him; with a star he may kneel at the side of the chair where his treasure sits. The legs do get weary. Mine have. I didn’t think it would happen because I’ve taken care of my body, always exercised. But I suddenly found I don’t want to play singles. Doubles, okay. I stand at the net when my partner serves, and I don’t have to cover as much ground. But even at that…

I notice also that I want a shorter game—that is to say also, shorter workdays, which is the point. In conventional directing, the time of day when the director has to be most able, most prepared to push the actors hard and get what he needs, usually the close-ups of the so-called “master scene,” is in the afternoon. A director can’t afford to be tired in the late afternoon. That is also the time—after the thoughtful quiet of lunch—when he must correct what has not gone well in the morning. He better be prepared, he better be good.

As far as artistic advice goes, this is as close to the real thing as it gets. But it can only occur to an artist who can no longer take for granted the energy on which he has unthinkingly relied for most of his life.

Kazan isn’t the only player in the film industry to draw a connection between physical strength—or at least stamina—and the medium’s artistic demands. Guy Hamilton, who directed Goldfinger, once said: “To be a director, all you need is a hide like a rhinoceros—and strong legs, and the ability to think on your feet…Talent is something else.” None other than Christopher Nolan believes so much in the importance of standing that he’s institutionalized it on his film sets, as Mark Rylance recently told The Independent: “He does things like he doesn’t like having chairs on set for actors or bottles of water, he’s very particular…[It] keeps you on your toes, literally.” Walter Murch, meanwhile, noted that a film editor needed “a strong back and arms” to lug around reels of celluloid, which is less of a concern in the days of digital editing, but still worth bearing in mind. Murch famously likes to stand while editing, like a surgeon in the operating room:

Editing is sort of a strange combination of being a brain surgeon and a short-order cook. You’ll never see those guys sitting down on the job. The more you engage your entire body in the process of editing, the better and more balletic the flow of images will be. I might be sitting when I’m reviewing material, but when I’m choosing the point to cut out of a shot, I will always jump out of the chair. A gunfighter will always stand, because it’s the fastest, most accurate way to get to his gun. Imagine High Noon with Gary Cooper sitting in a chair. I feel the fastest, most accurate way to choose the critically important frame I will cut out of a shot is to be standing. I have kind of a gunfighter’s stance.

And as Murch suggests, this applies as much to solitary craftsmen as it does to the social and physical world of the director. Philip Roth, who worked at a lectern, claimed that he paced half a mile for every page that he wrote, while the mathematician Robert P. Langlands reflected: “[My] many hours of physical effort as a youth also meant that my body, never frail but also initially not particularly strong, has lasted much longer than a sedentary occupation might have otherwise permitted.” Standing and walking can be a proxy for mental and moral acuity, as Bertrand Russell implied so memorably:

Our mental makeup is suited to a life of very severe physical labor. I used, when I was younger, to take my holidays walking. I would cover twenty-five miles a day, and when the evening came I had no need of anything to keep me from boredom, since the delight of sitting amply sufficed. But modern life cannot be conducted on these physically strenuous principles. A great deal of work is sedentary, and most manual work exercises only a few specialized muscles. When crowds assemble in Trafalgar Square to cheer to the echo an announcement that the government has decided to have them killed, they would not do so if they had all walked twenty-five miles that day.

Such energy, as Kazan reminds us, isn’t limitless. I still think of myself as relatively young, but I don’t have the raw mental or physical resources that I did fifteen years ago, and I’ve had to come up with various tricks—what a pickup basketball player might call “old-man shit”—to maintain my old levels of productivity. I’ve written elsewhere that certain kinds of thinking are best done sitting down, but there’s also a case to be made for thinking on your feet. Standing is the original power pose, and perhaps the only one likely to have any real effects. And it’s in the late afternoons, both of a working day and of an entire life, that you need to stand and deliver.

Science and civilization


Over the last week, I picked up two books—at the annual Newberry Library and Oak Park Public Library book sales, which are always a high point of my year—that I’d been hoping to find for a long time. One is a single volume, Civil Engineering and Nautics, of Joseph Needham’s landmark Science and Civilization in China, which currently consists of twenty-seven huge books, all of which I unreasonably hope to own one day. The other is a slim fascicle, or paperbound installment from a work in progress, from Donald Knuth’s The Art of Computer Programming, which, if we’re lucky, will release its fourth volume sometime in the next decade. These two projects are rather different in scale, but remarkably similar in their conception and incubation. Needham worked on his book for close to half a century without finishing it, and Knuth has been laboring on his for even longer, with no obvious end in sight. I’ve been intrigued by such grand projects for most of my life, but I’ve become even more interested after embarking on my own venture into nonfiction. Replace “computer programming” with “science fiction” and “discovered” with “written,” and what Knuth once said in an interview gets very close to my attitude two years ago when I started writing Astounding:

At the time, everybody I knew who could write a book summarizing what was known about computer programming had discovered quite a lot of the ideas themselves…By contrast, I hadn’t really discovered anything new by myself at that point. I was just a good writer…I had this half-conceited and half-unconceited view that I could explain it more satisfactorily than the others because of my lack of bias. I didn’t have any axes to grind but my own.

Knuth concludes: “Then, of course, as I started to write things I naturally discovered one or two new things as I went, and now I am just as biased as anybody.” Which pretty much sums up my experience, too.

And what really fascinates me about both projects is how monstrously these tomes grew past their projected dimensions, both in space and in time. Both Needham and Knuth thought at first that their work would fit within a single volume, and although they each expanded it to the magic number of seven, neither seems to have grasped just how long it would take. Knuth recalls:

My original motivation was to write a text about how to write compilers, so I began drafting chapters. I was seriously planning to finish the book before my son was born…In June 1965, I had finally finished the first draft of the twelve chapters. It amounted to three thousand handwritten pages of manuscript…I figured about five pages of my handwriting would be about one page of a book.

As it turned out, he was a little off: the real proportion was one and a half handwritten pages to a single page in type, which meant that he had already written two thousand pages without even getting past the subject of compilers. Needham had a similar moment of clarity. As Simon Winchester writes in his biography The Man Who Loved China:

Needham had decreed early on in the process, as he watched each volume begin to swell and threaten to burst out of its covers, that no one volume should be “too big for a man to read comfortably in his bath.” But it was happening nonetheless…One book became two, three, or four. Volume V, a special case, became not five, but thirteen formal subsidiary parts, each one of them big and complicated enough to be made into a separate, self-standing, and equally enormous new volume of its own.

It’s frankly hard to imagine reading any of these expensive books in the tub, but Needham says elsewhere, more realistically, that critics found the volumes “too heavy and bulky for meditative evening reading,” which led to the work being repeatedly subdivided.

The Art of Computer Programming was released by a commercial educational publisher, Addison-Wesley, but it isn’t surprising that most such books tend to appear at academic presses, which are the only institutions capable of sustaining a project that lasts for decades. (Their sole competition here might be the Catholic Church, which has been underwriting a critical edition of the works of St. Thomas Aquinas since 1879. They’re about halfway through.) Winchester refers in passing to “the beleaguered Cambridge University Press, which was obliged to tolerate the constant expansion of the project,” and for the full picture, you can turn to the book A Skeptic Among Scholars, by August Frugé, the director of the University of California Press. He writes of The Plan of St. Gall, a three-volume monument of scholarship that is probably the most beautiful book I own:

The St. Gall manuscript…was said in 1960 to be in semifinal draft, about one hundred and fifty pages in length. When approved by the Editorial Committee and accepted by me in 1967, it came to several hundred typed pages, about right for a single quarto volume. As the work moved through the production process during the next twelve years, we paused every now and then to call for new estimates of size and cost, and each time discovered that new sections had been added, along with a few dozen new diagrams and drawings.

At one point, concerns about cost threatened to derail the whole enterprise, and Frugé retired before the three huge folios were published. James H. Clark, his successor, saw it to completion, writing later: “But what is a university press for if not to take these kinds of risks, make these investments, and publish books that make a difference?” Aside from Knuth, one of the few comparable examples on the commercial side must be Robert Caro’s The Years of Lyndon Johnson, which was originally planned as three volumes to be written over about six years. Forty years and four books later, Caro still isn’t done, and the fact that he has been allowed to keep working at the same methodical pace is a tribute to his longtime editor Robert Gottlieb at Knopf.

And if there’s one key takeaway from the examples I’ve mentioned, it’s that none of these authors set out to devote their lives to these projects—they all thought at first that they could finish within a few years. Knuth recalls:

It gradually dawned on me how large a project this was going to be. If I had realized that at the beginning, I wouldn’t have been foolish enough to start; I wouldn’t have dared to tackle such a thing…[But] I had collected so much material that I felt it was my duty to continue with the project even though it would take a lot longer than I had originally expected.

You also realize that you can’t explain the subject at hand without covering a lot of other material first. Caro treats his books on Johnson as windows onto such vistas as local politics, Texas, and the Senate, which is a big part of their appeal. (The equivalent in my case would be deciding that I couldn’t write the life of John W. Campbell in a comprehensible form without telling the entire history of science fiction, too, which might well be true.) Frugé, perhaps to his credit, ventures a more cynical reading:

In my skeptical and perhaps scatterbrained way, I sometimes wonder how a research scholar can work on the same project decade after decade and retain faith in its intellectual importance. Perhaps some do not, and that is why their books are never completed. But we can also observe an opposite phenomenon. As the years go by the object or document for study may swell and expand in importance until—until, for example, “The Plan of St. Gall is…one of the most fascinating creations of the human mind.”

He makes a good point. The cycle feeds on itself, with the work expanding in scope to justify the amount of time it takes. It’s human nature, and there’s something a little absurd about it. But it’s also the only way we get art, science, or civilization.

Cutty Sark and the semicolon



Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 22, 2015.

In an interview that was first published in The Paris Review, the novelist Herbert Gold asked Vladimir Nabokov if an editor had ever offered him any useful advice. This is what Nabokov said in response:

By “editor” I suppose you mean proofreader. Among these I have known limpid creatures of limitless tact and tenderness who would discuss with me a semicolon as if it were a point of honor—which, indeed, a point of art often is. But I have also come across a few pompous avuncular brutes who would attempt to “make suggestions” which I countered with a thunderous “stet!”

I’ve always adored that thunderous stet, which tells us so much about Nabokov and his imperious resistance to being edited by anybody. Today, however, I’m more interested in the previous sentence. A semicolon, as Nabokov puts it, can indeed be a point of honor. Nabokov was perhaps the most painstaking of all modern writers, and it’s no surprise that the same perfectionism that produced such conceptual and structural marvels as Lolita and Pale Fire would filter down to the smallest details. But I imagine that even ordinary authors can relate to how a single punctuation mark in a manuscript can start to loom as large as the finger of God on the Sistine Chapel ceiling.

And there’s something about the semicolon that seems to inspire tussles between writers and their editors—or at least allows it to stand as a useful symbol of the battles that can occur during the editorial process. Here’s an excerpt from a piece by Charles McGrath in The New York Times Magazine about the relationship between Robert Caro, author of The Years of Lyndon Johnson, and his longtime editor Robert Gottlieb:

“You know that insane old expression, ‘The quality of his defect is the defect of his quality,’ or something like that?” Gottlieb asked me. “That’s really true of Bob. What makes him such a genius of research and reliability is that everything is of exactly the same importance to him. The smallest thing is as consequential as the biggest. A semicolon matters as much as, I don’t know, whether Johnson was gay. But unfortunately, when it comes to English, I have those tendencies, too, and we could go to war over a semicolon. That’s as important to me as who voted for what law.”

It’s possible that the semicolon keeps cropping up in such stories because its inherent ambiguity lends itself to disagreement. As Kurt Vonnegut once wrote: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.” And I’ve more or less eliminated semicolons from my own work for much the same reason.


But the larger question here is why artists fixate on things that even the most attentive reader would pass over without noticing. On one level, you could take a fight over a semicolon as an illustration of the way that the creative act—in which the artist is immersed in the work for months on end—tends to turn molehills into mountains. Here’s one of my favorite stories about the making of Raging Bull:

One night, when the filmmakers were right up against the deadline to make their release date, they were working on a nothing little shot that takes place in a nightclub, where a minor character turns to the bartender and orders a Cutty Sark. “I can’t hear what he’s saying,” [Martin Scorsese] said. Fiddling ensued—extensive fiddling—without satisfying him. [Producer Irwin] Winkler, who was present, finally deemed one result good enough and pointed out that messengers were standing by to hand-carry release prints to the few theaters where the picture was about to premiere. At which point, Scorsese snapped. “I want my name taken off the picture,” he cried—which bespeaks his devotion to detail. It also bespeaks his exhaustion at the end of Raging Bull, not to mention the craziness that so often overtakes movies as they wind down. Needless to say, he was eventually placated. And you can more or less hear the line in the finished print.

And you could argue that this kind of microscopic attention is the only thing that can lead to a work that succeeds on the largest possible scale.

But there’s yet another story that gets closer to the truth. In Existential Errands, Norman Mailer describes a bad period in his life—shortly after he was arrested for stabbing his second wife, Adele—in which he found himself descending into alcoholism and unable to work. His only source of consolation was the scraps of paper, “little crossed communications from some wistful outpost of my mind,” that he would find in his jacket pocket after a drunken night. Mailer writes of these poems:

I would go to work, however, on my scraps of paper. They were all I had for work. I would rewrite them carefully, printing in longhand and ink, and I would spend hours whenever there was time going over these little poems…And since I wasn’t doing anything else very well in those days, I worked the poems over every chance I had. Sometimes a working day would go by, and I might put a space between two lines and remove a word. Maybe I was mending.

Which just reminds us that a seemingly minuscule change can be the result of a prolonged confrontation with the work as a whole. You can’t obsess over a semicolon without immersing yourself in the words around it, and there are times when you need such a focal point to structure your engagement with the rest. It’s a little like what is called a lakshya in yoga: the tiny spot on the body or in the mind on which you concentrate while meditating. In practice, the lakshya can be anything or nothing, but without it, your attention tends to drift. In art, it can be a semicolon, a word, or a line about Cutty Sark. It may not be much in itself. But when you need to tether yourself to something, even a semicolon can be a lifeline.

Assisted living


If you’re a certain kind of writer, whenever you pick up a new book, instead of glancing at the beginning or opening it to a random page, you turn immediately to the acknowledgments. Once you’ve spent any amount of time trying to get published, that short section of fine print starts to read like a gossip column, a wedding announcement, and a high school yearbook all rolled into one. For most writers, it’s also the closest they’ll ever get to an Oscar speech, and many of them treat it that way, with loving tributes and inside jokes attached to every name. It’s a chance to thank their editors and agents—while the unagented reader suppresses a twinge of envy—and to express gratitude to various advisers, colonies, and fellowships. (The most impressive example I’ve seen has to be in The Lisle Letters by Muriel St. Clare Byrne, which pays tribute to the generosity of “Her Majesty Queen Elizabeth II.”) But if there’s one thing I’ve learned from the acknowledgments that I’ve been reading recently, it’s that I deserve an assistant. It seems as if half the nonfiction books I see these days thank a whole squadron of researchers, inevitably described as “indefatigable,” who live in libraries, work through archives and microfilm reels, and pass along the results to their grateful employers. If the author is particularly famous, like Bob Woodward or Kurt Eichenwald, the acknowledgment can sound like a letter of recommendation: “I was startled by his quick mind and incomparable work ethic.” Sometimes the assistants are described in such glowing terms that you start to wonder why you aren’t reading their books instead. And when I’m trying to decipher yet another illegible scan of a carbon copy of a letter written fifty years ago on a manual typewriter, I occasionally wish that I could outsource it to an intern.

But there are also good reasons for doing everything yourself, at least at the early stages of a project. In his book The Integrity of the Body, the immunologist Sir Frank Macfarlane Burnet says that there’s one piece of advice that he always gives to “ambitious young research workers”: “Do as large a proportion as possible of your experiments with your own hands.” In Discovering, Robert Scott Root-Bernstein expands on this point:

When you climb those neighboring hills make sure you do your own observing. Many scientists assign all experimental work to lab techs and postdocs. But…only the prepared mind will note and attach significance to an anomaly. Each individual possesses a specific blend of personality, codified science, science in the making, and cultural biases that will match particular observations. If you don’t do your own observing, the discovery won’t be made. Never delegate research.

Obviously, there are situations in which you can’t avoid delegating the work to some degree. But I think Root-Bernstein gets at something essential when he frames it in terms of recognizing anomalies. If you don’t sift through the raw material yourself, it’s difficult to know what is unusual or important, and even if you have a bright assistant who will flag any striking items for your attention, it’s hard to put them in perspective. As I’ve noted elsewhere, drudgery can be an indispensable precursor to insight. You’re more likely to come up with worthwhile connections if you’re the one mining the ore.

This is why the great biographers and historians often seem like monsters of energy. I never get tired of quoting the advice that Alan Hathaway gave to the young Robert Caro at Newsday: “Turn every goddamn page.” Caro took this to heart, noting proudly of one of the archives he consulted: “The number [of pages] may be in the area of forty thousand. I don’t know how many of these pages I’ve read, but I’ve read a lot of them.” And it applies to more than just what you read, as we learn from a famous story about Caro and his editor Robert Gottlieb:

Gottlieb likes to point to a passage fairly early in The Power Broker describing Moses’ parents one morning in their lodge at Camp Madison, a fresh-air charity they established for poor city kids, picking up the Times and reading that their son had been fined $22,000 for improprieties in a land takeover. “Oh, he never earned a dollar in his life, and now we’ll have to pay this,” Bella Moses says.

“How do you know that?” Gottlieb asked Caro. Caro explained that he tried to talk to all of the social workers who had worked at Camp Madison, and in the process he found one who had delivered the Moseses’ paper. “It was as if I had asked him, ‘How do you know it’s raining out?’”

This is the kind of thing that you’d normally ask your assistant to do, if it occurred to you at all, and it’s noteworthy that Caro has kept at it long after he could have hired an army of researchers. Instead, he relies entirely on his wife Ina, whom he calls “the only person besides myself who has done research on the four volumes of The Years of Lyndon Johnson or on the biography of Robert Moses that preceded them, the only person I would ever trust to do so.” And perhaps a trusted spouse is the best assistant you could ever have.

Of course, there are times when an assistant is necessary, especially if, unlike Caro, you’re hoping to finish your project in fewer than forty years. But it’s often the assistant who benefits. As one of them recalled:

I was working for [Professor] Bernhard J. Stern…and since he was writing a book on social resistance to technological change, he had me reading a great many books that might conceivably be of use to him. My orders were to take note of any passages that dealt with the subject and to copy them down.

It was a liberal education for me and I was particularly struck by a whole series of articles by astronomer Simon Newcomb, which I read at Stern’s direction. Newcomb advanced arguments that demonstrated the impossibility of heavier-than-air flying machines, and maintained that one could not be built that would carry a man. While these articles were appearing, the Wright brothers flew their plane. Newcomb countered with an article that said, essentially, “Very well, one man, but not two.”

Every significant social advance roused opposition on the part of many, it seemed. Well, then, shouldn’t space flight, which involved technological advances, arouse opposition too?

The assistant in question was Isaac Asimov, who used this idea as the basis for his short story “Trends,” which became his first sale to John W. Campbell. It launched his career, and the rest is history. And that’s part of the reason why, when I think of my own book, I say to myself: “Very well, one man, but not two.”

The chosen one



After the two novels [Chaim Potok] had written were reduced to one and the usual editorial work was accomplished, there remained a major problem: the title. I can’t remember what the original one was, but it was hopelessly fancy. Some books arrive with perfect titles, others don’t, and this was a severe example of the latter kind. No one could come up with anything plausible: The book had so many aspects that it seemed impossible to find something that reflected the whole. Very late in the day we still had no title, and a jacket had to be designed and the book announced.

What happened was one of the very few miracles I’ve ever stumbled into—maybe the only one. I was brooding on the problem as I was walking down the hall from my office to the men’s room when I ran into a man named Arthur Sheekman…Arthur was a screenwriter—he had written a bunch of the Marx Brothers movies, starting with Monkey Business, as well as movies for Eddie Cantor, Danny Kaye, and others—and he possessed a friendly elegance and refinement that made him a favorite on our floor. “You look worried,” he said to me as we passed each other in the hall. “What’s the problem?” So I told him I was going nuts trying to find a title for a book about boys in wartime Brooklyn, Hasidim, and baseball. “Call it The Chosen,” he said casually, and walked on. Literary history was made because I had to take a leak.

Robert Gottlieb, Avid Reader

