Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘The New York Times Magazine’

The purity test

with one comment

Earlier this week, The New York Times Magazine published a profile by Taffy Brodesser-Akner of the novelist Jonathan Franzen. It’s full of fascinating moments, including a remarkable one that seems to have happened entirely by accident—the reporter was in the room when Franzen received a pair of phone calls, including one from Daniel Craig, to inform him that production had halted on the television adaptation of his novel Purity. Brodesser-Akner writes: “Franzen sat down and blinked a few times.” That sounds about right to me. And the paragraph that follows gets at something crucial about the writing life, in which the necessity of solitary work clashes with the pressure to put its fruits at the mercy of the market:

He should have known. He should have known that the bigger the production—the more people you involve, the more hands the thing goes through—the more likely that it will never see the light of day resembling the thing you set out to make in the first place. That’s the real problem with adaptation, even once you decide you’re all in. It just involves too many people. When he writes a book, he makes sure it’s intact from his original vision of it. He sends it to his editor, and he either makes the changes that are suggested or he doesn’t. The thing that we then see on shelves is exactly the thing he set out to make. That might be the only way to do this. Yes, writing a novel—you alone in a room with your own thoughts—might be the only way to get a maximal kind of satisfaction from your creative efforts. All the other ways can break your heart.

To be fair, Franzen’s status is an unusual one, and even successful novelists aren’t always in the position of taking for granted the publication of “exactly the thing he set out to make.” (In practice, it’s close to all or nothing. In my experience, the novel that you see on store shelves mostly reflects what the writer wanted, while the ones in which the vision clashes with those of other stakeholders in the process generally don’t get published at all.) And I don’t think I’m alone when I say that some of the most interesting details that Brodesser-Akner provides are financial. A certain decorum still surrounds the reporting of sales figures in the literary world, so there’s a real frisson in seeing them laid out like this:

And, well, sales of his novels have decreased since The Corrections was published in 2001. That book, about a Midwestern family enduring personal crises, has sold 1.6 million copies to date. Freedom, which was called a “masterpiece” in the first paragraph of its New York Times review, has sold 1.15 million since it was published in 2010. And 2015’s Purity, his novel about a young woman’s search for her father and the story of that father and the people he knew, has sold only 255,476.

For most writers, selling a quarter of a million copies of any book would exceed their wildest dreams. Having written one of the greatest outliers of the last twenty years, Franzen is simply reverting to a very exalted mean. But there’s still a lot to unpack here.

For one thing, while Purity was a commercial disappointment, it doesn’t seem to have been an unambiguous disaster. According to Publishers Weekly, its first printing—which is where you can see a publisher calibrating its expectations—came to around 350,000 copies, which wasn’t even the largest print run for that month. (That honor went to David Lagercrantz’s The Girl in the Spider’s Web, which had half a million copies, while a new novel by the likes of John Grisham can run to over a million.) I don’t know what Franzen was paid in advance, but the loss must have fallen well short of a book like Tom Wolfe’s Back to Blood, for which he received $7 million and sold 62,000 copies, meaning that his publisher paid over a hundred dollars for every copy that someone actually bought. And any financial hit would have been modest compared to the prestige of keeping a major novelist on one’s list, which is unquantifiable, but no less real. If there’s one thing that I’ve learned about publishing over the last decade, it’s that it’s a lot like the movie industry, in which apparently inexplicable commercial and marketing decisions are easier to understand when you consider their true audience. In many cases, when they buy or pass on a book, editors aren’t making decisions for readers, but for other editors, and they’re very conscious of what everyone in their imprint thinks. A readership is an abstraction, except when quantified in sales, but editors have their everyday judgment calls reflected back on them by the people they see every day. Giving up a writer like Franzen might make financial sense, but it would be devastating to Farrar, Straus and Giroux, to say nothing of the relationship that can grow between an editor and a prized author over time.

You find much the same dynamic in Hollywood, in which some decisions are utterly inexplicable until you see them as a manifestation of office politics. In theory, a film is made for moviegoers, but the reactions of the producer down the hall are far more concrete. The difference between publishing and the movies is that the latter publish their box office returns, often in real time, while book sales remain opaque even at the highest level. And it’s interesting to wonder how both industries might differ if their approaches were more similar. After years of work, the success of a movie can be determined by the Saturday morning after its release, while a book usually has a little more time. (The exception is when a highly anticipated title doesn’t make it onto the New York Times bestseller list, or falls off it with alarming speed. The list doesn’t disclose any sales figures, which means that success is relative, not absolute—and which may be a small part of the reason why writers seldom wish one another well.) In the absence of hard sales, writers establish the pecking order with awards, reviews, and the other signifiers that have allowed Franzen to assume what Brodesser-Akner calls the mantle of “the White Male Great American Literary Novelist.” But the real takeaway is how narrow a slice of the world this reflects. Even if we place the most generous interpretation imaginable onto Franzen’s numbers, it’s likely that well under one percent of the American population has bought or read any of his books. You’ll find roughly the same number on any given weeknight playing HQ Trivia. If we acknowledged this more widely, it might free writers to return to their proper cultural position, in which the difference between a bestseller and a disappointment fades rightly into irrelevance. Who knows? They might even be happier.

Written by nevalalee

June 28, 2018 at 7:49 am


The homecoming king

leave a comment »

In my last year at college, I spent an inordinate amount of time trying to figure out how to come back from the dead. I had decided to write my senior thesis about Amphiaraus, an obscure figure from Greek literature best known for a brief appearance in the eighth Pythian ode of Pindar. (When you’re majoring in a field that has been generating articles, term papers, and dissertations with monotonous regularity for centuries, you take your subjects wherever you can find them.) Amphiaraus was the legendary king of Argos, proverbial for his wisdom, who joined the doomed assault of the Seven Against Thebes, although he knew that it would end in tragedy. Because he was beloved by the gods, at the moment that he was about to die in battle, the earth opened up beneath him, swallowing him whole. Much of my thesis was devoted to describing his afterlife as an object of cult veneration, where he appears to have persisted as a chthonic oracle, delivering dreams to pilgrims at his sanctuary as they slept on the ground. He also occasionally returned in person, at least in literature—in Pindar’s ode, he’s evidently some kind of ghost or revenant, since he appears in a speaking role at a point in the narrative at which he should have been long dead. This is striking in itself, because in the ancient Greek conception of the underworld, most men and women survive only as shades, shadowy figures without any trace of memory or personality. In technical terms, when we die, we lose our noos, which can roughly be regarded as the part of the soul responsible for conscious thought. And the remarkable thing about Amphiaraus is that he seems to retain his noos even after his death, as an oracular hero who remains fully aware and capable of returning to our world when necessary.

As I tried to define what made Amphiaraus special, I went down a linguistic rabbit hole in which I was perhaps overly influenced by a curious book titled The Myth of Return in Early Greek Epic. Its argument, presented by the linguist Douglas Frame, is that the word noos, or “mind,” is connected to nostos, or “return,” the central theme of the Odyssey. (It’s where we get the word “nostalgia,” which combines nostos with algos, or “pain.”) The quality that allows Odysseus to make his way home to Ithaca is his intelligence—which, by extension, is also the attribute that enables Amphiaraus to return from the dead. A rumor of this theory somehow reached John Updike, of all people, who wrote a story called “Cruise” that offered a portrait of a lecturer on a cruise ship that I’m still convinced was inspired by one of my professors, since he was literally the only other man in the world, besides Douglas Frame, who sounded like this:

His sallow triangular face was especially melancholy, lit from beneath by the dim lectern bulb. The end of the journey meant for him the return to his university—its rosy-cheeked students invincible in their ignorance, its demonic faculty politics, its clamorous demands for ever-higher degrees of political correctness and cultural diversity. “ΚΡΙΝΩ,” he wrote on the blackboard, pronouncing, “krino—to discern, to be able to distinguish the real from the unreal. To do this, we need noos, mind, consciousness.” He wrote, then, “ΝΟΟΣ.” His face illumined from underneath was as eerie as that of a jack-in-the-box or a prompter hissing lines to stymied thespians. “We need no-os,” he pronounced, scrabbling with his invisible chalk in a fury of insertion, “to achieve our nos-tos, our homecoming.” He stood aside to reveal the completed word: ΝΟΣΤΟΣ. In afterthought he rapidly rubbed out two of the letters, created ΠΟΝΤΟΣ, and added with a small sly smile, “After our crossing together of the sea, the pontos.”

In the end, I moved away from this line of reasoning, and I spent most of my thesis developing arguments based on readings of words like poikilos and polyplokos, which described the quality of mind—a kind of flexibility and resourcefulness—that was necessary to achieve this return, whether to Ithaca or to the world of the living. Until recently, I hadn’t thought about this for years. Over the weekend, however, I read a wonderful profile in The New York Times Magazine by Wyatt Mason of the classicist Emily Wilson, who has published a new translation of the Odyssey. Much of the article is devoted to a discussion of the word polytropos, which appears in the very first line of the poem as a description of Odysseus himself. Wilson explains:

The prefix poly means “many” or “multiple.” Tropos means “turn.” “Many” or “multiple” could suggest that he’s much turned, as if he is the one who has been put in the situation of having been to Troy, and back, and all around, gods and goddesses and monsters turning him off the straight course that, ideally, he’d like to be on. Or, it could be that he’s this untrustworthy kind of guy who is always going to get out of any situation by turning it to his advantage. It could be that he’s the turner…So the question of whether he’s the turned or the turner: I played around with that a lot in terms of how much should I be explicit about going for one versus the other. I remember that being one of the big questions I had to start off with.

And it’s precisely this notion of slipperiness and changeability that I often saw in descriptions of Amphiaraus, who, like Odysseus, has affinities with the god Hermes—the crosser of borders, the conductor of souls, the trickster.

The same qualities, of course, also tend to be present in writers, poets, scholars, and all those who, in W.H. Auden’s words, “live by their wits.” This may be why translators of the Odyssey have been so preoccupied with polytropos, which stands as a signal at the beginning of the poem of the intelligence that you need to make it all the way to the end. As Mason writes:

You might be inclined to suppose that, over the course of nearly half a millennium, we must have reached a consensus on the English equivalent for an old Greek word, polytropos. But to consult Wilson’s sixty some predecessors, living and dead, is to find that consensus has been hard to come by. Chapman starts things off, in his version, with “many a way/Wound with his wisdom”; John Ogilby counters with the terser “prudent”; Thomas Hobbes evades the word, just calling Odysseus “the man.” Quite a range, and we’ve barely started.

Mason lists dozens of variants, including Alexander Pope’s “for wisdom’s various arts renown’d”; H.F. Cary’s “crafty”; William Sotheby’s “by long experience tried”; Theodore Buckley’s “full of resources”; the Rev. Lovelace Bigge-Wither’s “many-sided-man”; Roscoe Mongan’s “skilled in expedients”; and T.E. Lawrence’s “various-minded.” Perhaps for sentimental reasons, I’m partial to Lawrence’s version, which recalls my old favorites poikilos and polyplokos in evoking a sort of visual variety or shiftiness, like the speckled scales of a snake. And Wilson? She clearly thought long and hard on the matter. And when I read her solution, I felt a shiver of recognition, as well as a strange pang of nostalgia for the student I used to be, and to whom I still sometimes dream of returning again: “Tell me about a complicated man.”

Written by nevalalee

November 6, 2017 at 8:44 am

Writing with scissors

leave a comment »

Over the last few years, one of my great pleasures has been reading the articles on writing that John McPhee has been contributing on an annual basis to The New Yorker. I’ve written here about my reactions to McPhee’s advice on using the dictionary, on “greening” or cutting a piece by an arbitrary length, on structure, on frames of reference. Now his full book on the subject is here, Draft No. 4, and it’s arriving in my life at an opportune time. I’m wrapping up a draft of my own book, with two months to go before deadline, and I have a daunting set of tasks ahead of me—responding to editorial comments, preparing the notes and bibliography, wrestling the whole thing down to size. McPhee’s reasonable voice is a balm at such times, although he never minimizes the difficulty of the process itself, which he calls “masochistic, mind-fracturing self-enslaved labor,” even as he speaks of the writer’s “animal sense of being hunted.” And when you read Sam Anderson’s wonderful profile on McPhee in this week’s issue of The New York Times Magazine, it’s like listening to an old soldier who has been in combat so many times that everything that he says carries the weight of long experience. (Reading it, I was reminded a little of the film editor Walter Murch, whom McPhee resembles in certain ways—they look sort of alike, they’re both obsessed with structure, and they both seem to know everything. I was curious to see whether anyone else had made this connection, so I did a search for their names together on Google. Of the first five results, three were links from this blog.)

Anderson’s article offers us the portrait of a man who, at eighty-six, has done a better job than just about anyone else of organizing his own brain: “Each of those years seems to be filed away inside of him, loaded with information, ready to access.” I would have been equally pleased to learn that McPhee was as privately untidy as his writing is intricately patterned, but it makes sense that his interest in problems of structure—to which he returns endlessly—would manifest itself in his life and conversation. He’s interested in structure in the same way that the rest of us are interested in the lives of our own children. I never tire of hearing how writers deal with structural issues, and I find passages like the following almost pornographically fascinating:

The process is hellacious. McPhee gathers every single scrap of reporting on a given project—every interview, description, stray thought and research tidbit—and types all of it into his computer. He studies that data and comes up with organizing categories: themes, set pieces, characters and so on. Each category is assigned a code. To find the structure of a piece, McPhee makes an index card for each of his codes, sets them on a large table and arranges and rearranges the cards until the sequence seems right. Then he works back through his mass of assembled data, labeling each piece with the relevant code. On the computer, a program called “Structur” arranges these scraps into organized batches, and McPhee then works sequentially, batch by batch, converting all of it into prose. (In the old days, McPhee would manually type out his notes, photocopy them, cut up everything with scissors, and sort it all into coded envelopes. His first computer, he says, was “a five-thousand-dollar pair of scissors.”)
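Out of curiosity, here is roughly how I picture that batching step working in code, as a toy sketch of my own in Python rather than anything resembling McPhee’s actual Kedit setup or the real Structur program, with made-up codes standing in for his:

```python
# A toy sketch (not the real Structur): each scrap of reporting gets tagged
# with a structural code, and the scraps are then dealt out into one batch
# per code, sequenced in whatever order the index cards ended up on the table.
from collections import defaultdict

def batch_notes(scraps, card_order):
    """Group (code, text) scraps into batches, ordered by the card sequence."""
    batches = defaultdict(list)
    for code, text in scraps:
        batches[code].append(text)
    return [(code, batches[code]) for code in card_order if code in batches]

# Hypothetical scraps and card order, purely for illustration.
scraps = [
    ("RIVER", "Description of the water at dusk..."),
    ("GEOLOGY", "Interview notes with the hydrologist..."),
    ("RIVER", "Stray thought about the canoe..."),
]
for code, texts in batch_notes(scraps, card_order=["GEOLOGY", "RIVER"]):
    print(code, "->", len(texts), "scraps")
```

The writer still has to turn each batch into prose, but the machine handles the collation, which is the part McPhee once did with scissors and coded envelopes.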

Anderson writes: “[McPhee] is one of the world’s few remaining users of a program called Kedit, which he writes about, at great length, in Draft No. 4.” The phrase “at great length” excites me tremendously—I’m at a point in my life where I’d rather hear about a writer’s favorite software program than his or her inspirational  thoughts on creativity—and McPhee’s process doesn’t sound too far removed from the one that I’ve worked out for myself. As I read it, though, I found myself thinking in passing of what might be lost when you move from scissors to a computer. (Scissors appear in the toolboxes of many of the writers and artists I admire. In The Elements of Style, E.B. White advises: “Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.” In The Silent Clowns, Walter Kerr describes the narrative challenges of filmmaking in the early fifties and concludes: “The problem was solved, more or less, with a scissors.” And Paul Klee once wrote in his diary: “What I don’t like, I cut away with the scissors.”) But McPhee isn’t sentimental about the tools themselves. In Anderson’s profile, the New Yorker editor David Remnick, who took McPhee’s class at Princeton, recalls: “You were in the room with a craftsman of the art, rather than a scholar or critic—to the point where I remember him passing around the weird mechanical pencils he used to use.” Yet there’s no question in my mind that McPhee would drop that one brand of pencil if he found one that he thought was objectively better. As soon as he had Kedit, he got rid of the scissors. When you’re trying to rethink structure from the ground up, you don’t have much time for nostalgia.

And when McPhee explains the rationale behind his methods, you can hear the pragmatism of fifty years of hard experience:

If this sounds mechanical, its effect was absolutely the reverse. If the contents of the seventh folder were before me, the contents of twenty-nine other folders were out of sight. Every organizational aspect was behind me. The procedure eliminated nearly all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.

This amounts to an elaboration of what I’ve elsewhere called my favorite piece of writing advice, which David Mamet offers in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

Mamet might as well have come out of the same box as Walter Murch and McPhee, which implies that I have a definite type when it comes to looking for advice. And what they all have in common, besides the glasses and beard, is the air of having labored at a craft for decades, survived, and returned to tell the tale. Of the three, McPhee’s career may be the most enviable of all, if only because he spent it in Princeton, not Hollywood. It’s nice to be able to structure an essay. The tricky part is structuring a life.

Jokes against inanity

with 3 comments

Yesterday, Harvard University made headlines by withdrawing acceptances for ten high school students who had posted “sexually explicit memes and messages” on a private Facebook group. Here’s how The Crimson describes the situation:

A handful of admitted students formed the messaging group—titled, at one point, “Harvard memes for horny bourgeois teens”—on Facebook in late December…In the group, students sent each other memes and other images mocking sexual assault, the Holocaust, and the deaths of children, according to screenshots of the chat obtained by The Crimson. Some of the messages joked that abusing children was sexually arousing, while others had punchlines directed at specific ethnic or racial groups. One called the hypothetical hanging of a Mexican child “piñata time.”

Not surprisingly, the decision has been a divisive one, with critics of the college making the argument—which can’t be dismissed out of hand—that Harvard overreached in policing statements that were made essentially in private. But there’s another line of reasoning that I find increasingly hard to take seriously. The Washington Post quotes Erica Goldberg, an assistant professor at Ohio Northern Law School, who compares the humor in question to the party game Cards Against Humanity:

It’s an unabashedly irreverent game whose purpose is to be as cleverly offensive as possible. The game uses cards to create inappropriate associations, on topics we are generally not socially permitted to mock—such as AIDS, the Holocaust, and dead babies. Even many good liberals love the game, precisely because the humor is so wrong, so contrary to our values. There is something appealing about the freedom to be irreverent and dark.

I might have agreed with this once, but I don’t think I do anymore. The catalyst, oddly, was a passage in Jon Ronson’s otherwise very good book So You’ve Been Publicly Shamed, which was evidently intended to make the opposite point. Ronson discusses the notorious case of Justine Sacco, the public relations executive who inspired a torrent of online outrage after tweeting before a flight to Cape Town: “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!” Sacco then switched off her phone, which meant that she spent the next eleven hours oblivious to the fact that her life had effectively been ruined. Ronson writes of the firestorm:

I could understand why some people found it offensive. Read literally, she said that white people don’t get AIDS, but it seems doubtful many interpreted it that way. More likely it was her apparently gleeful flaunting of her privilege that angered people. But after thinking about her tweet for a few seconds more, I began to suspect that it wasn’t racist but a reflexive critique of white privilege—on our tendency to naïvely imagine ourselves immune from life’s horrors.

He concludes: “Justine’s crime had been a badly worded joke mocking privilege. To see the catastrophe as her fault felt, to me, a little like ‘Don’t wear short skirts.’ It felt like victim-blaming.” And there’s no question that Sacco, who was fired from her job, paid a disproportionately harsh price for her actions. But it also feels like an overstatement to repeatedly insist, as Ronson does, that Sacco “didn’t do anything wrong.” To say that her tweet was “a badly worded joke” implies that there was an alternative wording that would have made it funny and acceptable. I have trouble imagining one. And the implicit assumption that this was a problem of phrasing or context strikes me as the slipperiest slope of all.

This brings us to Cards Against Humanity, a kind of analog computer for generating offensive jokes, which, revealingly, often evokes the specter of “white privilege” to justify itself. When asked to explain its expansion pack “Ten Days or Whatever of Kwanzaa,” one of the game’s designers told the Daily Dot: “It’s a joke that we meant to poke fun at white privilege, ignorance, and laziness.” This amounts to a defense of the entire game, in which players theoretically interrogate their privilege by forcing one another to make what Goldberg calls “irreverent and dark jokes.” In the same article, Jaya Saxena neatly sums up the company’s position:

The Cards Against Humanity team is stalled in the middle of that narrative: understanding that there is a cultural hierarchy that disenfranchises people, making it clear they’re aware of the privilege they hold, attempting to use their humor to separate themselves from those who don’t get it, and apologizing for their mistakes when they’re called out.

This raises two related issues. One is whether this kind of scrutiny is, in fact, what most players of the game think they’re doing. The other is whether this activity is worthwhile. I would argue that the answer to both questions is “probably not.” This isn’t a matter of political correctness, but of a logical and comedic inconsistency—and, frankly, of “privilege, ignorance, and laziness”—in the sort of humor involved. Let’s say that you’ve made a “transgressive” joke of the type that got these prospective Harvard freshmen in trouble. Now imagine how you’d react if it had been said by Milo Yiannopoulos or posted as a meme on the alt-right. If it bothers you, then the only conclusion is that your identity as a progressive somehow justifies statements that would be horrifyingly racist in the mouth of someone of whom you disapprove. You can make the joke because you, as a “horny bourgeois teen,” know better.

This sounds a lot like privilege to me. I won’t say that it’s worse or more insidious than other forms of racism, but that doesn’t mean that it isn’t problematic, especially if you believe that transgressive humor is something to be celebrated. As Dan Brooks writes in an excellent essay in the New York Times Magazine: “The whole architecture of the game is designed to provide the thrill of transgression with none of the responsibility—to let players feel horrible, if you will, without feeling bad.” It’s a mechanical simulation of transgression, and, like bad art that allows for an emotional release that absolves the viewer from other kinds of empathy, it can numb us to the real thing, leaving us unable to make the distinction. Just because you were smart enough to get into Harvard—and believe me, I know—doesn’t make you Michael O’Donoghue. On that level, the college made the right call. It has the right to withdraw admission if “an admitted student engages in behavior that brings into question his or her honesty, maturity, or moral character,” and even if these teenagers share their assumptions with millions of other “good liberals,” that doesn’t make them any less wrong. Max Temkin, the primary creative force behind Cards Against Humanity, has impeccably progressive credentials and has done a lot of admirable things, but he has also said “We removed all of the ‘rape’ jokes from Cards Against Humanity years ago,” as if this were commendable in itself. They cull the cards that they’ve personally outgrown, as if objective standards of acceptability have changed simply because they’re no longer in their early twenties, and I’m not even sure if this strikes them as problematic. As a profile of the company in Fusion notes:

As part of their job, [the creators] periodically pull cards that seemed funny to college seniors in their parents’ basement, but are a little less funny now…Meanwhile some [cards], like “passable transvestites” and “date rape,” were pulled when the guys realized that kind of “humor” wasn’t actually very humorous.

The reference to “the guys” speaks volumes. But this kind of culling is something that we all do, as we leave behind our adolescent selves, and it has one inevitable conclusion. Speaking of the “passable transvestites” card, Temkin said: “It’s embarrassing to me that there was a time in my life that that was funny.” And for a lot of us, that includes the game as a whole.

The A/B Test

with 2 comments

In this week’s issue of The New York Times Magazine, there’s a profile of Mark Zuckerberg by Farhad Manjoo, who describes how the founder of Facebook is coming to terms with his role in the world in the aftermath of last year’s election. I find myself thinking about Zuckerberg a lot these days, arguably even more than I use Facebook itself. We just missed overlapping in college, and with one possible exception, which I’ll mention later, he’s the most influential figure to emerge from those ranks in the last two decades. Manjoo depicts him as an intensely private man obliged to walk a fine line in public, leading him to be absurdly cautious about what he says: “When I asked if he had chatted with Obama about the former president’s critique of Facebook, Zuckerberg paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Zuckerberg is trying to figure out what he believes—and how to act—under conditions of enormous scrutiny, but he also has more resources at his disposal than just about anyone else in history. Here’s the passage in the article that stuck with me the most:

The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition, or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth…This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?”

Reading this, I began to reflect on how rarely we actually test our intuitions. I’ve spoken a lot on this blog about the role of intuitive thinking in the arts and sciences, mostly because it doesn’t get the emphasis it deserves, but there’s also no guarantee that intuition will steer us in the right direction. The psychologist Daniel Kahneman has devoted his career to showing how we tend to overvalue our gut reactions, particularly if we’ve been fortunate enough to be right in the past, and the study of human irrationality has become a rich avenue of research in the social sciences, which are often undermined by poor hunches of their own. It may not even be a matter of right or wrong. An intuitive choice may be better or worse than the alternative, but for the most part, we’ll never know. One of the quirks of Silicon Valley culture is that it claims to base everything on raw data, but it’s often in the service of notions that are outlandish, untested, and easy to misrepresent. Facebook comes closer than any company in existence to the ideal of an endless A/B test, in which the user base is randomly divided into two or more groups to see which approaches are the most effective. It’s the best lab ever developed for testing our hunches about human behavior. (Most controversially, Facebook modified the news feeds of hundreds of thousands of users to adjust the number of positive or negative posts, in order to gauge the emotional impact, and it has conducted similar tests on voter turnout.) And it shouldn’t surprise us if many of our intuitions turn out to be mistaken. If anything, we should expect them to be right about half the time—and if we can nudge that percentage just a little bit upward, in theory, it should give us a significant competitive advantage.
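The mechanics themselves are simple enough to sketch. Here is a toy version in Python, a simulation of my own and obviously nothing like what Facebook actually runs, in which users are split at random between two variants and we simply compare how often each group takes some action:

```python
# A toy A/B test: assign each user at random to variant A or B, then compare
# the fraction of each group that "converts." The underlying rates are made up.
import random

def ab_test(n_users, rate_a, rate_b, seed=0):
    rng = random.Random(seed)
    results = {"A": [0, 0], "B": [0, 0]}  # [conversions, assignments]
    for _ in range(n_users):
        variant = rng.choice("AB")
        rate = rate_a if variant == "A" else rate_b
        results[variant][0] += rng.random() < rate
        results[variant][1] += 1
    return {v: conversions / total for v, (conversions, total) in results.items()}

# With a big enough sample, even a half-point difference shows up clearly.
print(ab_test(100_000, rate_a=0.050, rate_b=0.055))
```

Run it once and the hunch that variant B is better stops being a hunch; the data either bears it out or it doesn’t.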

So what good is intuition, anyway? I like to start with William Goldman’s story about the Broadway producer George Abbott, who once passed a choreographer holding his head in his hands while the dancers stood around doing nothing. When Abbott asked what was wrong, the choreographer said that he couldn’t figure out what to do next. Abbott shot back: “Well, have them do something! That way we’ll have something to change.” Intuition, as I’ve argued before, is mostly about taking you from zero ideas to one idea, which you can then start to refine. John W. Campbell makes much the same argument in what might be his single best editorial, “The Value of Panic,” which begins with a maxim from the Harvard professor Wayne Batteau: “In total ignorance, try anything. Then you won’t be so ignorant.” Campbell argues that this provides an evolutionary rationale for panic, in which an animal acts “in a manner entirely different from the normal behavior patterns of the organism.” He continues:

Given: An organism with N characteristic behavior modes available. Given: An environmental situation which cannot be solved by any of the N available behavior modes, but which must be solved immediately if the organism is to survive. Logical conclusion: The organism will inevitably die. But…if we introduce Panic, allowing the organism to generate a purely random behavior mode not a member of the N modes characteristically available?

Campbell concludes: “When the probability of survival is zero on the basis of all known factors—it’s time to throw in an unknown.” In extreme situations, the result is panic; under less intense circumstances, it’s a blind hunch. You can even see them as points on a spectrum, the purpose of which is to provide us with a random action or idea that can then be revised into something better, assuming that we survive for long enough. But sometimes the animal just gets eaten.
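Campbell’s logic translates almost directly into code. Here is a trivial sketch of the idea in Python, entirely my own toy illustration rather than anything Campbell wrote: try every known behavior mode first, and only when all of them fail, reach for something random outside the list.

```python
# Campbell's "value of panic," reduced to a toy rule: exhaust the N known
# behavior modes, and only when none of them works, pick a random action
# that is not one of those modes.
import random

def respond(situation, known_modes, all_actions, rng=None):
    rng = rng or random.Random()
    for action in known_modes:
        if action(situation):      # a characteristic mode solves the problem
            return action
    novel = [a for a in all_actions if a not in known_modes]
    return rng.choice(novel)       # panic: throw in an unknown
```

The random action is almost always worse than a good plan, but when the plan’s probability of success is zero, anything else is an improvement, which is Campbell’s point.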

The idea of refinement, revision, or testing is inseparable from intuition, and Zuckerberg has been granted the most powerful tool imaginable for asking hard questions and getting quantifiable answers. What he does with it is another matter entirely. But it’s also worth looking at his only peer from college who could conceivably challenge him in terms of global influence. On paper, Mark Zuckerberg and Jared Kushner have remarkable similarities. Both are young Jewish men—although Kushner is more observant—who were born less than four years and sixty miles apart. Kushner, whose acceptance to Harvard was so manifestly the result of his family’s wealth that it became a case study in a book on the subject, was a member of the final clubs that Zuckerberg badly wanted to join, or so Aaron Sorkin would have us believe. Both ended up as unlikely media magnates of a very different kind: Kushner, like Charles Foster Kane, took over a New York newspaper from a man named Carter. Yet their approaches to their newfound positions couldn’t be more different. Kushner has been called “a shadow secretary of state” whose portfolio includes Mexico, China, the Middle East, and the reorganization of the federal government, but it feels like one long improvisation, on the apparent assumption that he can wing it and succeed where so many others have failed. As Bruce Bartlett writes in the New York Times, without a staff, Kushner “is just a dilettante meddling in matters he lacks the depth or the resources to grasp,” and we may not have a chance to recover if his intuitions are wrong. In other words, he resembles his father-in-law, as Frank Bruni notes:

I’m told by insiders that when Trump’s long-shot campaign led to victory, he and Kushner became convinced not only that they’d tapped into something that everybody was missing about America, but that they’d tapped into something that everybody was missing about the two of them.

Zuckerberg and Kushner’s lives ran roughly in parallel for a long time, but now they’re diverging at a point at which they almost seem to be offering us two alternate versions of the future, like an A/B test with only one possible outcome. Neither is wholly positive, but that doesn’t make the choice any less stark. And if you think this sounds farfetched, bookmark this post, and read it again in about six years.

Cutty Sark and the semicolon

leave a comment »

Vladimir Nabokov

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 22, 2015.

In an interview that was first published in The Paris Review, the novelist Herbert Gold asked Vladimir Nabokov if an editor had ever offered him any useful advice. This is what Nabokov said in response:

By “editor” I suppose you mean proofreader. Among these I have known limpid creatures of limitless tact and tenderness who would discuss with me a semicolon as if it were a point of honor—which, indeed, a point of art often is. But I have also come across a few pompous avuncular brutes who would attempt to “make suggestions” which I countered with a thunderous “stet!”

I’ve always adored that thunderous stet, which tells us so much about Nabokov and his imperious resistance to being edited by anybody. Today, however, I’m more interested in the previous sentence. A semicolon, as Nabokov puts it, can indeed be a point of honor. Nabokov was perhaps the most painstaking of all modern writers, and it’s no surprise that the same perfectionism that produced such conceptual and structural marvels as Lolita and Pale Fire would filter down to the smallest details. But I imagine that even ordinary authors can relate to how a single punctuation mark in a manuscript can start to loom as large as the finger of God on the Sistine Chapel ceiling.

And there’s something about the semicolon that seems to inspire tussles between writers and their editors—or at least allows it to stand as a useful symbol of the battles that can occur during the editorial process. Here’s an excerpt from a piece by Charles McGrath in The New York Times Magazine about the relationship between Robert Caro, author of The Years of Lyndon Johnson, and his longtime editor Robert Gottlieb:

“You know that insane old expression, ‘The quality of his defect is the defect of his quality,’ or something like that?” Gottlieb asked me. “That’s really true of Bob. What makes him such a genius of research and reliability is that everything is of exactly the same importance to him. The smallest thing is as consequential as the biggest. A semicolon matters as much as, I don’t know, whether Johnson was gay. But unfortunately, when it comes to English, I have those tendencies, too, and we could go to war over a semicolon. That’s as important to me as who voted for what law.”

It’s possible that the semicolon keeps cropping up in such stories because its inherent ambiguity lends itself to disagreement. As Kurt Vonnegut once wrote: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.” And I’ve more or less eliminated semicolons from my own work for much the same reason.

Robert De Niro and Martin Scorsese on the set of Raging Bull

But the larger question here is why artists fixate on things that even the most attentive reader would pass over without noticing. On one level, you could take a fight over a semicolon as an illustration of the way that the creative act—in which the artist is immersed in the work for months on end—tends to turn molehills into mountains. Here’s one of my favorite stories about the making of Raging Bull:

One night, when the filmmakers were right up against the deadline to make their release date, they were working on a nothing little shot that takes place in a nightclub, where a minor character turns to the bartender and orders a Cutty Sark. “I can’t hear what he’s saying,” [Martin Scorsese] said. Fiddling ensued—extensive fiddling—without satisfying him. [Producer Irwin] Winkler, who was present, finally deemed one result good enough and pointed out that messengers were standing by to hand-carry release prints to the few theaters where the picture was about to premiere. At which point, Scorsese snapped. “I want my name taken off the picture,” he cried—which bespeaks his devotion to detail. It also bespeaks his exhaustion at the end of Raging Bull, not to mention the craziness that so often overtakes movies as they wind down. Needless to say, he was eventually placated. And you can more or less hear the line in the finished print.

And you could argue that this kind of microscopic attention is the only thing that can lead to a work that succeeds on the largest possible scale.

But there’s yet another story that gets closer to the truth. In Existential Errands, Norman Mailer describes a bad period in his life—shortly after he was jailed for stabbing his second wife Adele—in which he found himself descending into alcoholism and unable to work. His only source of consolation was the scraps of paper, “little crossed communications from some wistful outpost of my mind,” that he would find in his jacket pocket after a drunken night. Mailer writes of these poems:

I would go to work, however, on my scraps of paper. They were all I had for work. I would rewrite them carefully, printing in longhand and ink, and I would spend hours whenever there was time going over these little poems…And since I wasn’t doing anything else very well in those days, I worked the poems over every chance I had. Sometimes a working day would go by, and I might put a space between two lines and remove a word. Maybe I was mending.

Which just reminds us that a seemingly minuscule change can be the result of a prolonged confrontation with the work as a whole. You can’t obsess over a semicolon without immersing yourself in the words around it, and there are times when you need such a focal point to structure your engagement with the rest. It’s a little like what is called a lakshya in yoga: the tiny spot on the body or in the mind on which you concentrate while meditating. In practice, the lakshya can be anything or nothing, but without it, your attention tends to drift. In art, it can be a semicolon, a word, or a line about Cutty Sark. It may not be much in itself. But when you need to tether yourself to something, even a semicolon can be a lifeline.

The passages of power

with 2 comments

Robert Caro

Every year or so, I go back and read Charles McGrath’s profile of the biographer Robert Caro, which is one of my favorite features ever published in The New York Times Magazine. (My favorite piece of all, if I’m being honest with myself, is Stephen Rodrick’s account of the notorious meltdown by Lindsay Lohan during the filming of The Canyons—and you’d probably learn a lot about human nature by reading those two articles back to back.) Caro, who was recently honored with a lifetime achievement medal from the National Book Foundation, has long been a hero of mine, for reasons that I’ve described at length elsewhere. He’s eighty years old now, and for most of his career, he has remained obsessively focused on two men: the city planner Robert Moses and President Lyndon Johnson. Yet his real subject is the nature of power in America, a theme that he has kept at the forefront of his work even at his bleakest moments. McGrath writes:

At the lowest point during the writing of The Power Broker, when Caro had run out of money and was close to despair about being able to finish, [Caro’s wife Ina] sold their house in suburban Long Island, moved the family…to an apartment in the Bronx and took a job teaching school to keep him going. “That was a bad time, a very bad time,” Caro recalled.

In an interview published this week with Rachel Syme of Matter, Caro goes into greater detail about this bad period:

This book took seven years. And money played a big part of this. We didn’t have any savings. I was a reporter. And I thought it was only going to take a year, so I couldn’t quit. I got a contract for $5,000, they gave me $2,500 as an advance. So I was trying for half the year to keep my job and work on the book, but I wasn’t getting anywhere. And I heard about a grant for one year, and I got it. And I remember I told Ina, we are finally going to get to go to France. I thought, they are giving me this money for a year, and this outline is only going to take me nine months! Of course at the end of the year, I’d hardly started, and we were just broke.

So Ina sold the house, but that only gave us — this was before the real estate boom — $25,000 of profit. That was enough to live for a year in an apartment in the Bronx. And then I was just totally broke, and Ina went to work. Then I got hurt and couldn’t get out of bed for a long time. So she had to stop and do the research.

Caro adds that he still remembers the exact amount of their monthly rent in those years—$362.73—because they constantly worried how to pay it, and that Ina began to take a circuitous route on her walk home to avoid passing the butcher and dry cleaner to whom they owed money.

Robert Moses

And here’s the quote that sticks in my head the most: “But I must say, for several years, I had very little hope of finishing the book,” Caro says. “When I thought about the book, I didn’t feel powerful. I believed no one was going to read it. And I was just thinking I have to finish, but I don’t know how we are going to make it.” The italics are mine. The Power Broker was written by a man who felt all but powerless, doubted that he would ever finish it, and feared that nobody would read the result even if he did. (Caro recalls: “All this time, all I’m hearing is nobody is going to read a book on Robert Moses, including from my first editor. He said, ‘It’s a good book, but nobody is going to read it. You have to prepare yourselves for a very small printing.’”) For a writer, that kind of financial squeeze, as I know from personal experience, enforces a relentless logic. You can quantify the cost of every wasted day—and, even more terrifyingly, of every day spent on real work. Caro realized, for instance, that in order to give the reader a full picture of Moses’s impact on the city, he had to tell the story of one of the neighborhoods that were destroyed. The result, “One Mile,” is one of the most powerful chapters in the book, and probably the first that most readers remember. Caro knew that it would take him six months to research and write, and he didn’t have the money or time. But he did it anyway. When you take that kind of decision and multiply it by the scale of the book that he envisioned, you’ve got seven years spent in utter suspense over whether any of it would make a difference. It couldn’t have been less like the power that he was trying to describe.

But he pulled it off magnificently. And it’s an example that is worth remembering now more than ever. We’re entering an era in which the media will be forced to confront the rise to power of a man it resoundingly did not want to see in the White House. (Only two mainstream newspapers endorsed Donald Trump.) In the aftermath of the election, writers and journalists can’t be blamed for asking themselves how much power they really have, in the face of a post-truth environment in which fake news was just as influential as the real thing. The answer, honestly, is that they have almost none. Yet this might be the only position from which they can speak honestly about power itself. As Caro says to Matter: “You have to deal with the powerless. You couldn’t just write about how power works, you had to write about its effect on people who didn’t have power, both for good and for ill.” And it’s that invisible dynamic between the subject and its author—between one man of enormous power and another who could wield nothing but words—that made Caro’s work so enduring. The Power Broker is as massive a book as can be physically encompassed between two covers, and its scale, I think, was partly a result of Caro’s attempt to match himself to Moses. He might not have been able to build highways and bridges, but he could construct a book on the same scale. The result had a greater impact on Moses’s reputation than any of the monuments that he left to himself. It took Caro seven years, but he did it, starting from nothing. And we owe it to ourselves to do the same.

Written by nevalalee

November 18, 2016 at 8:16 am

The tragic life of Mitsui

with 2 comments

Leonardo Nam on Westworld

In the latest issue of The New York Times Magazine, the film critic Wesley Morris has a reflective piece titled “Last Taboo,” the subheadline of which reads: “Why Pop Culture Just Can’t Deal With Black Male Sexuality.” Morris, who is a gay black man, notes that full-frontal male nudity has become more common in recent years in movies and television, but it’s usually white men who are being undressed for the camera, which tells us a lot about the unresolved but highly charged feelings that the culture still has toward the black male body. As Morris writes:

Black men [are] desired on one hand and feared on the other…Here’s our original sin metastasized into a perverted sticking point: The white dick means nothing, while, whether out of revulsion or lust, the black dick means too much.

And although I don’t want to detract from the importance of the point that Morris is making here, I’ll admit that as I read these words, another thought ran through my mind. If the white penis means nothing, then the Asian penis, by extension, must mean—well, less than nothing. I don’t mean to equate the desexualization of Asian males in popular culture with the treatment of black men in fiction and in real life. But both seem to provide crucial data points, from opposite ends, for our understanding of the underlying phenomenon, which is how writers and other artists have historically treated the bodies of those who look different than they do.

I read Morris’s piece after seeing a tweet by the New Yorker critic Emily Nussbaum, who connected it to an awful scene in last night’s episode of Westworld, in which an otherwise likable character makes a joke about a well-endowed black robot. It’s a weirdly dissonant moment for a series that is so controlled in other respects, and it’s possible that it reflects nothing more than Jonathan Nolan’s clumsiness—which he shares with his older brother—whenever he makes a stab at humor. (I also suspect, given the show’s production delays, that the line was written and shot a long time ago, before these questions assumed a more prominent role in the cultural conversation. Which doesn’t make it any easier to figure out what the writers were thinking.) Race hasn’t played much of a role on the series so far, and it may not be fair to pass judgment on a show that has only aired five episodes and clearly has a lot of other stuff on its mind. But it’s hard not to wonder. The cast is diverse, but the guests are mostly white men, undoubtedly because, as Nussbaum notes elsewhere, they’re the natural target audience for the park’s central fantasy. And the show has a strange habit of using its Asian cast members, who are mostly just faces in the background, as verbal punching bags for the other characters, a trend so peculiar that my wife and I both noticed it separately. It’s likely that this has all been muddied by what seems to be shaping up to be an actual storyline for Felix, played by Leonardo Nam, who looks as if he’s about to respond to his casual mistreatment by rising to a larger role in the story. But even for a show with a lot of moving parts, it strikes me as a lazy way of prodding a character into action.

John Lone in Year of the Dragon

Over the last few months, as it happens, I’ve been thinking a lot about the representation of Asians in science fiction. (As I’ve mentioned before, I’m Eurasian—half Chinese, half Finnish and Estonian.) I may as well start with Robert A. Heinlein’s Sixth Column, a novel that he wrote on assignment for Astounding Science Fiction, based in part on All, an earlier, unpublished serial by John W. Campbell. Both stories, which were written long before Pearl Harbor, are about the invasion of the United States by a combined Chinese and Japanese empire, which inspires an underground resistance movement in the form of a fake religion. Heinlein later wrote that he tried to rework the narrative to tone down its more objectionable elements, but it pains me to say that Sixth Column actually reads as more racist than All, simply because Heinlein was the stronger writer. When you read All, you don’t feel much of anything, because Campbell was a stiff and awkward stylist. Heinlein, by contrast, spent much of his career bringing immense technical skill to even the most questionable projects, and he can’t keep from investing his characters with real rhetorical vigor as they talk about “flat-faced apes” and “our slant-eyed lords.” I don’t even mind the idea of an Asian menace, as long as the bad guys are treated as worthy antagonists, which Heinlein mostly does. But when the leaders of the resistance decide to grow beards in order to fill the invaders with “a feeling of womanly inferiority,” it’s hard to excuse it. And the most offensive moment of all involves Mitsui, the only sympathetic Asian character in sight, who sacrifices himself for the sake of his friends and is rewarded with the epitaph: “But they had no time to dwell on the end of little Mitsui’s tragic life.”

That’s the kind of racism that rankles me: not the diabolical Asian villain, who can be invested with a kind of sinister allure, as much as the legion of little Mitsuis who still populate so much of our fiction. (This may be why I’ve always sort of liked Michael Cimino’s indefensible Year of the Dragon, which at least treats John Lone’s character as a formidable, glamorous foe. It’s certainly less full of hate than The Deer Hunter.) And it complicates my reactions to other issues. When it was announced that Sulu would be unobtrusively presented as gay in Star Trek Beyond, it filled me with mixed feelings, and not just because George Takei didn’t seem to care for the idea. As much as I appreciated what the filmmakers were trying to do, I couldn’t help but think that it would have been just as innovative, if not more so, to depict Sulu as straight. I’m aware that this risks making it all seem like a zero-sum game, which it isn’t. But these points deserve to be raised, if only because they enrich the larger conversation. If a single scene on Westworld can spark a discussion of how we treat black men as sexual objects, we can do the same with the show’s treatment of Asians. The series presumably didn’t invite or expect such scrutiny, but it occupies a cultural position—as a prestige drama on a premium cable channel—in which it has no choice but to play that part. Science fiction, in particular, has always been a sandbox in which these issues can be investigated in ways that wouldn’t be possible in narratives set in the present, from the original run of Star Trek on down. Westworld belongs squarely in that tradition. And these are frontiers that it ought to explore.

The Coco Chanel rule

with 4 comments

Coco Chanel

“Before you leave the house,” the fashion designer Coco Chanel is supposed to have said, “look in the mirror and remove one accessory.” As much as I like it, I’m sorry to say that this quote is most likely apocryphal: you see it attributed to Chanel everywhere, but without the benefit of an original source, which implies that it’s one of those pieces of collective wisdom that have attached themselves parasitically to a famous name. Still, it’s valuable advice. It’s usually interpreted, correctly enough, as a reminder that less is more, but I prefer to think of it as a statement about revision. The quote isn’t about reaching simplicity from the ground up, but about taking something and improving it by subtracting one element, like the writing rule that advises you to cut ten percent from every draft. And what I like the most about it is that its moment of truth arrives at the very last second, when you’re about to leave the house. That final glance in the mirror, when it’s almost too late to make additional changes, is often when the true strengths and weaknesses of your decisions become clear, if you’re smart enough to distinguish it from the jitters. (As Jeffrey Eugenides said to The Paris Review: “Usually I’m turning the book in at the last minute. I always say it’s like the Greek Olympics—’Hope the torch lights.'”)

But which accessory should you remove? In the indispensable book Behind the Seen, the editor Walter Murch gives us an important clue, using an analogy from filmmaking:

An interior might have four different sources of light in it: the light from the window, the light from the table lamp, the light from the flashlight that the character is holding, and some other remotely sourced lights. The danger is that, without hardly trying, you can create a luminous clutter out of all that. There's a shadow over here, so you put another light on that shadow to make it disappear. Well, that new light casts a shadow in the other direction. Suddenly there are fifteen lights and you only want four.

As a cameraman what you paradoxically do is have the gaffer turn off the main light, because it is confusing your ability to really see what you’ve got. Once you do that, you selectively turn off some of the lights and see what’s left. And you discover that, “OK, those other three lights I really don’t need at all—kill ’em.” But it can also happen that you turn off the main light and suddenly, “Hey, this looks great! I don’t need that main light after all, just these secondary lights. What was I thinking?”

This principle, which Murch elsewhere calls “blinking the key,” implies that you should take away the most important piece, or the accessory that you thought you couldn’t live without.

Walter Murch

This squares nicely with a number of principles that I’ve discussed here before. I once said that ambiguity is best created out of a network of specifics with one crucial piece removed, and when you follow the Chanel rule, on a deeper level, the missing accessory is still present, even after you’ve taken it off. The remaining accessories were presumably chosen with it in mind, and they preserve its outlines, resulting in a kind of charged negative space that binds the rest together. This applies to writing, too. “The Cask of Amontillado” practically amounts to a manual on how to wall up a man alive, but Poe omits the one crucial detail—the reason for Montresor’s murderous hatred—that most writers would have provided up front, and the result is all the more powerful. Shakespeare consistently leaves out key explanatory details from his source material, which renders the behavior of his characters more mysterious, but no less concrete. And the mumblecore filmmaker Andrew Bujalski made a similar point a few years ago to The New York Times Magazine: “Write out the scene the way you hear it in your head. Then read it and find the parts where the characters are saying exactly what you want/need them to say for the sake of narrative clarity (e.g., ‘I’ve secretly loved you all along, but I’ve been too afraid to tell you.’) Cut that part out. See what’s left. You’re probably close.”

This is a piece of advice that many artists could stand to take to heart, especially if they’ve been blessed with an abundance of invention. I like Interstellar, for instance, but I have a hunch that it would have been an even stronger film if Christopher Nolan had made a few cuts. If he had removed Anne Hathaway’s speech on the power of love, for instance, the same point would have come across in the action, but more subtly, assuming that the rest of the story justified its inclusion in the first place. (Of course, every film that Nolan has ever made strives valiantly to strike a balance between action and exposition, and in this case, it stumbled a little in the wrong direction. Interstellar is so openly indebted to 2001 that I wish it had taken a cue from that movie’s script, in which Kubrick and Clarke made the right strategic choice by minimizing the human element wherever possible.) What makes the Chanel rule so powerful is that when you glance in the mirror on your way out the door, what catches your eye first is likely to be the largest, flashiest, or most obvious component, which often adds the most by its subtraction. It’s the accessory that explains too much, or draws attention to itself, rather than complementing the whole, and by removing it, we’re consciously saying no to what the mind initially suggests. As Chanel is often quoted as saying: “Elegance is refusal.” And she was right—even if it was really Diana Vreeland who said it. 

The story whisperer

with 5 comments

Storyboard for Aladdin

If you really want to learn how a story works, you should try telling it to a three-year-old. Over the last twelve months, as my daughter has begun to watch longer movies, I’ve developed a sideline business as a sort of simultaneous interpreter: I’ll sit next to her and offer a running commentary on the action, designed to keep her from getting restless and to preemptively answer her questions. If it’s a movie I’ve seen before, like My Neighbor Totoro, I don’t need to concentrate quite as intently, but on the handful of occasions when I’ve watched a movie with her for the first time in theaters—as we’ve done with The Peanuts Movie, The Good Dinosaur, Kung Fu Panda 3, and Zootopia—I’ve had to pay closer attention. What I whisper in her ear usually boils down to a basic description of a character’s emotions or objectives, if it isn’t already clear from action or dialogue: “He’s sad.” “She’s worried about her friend.” “He wants to find his family.” And I’ve come to realize that this amounts to a kind of reverse engineering. If a movie often originates in the form of beat sheets or storyboards that the filmmakers have to turn into fully realized scenes, by breaking down the action in terms that my daughter can understand, I’m simply rewinding that process back to the beginning.

And it’s taught me some surprising lessons about storytelling. It reminds me a little of a piece that ran last year in The New York Times Magazine about Rasha Ajalyaqeen, a former interpreter for the United Nations. Like Ajalyaqeen, I’m listening to a story and translating it into a different language in real time, and many of the tips that she shares apply equally well here: “Be invisible.” “Leave your opinions behind; your voice should reflect the speaker’s feelings.” “Forget pausing to find the right word.” And most of all:

Word-for-word translation can result in a nonsensical mess. Instead, break longer, complicated phrases into shorter units of single concepts. "A good translator does not interpret words; he interprets meaning," says Ajalyaqeen, who grew up in Syria. Be prepared to dive into sentences without knowing where they are going grammatically..."Sometimes you start and you don't know what your subject is—you're waiting for the verb."

“Waiting for the verb” is as good a way as any to describe what I often have to do with my daughter: I’m not sure where the scene is going, but I have to sustain her interest until the real action kicks in.

Storyboard for Aladdin

This is a valuable exercise, because it forces me to engage with the story entirely in the present tense. I’ve spoken here before of how a story can best be understood as a sequence of objectives, which is the approach that David Mamet articulates so beautifully in On Directing Film, the best book on storytelling I’ve ever read. In practice, though, it’s easy to forget this. When you’re the writer, you find yourself thinking in terms of the story’s overall shape, and even if you’re just the reader or a member of the audience, you often skip ahead to anticipate what comes next. When you’re trying to explain it to a three-year-old, there isn’t time for any of this—your only goal is to explicate what is happening on the screen right now. After you’ve done this for a dozen or more movies, you start to appreciate how this approximates how we subconsciously experience all stories, no matter how sophisticated they might be. A good movie or novel doesn’t just put one scene after another, like a series of beads on a string, but that’s how we absorb it, and it needs to be told with clarity on that simple sequential level if its larger patterns are going to have any meaning. Like a properly constructed improvisation, an engaging story comes down to a series of “Yes, and…” statements. And the fact that it also needs to be more doesn’t excuse it from its basic obligation to be clear and logical with each individual beat.

And talking your way through a movie like this—even if the three-year-old you're addressing is an imaginary one—can lead to unexpected insights into a story's strengths and weaknesses. I came away even more impressed by Zootopia because of how cleverly it grounds its complicated plot in a series of units that can be easily grasped: I don't think Beatrix was ever lost for more than a few seconds. And when I watched Aladdin with her this morning, I became uncomfortably aware of the golden thread of fakery that runs through the center of that story: it's a skillful script, but it hits its beats so emphatically that I was constantly aware of how it was manipulating us. (Compare this to Miyazaki's great movies, from Kiki's Delivery Service to Ponyo, which achieve their effects more subtly and mysteriously, while never being anything less than fascinating.) I've even found myself doing much the same thing when I'm watching a television show or reading a book on my own. When you try to see the story through a child's eyes, and to frame it in terms that would hold the attention of a preschooler, you quickly learn that it isn't a question of dumbing it down, but of raising it to an even greater level of sophistication, with the story conveyed with the clarity of a fairy tale. Anyone who thinks that this is easy has never tried to do it for real. And at every turn, you need to be asking yourself a toddler's favorite question: "Why?"

Written by nevalalee

March 29, 2016 at 9:22 am

Cutty Sark and the semicolon

with 2 comments

Vladimir Nabokov

In an interview that was first published in The Paris Review, the novelist Herbert Gold asked Vladimir Nabokov if an editor had ever offered him any useful advice. This is what Nabokov said in response:

By “editor” I suppose you mean proofreader. Among these I have known limpid creatures of limitless tact and tenderness who would discuss with me a semicolon as if it were a point of honor—which, indeed, a point of art often is. But I have also come across a few pompous avuncular brutes who would attempt to “make suggestions” which I countered with a thunderous “stet!”

I’ve always adored that thunderous stet, which tells us so much about Nabokov and his imperious resistance to being edited by anybody. Today, however, I’m more interested in the previous sentence. A semicolon, as Nabokov puts it, can indeed be a point of honor. Nabokov was perhaps the most painstaking of all modern writers, and it’s no surprise that the same perfectionism that produced such conceptual and structural marvels as Lolita and Pale Fire would filter down to the smallest details. But I imagine that most authors can relate to how a single punctuation mark in a manuscript can start to loom as large as the finger of God in the Sistine Chapel.

And there’s something about the semicolon that seems to inspire tussles between writers and their editors—or at least allows it to stand as a useful symbol of the battles that can occur during the editorial process. Here’s an excerpt from a piece by Charles McGrath in The New York Times Magazine about the relationship between Robert Caro, author of The Years of Lyndon Johnson, and his longtime editor Robert Gottlieb:

“You know that insane old expression, ‘The quality of his defect is the defect of his quality,’ or something like that?” Gottlieb asked me. “That’s really true of Bob. What makes him such a genius of research and reliability is that everything is of exactly the same importance to him. The smallest thing is as consequential as the biggest. A semicolon matters as much as, I don’t know, whether Johnson was gay. But unfortunately, when it comes to English, I have those tendencies, too, and we could go to war over a semicolon. That’s as important to me as who voted for what law.”

It’s possible that the semicolon keeps cropping up in such stories because its inherent ambiguity lends itself to disagreement. As Kurt Vonnegut once wrote: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.” And I’ve more or less eliminated semicolons from my own work for much the same reason.

Robert De Niro and Martin Scorsese on the set of Raging Bull

But the larger question here is why artists fixate on things that even the most attentive reader would pass over without noticing. On one level, you could take a fight over a semicolon as an illustration of the way that the creative act—in which the artist is immersed in the work for months on end—tends to turn molehills into mountains. Here's one of my favorite stories about the making of Raging Bull:

One night, when the filmmakers were right up against the deadline to make their release date, they were working on a nothing little shot that takes place in a nightclub, where a minor character turns to the bartender and orders a Cutty Sark. “I can’t hear what he’s saying,” [Martin Scorsese] said. Fiddling ensued—extensive fiddling—without satisfying him. [Producer Irwin] Winkler, who was present, finally deemed one result good enough and pointed out that messengers were standing by to hand-carry release prints to the few theaters where the picture was about to premiere. At which point, Scorsese snapped. “I want my name taken off the picture,” he cried—which bespeaks his devotion to detail. It also bespeaks his exhaustion at the end of Raging Bull, not to mention the craziness that so often overtakes movies as they wind down. Needless to say, he was eventually placated. And you can more or less hear the line in the finished print.

And you could argue that this kind of microscopic attention is the only thing that can lead to a work that succeeds on the largest possible scale.

But there's another story that gets closer to the truth. In Existential Errands, Norman Mailer describes a bad period in his life—shortly after he was jailed for stabbing his second wife Adele—in which he found himself descending into alcoholism and unable to work. His only source of consolation was the scraps of paper, "little crossed communications from some wistful outpost of my mind," that he would find in his jacket pocket after a drunken night. Mailer writes of these poems:

I would go to work, however, on my scraps of paper. They were all I had for work. I would rewrite them carefully, printing in longhand and ink, and I would spend hours whenever there was time going over these little poems…And since I wasn’t doing anything else very well in those days, I worked the poems over every chance I had. Sometimes a working day would go by, and I might put a space between two lines and remove a word. Maybe I was mending.

Which just reminds us that a seemingly minuscule change can be the result of a prolonged confrontation with the work as a whole. You can’t obsess over a semicolon without immersing yourself in the words around it, and there are times when you need such a focal point to structure your engagement with the rest. It’s a little like what is called a lakshya in yoga: the tiny spot on the body or in the mind on which you concentrate while meditating. In practice, the lakshya can be anything or nothing, but without it, your attention tends to drift. In art, it can be a semicolon, a word, or a line about Cutty Sark. It may not be much in itself. But when you need to tether yourself to something, even a semicolon can be a lifeline.

Thinking away from the notes

with 2 comments

Terry Gross

On a typical day, [Terry] Gross is at the office from 8:45 to 5:45...Gross will continue working at home, preparing for the next day's interview in the living room. She clarifies her thoughts first thing in the morning in the shower. That's when she asks herself: What do I care about? What in all of this research is meaningful? It's important to be away from her notes when she does this. She emerges from the shower with her "major destination points." Then she goes to her office and refers back to her notes—sheafs of facts; dog-eared, marked-up books—for the details. Then she does the interview. And then she is inundated by the other daily tasks of running a radio show. The next day, she does it all again.

Susan Burton, in The New York Times Magazine

Written by nevalalee

October 31, 2015 at 7:30 am

Sex and the single shark

with 4 comments

Jaws by Peter Benchley

A few weeks ago, I picked up a used copy of the original hardcover edition of Peter Benchley’s Jaws. It caught my eye in part because of the iconic cover art, designed by the legendary Paul Bacon, who passed away earlier this summer. Although the painting was redrawn for the paperback, which later became the basis for one of the great movie posters, it’s still a work of graphic genius, second only to Chip Kidd’s dust jacket design for Jurassic Park in the unexpected way it came to define an entire franchise. And upon leafing through the novel itself—I’m still only halfway through—I was struck by how much it differs, not just from its film adaptation, but from what we’ve come to expect from a modern thriller. There’s a lot of background material on the town of Amity, some engaging, some not, including an entire subplot about the mayor’s mob connections. Most stupefying of all is the huge amount of space devoted to a plot thread, which the movie omits entirely, about an affair between Chief Brody’s wife and Hooper, the oceanographer played in the film by Richard Dreyfuss. It takes up something like sixty uninterrupted pages right in the middle of the novel, and frankly, it’s terrible, complete with passages of awful, clinical, mid-seventies lovemaking as bad as anything from Irving Wallace, who wrote about sex, as one critic put it, as if he’d never had it himself. (A tip to writers: any passage that unblushingly includes the phrase “her genitals” probably doesn’t need to exist.)

Reading the section again today, it’s hard to shake a sense that it must have struck many readers at the time as about as pointless as it seems now. Benchley can be a fine writer elsewhere, but I’d like to think that a modern editor would have taken him firmly by the hand and advised him to cut the whole thing. In fact, the man who edited Jaws was Thomas Congdon, an editor at Doubleday whose clients would later include David Halberstam and Russell Baker, and his collaboration with Benchley has been documented in exceptional detail, thanks to a fascinating story that the journalist Ted Morgan wrote for The New York Times Magazine around the time of the book’s publication. Congdon commissioned the novel from Benchley before a single word of it had been written, and he worked closely with the author, starting at the outline phase, which is unusual in itself. And Congdon, unbelievably, is the one we have to thank for what I have no choice but to call, ahem, the Dreyfuss affair. As Morgan writes:

When Benchley wrote a sex scene between the police chief and his wife, Congdon’s sense of propriety was offended: “I don’t think there’s any place for wholesome married sex in this kind of book,” he wrote. Benchley obediently turned the wife into an adulteress, who has an affair with a young marine scientist. [Italics mine.]

The poster for Jaws

Still, for all I know, Congdon may have been right. It certainly didn’t hurt the novel: half of Morgan’s article is devoted to cataloging its massive sales figures and proceeds from subsidiary rights, and this is all before the movie came out. (The name “Steven Spielberg” never appears, and the only person mentioned from the film side is producer Richard Zanuck.) And while Jaws might seem like a genre unto itself, it has to be read in the context of seventies bestsellerdom, which was dominated by the likes of Wallace, Jacqueline Susann, and Harold Robbins, who spiced up every story with generous helpings of smut. You might even say that the movie version of Jaws, which spawned the modern blockbuster, marks a transitional moment in more ways than one: the only remotely erotic moment in the film is Susan Backlinie’s nude swim at the very beginning, followed by the unavoidable sexual overtones of the ensuing shark attack. Mass culture was moving into an era in which the adult obsessions of the seventies would give way to a fascination with hardware and special effects, calculated to appeal to a teenage male audience that would have found Ellen Brody’s midlife sexual awakening even less interesting than I did. The real love affair in the movie is between the audience and the shark, or, more precisely, between Spielberg’s camera and the shark’s elusive silhouette. Anything else would be superfluous.

As it happens, Jaws wasn’t the first major motion picture of that decade to shy away from sexual elements in the source material. Mario Puzo’s original novel of The Godfather goes on for page after page about Lucy Mancini, Sonny’s girlfriend, and in particular about an odd feature of her anatomy and its subsequent surgical correction. Francis Coppola found it about as weird as many readers undoubtedly did:

I started to read the book. I got only fifty pages into it. I thought, it's a popular, sensational novel, pretty cheap stuff. I got to the part about the singer supposedly modeled on Frank Sinatra and the girl Sonny Corleone liked so much because her vagina was enormous—remember that stuff in the book? It never showed up in the movie. Anyway, I said, "My God, what is this—The Carpetbaggers?" So I stopped reading and said, "Forget it."

Not every movie from that era shied away from the sexual elements—The Exorcist sure as hell didn’t—but it’s hard not to see the pattern here. As audiences changed, books that were written in part with an eye to the movie rights began to tone down the sex, then cut it altogether, knowing that it was unlikely to survive the adaptation anyway. Readers didn’t seem to miss it, either. And while I’d say that it was no great loss, I also wish that we had books and movies large enough to accommodate good sex in fiction, when necessary, along with more innocent thrills. Pop culture is a ship in which we’re all traveling together, and to get the range of stories we deserve, we’re going to need a bigger boat.

“Let’s get out of here!”

leave a comment »

The script of Django Unchained
Over the weekend, Virginia Heffernan of The New York Times Magazine published a short essay on the most versatile line of dialogue in movies: “Let’s get out of here!” She quotes examples from films ranging from Breakfast at Tiffany’s to Grease to Titanic, and she notes that Roger Ebert once “casually ranked” it as one of the most common lines in cinema, alongside “Look out!” and “Take this!” Heffernan doesn’t mention—or perhaps she was unaware—that the line’s apparent popularity is more than just a hunch, at least according to Guinness Film Facts and Feats, which states:

The most hackneyed line in movie scripts is “Let’s get outta here.” A survey of 150 American features of the period 1938-74 (revived on British television) showed that it was used at least once in 84 per cent of Hollywood productions and more than once in 17 percent.

And although this particular source is four decades out of date, I don’t doubt that an updated study would yield much the same result. A quick search on Subzin, which pulls in quotes from movie and television subtitles, reveals thousands of examples, including many instances from recent movies like Birdman, Fury, Lone Survivor, and Muppets Most Wanted.

Heffernan goes on to make the case, based on her readings of the scripts of this year’s Oscar nominees, that the line that resonates more with us now is “Stay.” It’s a little too anecdotal to be entirely convincing, and it smacks a bit of a Ctrl-F search. But I love the way she explains the appeal of the earlier phrase:

“Let’s get out of here” may be the five most productive monosyllables in American movies. It confers agency on whoever says it. It draws a line under what’s gone before. It propels action. It justifies a change of scene, no matter how abrupt. No wonder screenwriters can’t get enough of it.

In other words, it’s a kind of screenwriting multitool, a line that comes in handy in any number of situations. I’ve noted before that writers of all kinds are always on the lookout for reliable tricks, and “Let’s get out of here” might be the best of them all. It’s like “Supercalifragilisticexpialidocious,” but for real—something you can always say when you can’t think of anything else. If you’re writing a script or a story and you’re stuck on a line, you can have a character say those five little words, and more often than not, it’ll work.

Script for American Hustle

And what makes "Let's get out of here" so useful is that it has all the qualities that Heffernan notes—it confers agency, drives the story forward, and prompts a change of scene—while being uninflected enough to pass unremarked. Your average cliché is rendered useless, or of limited utility, once it becomes familiar enough to be noticeable, but "Let's get out of here" is to screenplay structure what a subordinating conjunction, like "in order that" or "as soon as," is to ordinary grammar. It's a connective that bridges two units of action, and it's so commonplace that we don't even hear it. Yet it still retains its power, in part because of the subtle way in which it differs from similar sentiments like "Let's leave" or "Let's go." As Heffernan says:

“Let’s get out of here” is our bold spin on the innocuous “Let’s leave,” sending a signal to the nervous system that we’re slipping the knot, and we’re doing it together. The offhand contempt in the phrase is what makes it so satisfying: When we’re getting out of here, we’re not going to some idealized destination. Who knows where we’re going, really? Anywhere but here.

Occasionally, screenwriters try to invent a new phrase that serves the same purpose, but the results aren't nearly as neat as the gold standard that has gradually evolved over time. There was a moment in the last decade when every other movie—Children of Men, The Hangover, Star Trek, Iron Man—seemed to include some version of the line "Walk with me." You can see why it might catch a screenwriter's eye: it's pithy, it provides a neat justification for a walk and talk, and the imperative form is all business, as if the character has too much on his mind to simply say "Let's take a walk." The only trouble is that it doesn't sound much like anything a real human being would say, unless he or she is mimicking a movie. It rings false, at least to me, and it always takes me out of the story for a second: I'm aware of the screenwriter straining just a tad too hard. And it isn't necessary. "Let's get out of here" is perfectly fine, and it works its magic without drawing attention to itself. (It also seems to appear more often in the movies themselves than in their original scripts, implying that it was improvised on the set, which only shows how intuitive it is.) So there's little point in tinkering with something that already works so beautifully: in movies, as in most kinds of storytelling, the only important thing is to get from here to there.

Written by nevalalee

February 24, 2015 at 10:07 am

The old pornographers

with 2 comments

Spaceways by John Cleve

“Dad typed swiftly and with great passion,” Chris Offutt writes in “My Dad, the Pornographer,” his compelling essay from last weekend’s New York Times Magazine. “In this fashion, he eventually wrote and published more than four hundred books. Two were science fiction and twenty-four were fantasy, written under his own name; the rest were pornography, using seventeen pseudonyms.” Offutt’s reflections on his father, who wrote predominantly under the name John Cleve, make for a gripping read, and he doesn’t shy away from some of the story’s darker elements. Yet he touches only briefly on a fascinating chapter in the history of American popular fiction. Cleve plunged into the world of paperback pornography and never left, but other writers in the middle third of the last century used it as a kind of paid apprenticeship, cranking out an anonymous novel a month, learning a few useful tricks, and moving on once they had established a name for themselves in other genres. And they included Dean Koontz, Lawrence Block, Donald E. Westlake, and Anne Rice, as well as undoubtedly many more who haven’t acknowledged their work in public.

These cheap little novels were the product of a unique cultural moment, spanning the forty years between the introduction of the paperback after World War II and the widespread availability of videocassettes in the early eighties, and we aren’t likely to see their like again—although the proliferation of erotic short stories for the Kindle and other formats points to a possible resurgence. And although in a few cases, like that of Marion Zimmer Bradley, those detours into erotica cast a troubling light on other aspects of the writer’s life, for many authors, it was just another job, in the way a contemporary novelist might dabble in corporate or technical writing. Block—whose first published book was a “lesbian novel,” mostly because he knew it was something he could sell—wrote erotic paperbacks at the rate of twelve or more a year, spending exactly two weeks on each one, including a weekend off. He says:

There was one time I well remember when, checking the pages at the end of the day’s work, I discovered that I’d written pages 31 through 45 but had somehow jumped in my page numbering from 38 to 40. Rather than renumber the pages, I simply sat down and wrote page 39 to fit. Since page 38 ended in the middle of a sentence, a sentence which then resumed on page 40, it took a little fancy footwork to slide page 39 in there, but the brash self-confidence of youth was evidently up to the challenge.

The Crusader by John Cleve

To be fair, it’s tempting to romanticize the life of a hack pornographer, as much as it is for any kind of pulp fiction: most of these novels are terrible, and even for writers with talent, the danger of creative burnout or cynicism was a real one. (Westlake describes this at length in his very funny novel Adios Scheherazade, a thinly veiled account of his own days in the smut mines.) But it’s hard not to feel at least a little nostalgic for a time when it was possible for writers to literally prostitute their talent, while hopefully emerging with some valuable experience. In Writing Popular Fiction, which dates from around the same period, Koontz outlines a few of the potential benefits:

For one thing, since virtually all Rough Sexy Novels are published under pen names, you can learn to polish your writing while getting paid for the pleasure, and have no fear of damaging your creative reputation. Also, because [erotica] puts absolutely no restrictions on the writer besides the requirement of regular sex scenes, one after the other, you can experiment with style, try stream of consciousness, present tense narrative, and other stylistic tricks, to learn if you can make them work.

Pornography and literary modernism have always had a curiously intimate relationship: Grove Press, which published Cleve’s most successful titles, was both a landmark defender of free speech in fiction—it released the first American editions of Lady Chatterley’s Lover and Tropic of Cancer—and a prolific producer of paperback smut. Erotica, both in written form and elsewhere, has often been used to sneak in subversive material that would be rejected outright in more conventional works. For evidence, we need look no further than the career of Russ Meyer, or even of Robert Anton Wilson, who tells the following story about The Book of the Breast:

This book, frankly, got written originally because an editor at Playboy Press asked me if I could write a whole book on the female breast. “Sure,” I said at once. I would have said the same if he had asked me if I could write a book on the bull elephant’s toenails. I was broke that month and would have tried to write anything, if somebody would pay me for it. When I got the contract and the first half of the advance money, I sat down and asked myself what the hell I would put in the damned book.

Wilson eventually ended up structuring the book around an introduction to Taoist philosophy, “to keep myself amused, and thereby speed the writing so I could get the second half of the advance quickly.” Regardless of how we feel about the result, that’s a sentiment that all working writers can cheer. These days, a novelist in need of a paycheck is more likely to go into public relations. But I don’t know if that makes us any better off.

Tying the knot

leave a comment »

Bowline

McKenzie Funk’s recent piece in The New York Times Magazine on the wreck of the Kulluk, the doomed oil rig sent by Shell to drill an exploratory well in the Arctic Sea, is one of the most riveting stories I’ve read in a long time. The whole thing is full of twists and turns—I devoured it in a single sitting—but my favorite moment involves a simple knot. Faced with a rig with a broken emergency line, Craig Matthews, the chief engineer of the tugboat Alert, came up with a plan: they’d get close enough to grab the line with a grappling hook, reel it in, and tie it to their own tow cable with a bowline. After two tries, they managed to snag the line, “thicker than a man’s arm, a soggy dead weight.” Funk describes what happened next:

Now Matthews tried to orient himself. A knot he could normally tie with one hand without looking would have to be tackled by two people, chunk by chunk. The chief mate helped him lift the line again, and together they hurriedly bent it and forced the rabbit through the hole…Matthews had planned to do a second bend, just in case, but he was exhausted. “Is that it?” Matthews recalls the chief mate asking. His answer was to let the towline slide over the edge.

And although plenty of other things would go wrong later on, the knot held throughout all that followed.

The story caught my eye because it reminded me, as almost everything does these days, of the creative process. When you’re a writer, you generally hone your craft on smaller projects, short stories or essays that you can practically hold in one hand. Early on, it’s like learning to tie a bowline for the first time—as Brody does in Jaws—and it can be hard to even keep the ends straight, but sooner or later, you internalize the steps to the point where you can take them for granted. As soon as you tackle a larger project, though, you find that you suddenly need to stop and remember everything you thought you knew by heart. Most of us don’t think twice about how to tie our shoelaces, but if we were told to make the same knot in a rope the thickness of a fire hose, we’d have to think hard. A change in scale forces us to relearn the most basic tricks, and at first, we feel almost comically clumsy. That’s all the more true of a collaborative effort, like making a movie or staging a play, which can often feel like two people tying a knot together while being buffeted by wind and waves. (Technically, a novel or play is more like one big knot made up of many other knots, but maybe this analogy is strained enough as it is.)

Alexander Cutting the Gordian Knot by Fedele Fischetti

Knots have long fascinated novelists, like Annie Proulx, perhaps because they’re the purest example of an abstract design intended to perform a functional task. As Buckminster Fuller points out in Synergetics, you can tie a loose knot in a rope spliced together successively from distinct kinds of fiber, like manila and cotton, and slip it from one end to the other: the materials change, but the knot stays the same. Fuller concludes, in his marvelously explicit and tangled prose: “The knot is not the rope; it is a weightless, mathematical, geometrical, metaphysically conceptual, pattern integrity tied momentarily into the rope by the knot-conceiving, weightless mind of the human conceiver—knot-former.” That’s true of a novel, too. You can, and sometimes do, revise every sentence of a story into a different form while leaving the overall pattern the same. Knots themselves can be used to transmit information, as in the Incan quipu, which record numbers and even syllables in the form of knotted cords. And Robert Graves has suggested that the Gordian knot encoded the name of a Phrygian god, which could only be untied by reading the message one letter at a time. Alexander the Great simply cut it with his sword—a solution that has occurred to more than one novelist frustrated by the mess he’s created.

You could write an entire essay on the metaphors inherent in knots, the language of which itself is rich with analogies: the phrase “the bitter end,” for instance, originates in ropeworking, referring to the end of the rope that is tied off. Most memorable is Fuller’s own suggestion that the ropeworker himself is a kind of thinking knot:

The metabolic flow that passes through a man is not the man. He is an abstract pattern integrity that is sustained through all his physical changes and processing, a knot through which pass the swift strands of concurrent ecological cycles—recycling transformations of solar energy.

And if we’re all simply knots passing through time, the prospect of untying and redoing that pattern is far more daunting than doing the same for even the most complicated novel. We’re all a little like Matthews on the deck of the Alert: we’d like to do a second bend to be safe, but sometimes we have no choice but to let the line slip over the side. That’s true of the small things, like sending out a story when we might prefer to noodle over it forever, and the large, like choosing a shape for your life that you hope will get the job done. And all we can really do is tie the knot we know best, let it go, and hang on to the bitter end.

Written by nevalalee

January 6, 2015 at 9:58 am

The power of the page

leave a comment »

Laura Hillenbrand

Over the weekend, I found myself contemplating two very different figures from the history of American letters. The first is the bestselling nonfiction author Laura Hillenbrand, whose lifelong struggle with chronic fatigue syndrome compelled her to research and write Seabiscuit and Unbroken while remaining largely confined to her house for the last quarter of a century. (Wil S. Hylton's piece on Hillenbrand in The New York Times Magazine is absolutely worth a read—it's the best author profile I've seen in a long time.) The other is the inventor and engineer Buckminster Fuller, whose life was as itinerant as Hillenbrand's is stationary. There's a page in E.J. Applewhite's Cosmic Fishing, his genial look at his collaboration with Fuller on the magnum opus Synergetics, that simply reprints Fuller's travel schedule for a representative two weeks in March: he flies from Philadelphia to Denver to Minneapolis to Miami to Washington to Harrisburg to Toronto, attending conferences and giving talks, to the point where it's hard to see how he found time to get anything else done. Writing a coherent book, in particular, seemed like the least of his concerns; as Applewhite notes, Fuller's natural element was the seminar, which allowed him to spin complicated webs of ideas in real time for appreciative listeners, and one of the greatest challenges of producing Synergetics lay in harnessing that energy in a form that could be contained within two covers.

At first glance, Hillenbrand and Fuller might seem to have nothing in common. One is a meticulous journalist, historian, and storyteller; the other a prodigy of worldly activity who was often reluctant to put his ideas down in any systematic way. But if they meet anywhere, it's on the printed page—and I mean this literally. Hylton's profile of Hillenbrand is full of fascinating details, but my favorite passage describes how her constant vertigo has left her unable to study works on microfilm. Instead, she buys and reads original newspapers, which, in turn, has influenced the kinds of stories she tells:

Hillenbrand told me that when the newspaper arrived, she found herself engrossed in the trivia of the period—the classified ads, the gossip page, the size and tone of headlines. Because she was not hunched over a microfilm viewer in the shimmering fluorescent basement of a research library, she was free to let her eye linger on obscure details.

There are shades here of Nicholson Baker, who became so concerned over the destruction of library archives of vintage newspapers that he bought a literal ton of them with his life savings, and ended up writing an entire book, the controversial Human Smoke, based on his experience of reading press coverage of the events leading up to World War II day by day. And the serendipity that these old papers afforded was central to Hillenbrand's career: she first stumbled across the story of Louie Zamperini, the subject of Unbroken, on the opposite side of a clipping she was reading about Seabiscuit.

Buckminster Fuller

Fuller was similarly energized by the act of encountering ideas in printed form, with the significant difference that the words, in this case, were his own. Applewhite devotes a full chapter to Fuller’s wholesale revision of Synergetics after the printed galleys—the nearly finished proofs of the typeset book itself—had been delivered by their publisher. Authors aren’t supposed to make extensive rewrites in the galley stage; it’s so expensive to reset the text that writers pay for any major changes out of their own pockets. But Fuller enthusiastically went to town, reworking entire sections of the book in the margins, at a personal cost of something like $3,500 in 1975 dollars. And Applewhite’s explanation for this impulse is what caught my eye:

Galleys galvanize Fuller partly because of the large visual component of his imagination. The effect is reflexive: his imagination is triggered by what the eye frames in front of him. It was the same with manuscript pages: he never liked to turn them over or continue to another sheet. Page = unit of thought. So his mind was retriggered with every galley and its quite arbitrary increment of thought from the composing process.

The key word here is “quite arbitrary.” A sequence of pages—whether in a newspaper or in a galley proof—is an arbitrary grid laid on a sequence of ideas. Where the page break falls, or what ends up on the opposite side, is largely a matter of chance. And for both Fuller and Hillenbrand, the physical page itself becomes a carrier of information. It’s serendipitous, random, but no less real.

And it makes me reflect on what we give up when pages, as tangible objects, pass out of our lives. We talk casually about “web pages,” but they aren’t quite the same thing: now that many websites, including this one, offer visitors an infinite scroll, the effect is less like reading a book than like navigating the spool of paper that Kerouac used to write On the Road. Occasionally, a web page’s endlessness can be turned into a message in itself, as in the Clickhole blog post “The Time I Spent on a Commercial Whaling Ship Totally Changed My Perspective on the World,” which turns out to contain the full text of Moby-Dick. More often, though, we end up with a wall of text that destroys any possibility of accidental juxtaposition or structure. I’m not advocating a return to the practice of arbitrarily dividing up long articles into multiple pages, which is usually just an excuse to generate additional clicks. But the primacy of the page—with its arbitrary slice or junction of content—reminds us of why it’s still sometimes best to browse through a physical newspaper or magazine, or to look at your own work in printed form. At a time when we all have access to the same world of information, something as trivial as a page break or an accidental pairing of ideas can be the source of insights that have occurred to no one else. And the first step might be as simple as looking at something on paper.

The Ian Malcolm rule

with one comment

Jeff Goldblum in Jurassic Park

A man is rich in proportion to the number of things he can afford to leave alone.

—Henry David Thoreau, Walden

Last week, at the inaugural town hall meeting at Facebook headquarters, one brave questioner managed to cut through the noise and press Mark Zuckerberg on the one issue that really matters: what’s the deal with that gray shirt he always wears? Zuckerberg replied:

I really want to clear my life to make it so I have to make as few decisions as possible about anything except how to best serve this community...I'm in this really lucky position where I get to wake up every day and help serve more than a billion people. And I feel like I'm not doing my job if I spend any of my energy on things that are silly or frivolous about my life...So even though it kind of sounds silly—that that's my reason for wearing a gray t-shirt every day—it also is true.

There's a surprising amount to unpack here, starting with the fact, as Allison P. Davis of New York Magazine points out, that it's considerably easier for a young white male to always wear the same clothes than it is for a woman in the same situation. It's also worth noting that wearing the exact same shirt each day turns simplicity into a kind of ostentation: there are ways of minimizing the amount of time you spend thinking about your wardrobe without calling attention to it so insistently.

Of course, Zuckerberg is only the latest in a long line of high-achieving nerds who insist, rightly or wrongly, that they have more important things to think about than what they’re going to wear. There’s more than an echo here of the dozens of black Issey Miyake turtlenecks that were stacked in Steve Jobs’s closet, and in the article linked above, Vanessa Friedman of The New York Times also notes that Zuckerberg sounds a little like Obama, who told Michael Lewis in Vanity Fair: “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.” Even Christopher Nolan gets into the act, as we learn in the recent New York Times Magazine profile by Gideon Lewis-Kraus:

Nolan’s own look accords with his strict regimen of optimal resource allocation and flexibility: He long ago decided it was a waste of energy to choose anew what to wear each day, and the clubbable but muted uniform on which he settled splits the difference between the demands of an executive suite and a tundra. The ensemble is smart with a hint of frowzy, a dark, narrow-lapeled jacket over a blue dress shirt with a lightly fraying collar, plus durable black trousers over scuffed, sensible shoes.

Mark Zuckerberg

If you were to draw a family tree between all these monochromatic Vulcans, you’d find that, consciously or not, they’re all echoing their common patron saint, Ian Malcolm in Jurassic Park, who says:

In any case, I wear only two colors, black and gray…These colors are appropriate for any occasion…and they go well together, should I mistakenly put on a pair of gray socks with my black trousers…I find it liberating. I believe my life has value, and I don’t want to waste it thinking about clothing.

As Malcolm speaks, Crichton writes, “Ellie was staring at him, her mouth open”—apparently stunned into silence, as all women would be, at this display of superhuman rationality. And while it’s easy to make fun of it, I’m basically one of those guys. I eat the same breakfast and lunch every day; my daily uniform of polo shirt, jeans, and New Balance sneakers rarely, if ever, changes; and I’ve had the same haircut for the last eighteen years. If pressed, I’d probably offer a rationale more or less identical to the ones given above. As a writer, I’m called upon to solve a series of agonizingly specific problems each time I sit down at my desk, so the less headspace I devote to everything else, the better.

Which is all well and good. But it’s also easy to confuse the externals with their underlying intention. The world, or at least the Bay Area, is full of young guys with the Zuckerberg look, but it doesn’t matter how little time you spend getting dressed if you aren’t mindfully reallocating the time you save, or extending the principle beyond the closet. The most eloquent defense of minimizing extraneous thinking was mounted by the philosopher Alfred North Whitehead, who writes:

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

Whitehead isn’t talking about his shirts here; he’s talking about the Arabic number system, a form of “good notation” that frees the mind to think about more complicated problems. Which only reminds us that the shirts you wear won’t make you more effective if you aren’t being equally thoughtful about the decisions that really count. Otherwise, they’re only an excuse for laziness or indifference, which is just as contagious as efficiency. And it often comes to us as a wolf in nerd’s clothing.

The running man

leave a comment »

Tom Cruise in Jack Reacher

“There is a major but very difficult realization that needs to be reached about [Cary] Grant—difficult, that is, for many people who like to think they take the art of film seriously,” David Thomson writes in The New Biographical Dictionary of Film, before going on to make a persuasive argument that Grant “was the best and most important actor in the history of the cinema.” There’s a similarly difficult realization that needs to be reached about Tom Cruise, which is that for better or worse, over the last quarter of a century, he’s been the best movie star we have, and one of the best we’ve ever had. Not the best actor, certainly, or even the one, like Clooney, who most embodies our ideas of what a star should be, but simply the one who gave us the most good reasons to go to the movies for more than twenty years. I love film deeply, and I’ve thought about it more than any sane person probably should, and I have no trouble confessing that for most of my adult life, Cruise and his movies have given me more pleasure than the work of any other actor or director.

And yet it wasn’t until I realized that I loved his movies that I really started to take notice of him in his own right. We’re usually drawn to stars because of the qualities they embody, but in Cruise’s case, I became a fan—and remain a huge one—because I belatedly noticed that whenever I bought a ticket to a movie with his name above the title, I generally had a hell of a good time. That hasn’t always been true in recent years, and while some might say that his movies have taken a hit because Cruise’s own public image has been tarnished, I’d argue that the causal arrow runs the other way. Cruise has always functioned less as a traditional movie star than as a sort of seal of quality: a guarantee that we’ll be treated to a film that provides everything that the money, talent, and resources of a major studio can deliver. As a result, whenever the movies in which he appears become less interesting, Cruise himself grows less attractive. Left to his own devices, he can’t rescue Lions for Lambs or Knight and Day, but if he gives us a big, impersonal toy like Mission: Impossible—Ghost Protocol, all is forgiven.

Tom Cruise in Mission: Impossible—Ghost Protocol

It's worth emphasizing how strange this is. We tend to think of movie stars as supernatural beings who can elevate mediocre material by their mere presence, but Cruise is more of a handsome, professional void, a running man around whom good to great movies have assembled themselves with remarkable consistency. In fact, he's really a great producer and packager of talent who happens to occupy the body of a star who can also get movies made. Hollywood consists of many ascending circles of power, in which each level has more of it than the one below, but when judged by its only real measure—the ability to give a film a green light—true power has traditionally resided with a handful of major stars. What sets Cruise apart from the rest is that he's used his stardom to work with many of the great filmmakers of his time (Kubrick, Scorsese, Spielberg, Coppola, Mann, Stone, De Palma, Anderson) and a host of inspired journeymen, and he's been largely responsible for the ascent of such talents as J.J. Abrams and Brad Bird. If this sort of thing were easy, we'd see it more often. And the fact that he did it for more than two decades speaks volumes about his intelligence, shrewdness, and ambition.

Recently, he’s faltered a bit, but his choices, good or bad, are still fascinating, especially as his aura continues to enrich his material with memories of his earlier roles, a process that goes at least as far back as Eyes Wide Shut. I haven’t seen Oblivion, but over the weekend, I caught Jack Reacher, a nifty but profoundly odd and implausible genre movie that runs off Cruise like a battery. (It’s actually much more of a star vehicle than Ghost Protocol, in which Cruise himself tended to get lost among all the wonders on display.) While most leading men strive to make it all seem easy, much of the appeal of watching Cruise lies in how hard this boy wonder of fifty seems to push himself in every frame, as if he still has everything to prove. Other stars may embody wit, cool, elegance, or masculinity, but Cruise is the emblem of the man who wills himself into existence, both on and off the screen, and sustains the world around him through sheer focus and energy. Real or not, it’s a seductive vision, or illusion, for those of us blessed with less certainty. As Taffy Brodesser-Akner says this week in The New York Times Magazine: “Who has ever worked so hard for our pleasure?”
