Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘The New York Times Magazine’

The purity test


Earlier this week, The New York Times Magazine published a profile by Taffy Brodesser-Akner of the novelist Jonathan Franzen. It’s full of fascinating moments, including a remarkable one that seems to have happened entirely by accident—the reporter was in the room when Franzen received a pair of phone calls, one of them from Daniel Craig, informing him that production had halted on the television adaptation of his novel Purity. Brodesser-Akner writes: “Franzen sat down and blinked a few times.” That sounds about right to me. And the paragraph that follows gets at something crucial about the writing life, in which the necessity of solitary work clashes with the pressure to put its fruits at the mercy of the market:

He should have known. He should have known that the bigger the production—the more people you involve, the more hands the thing goes through—the more likely that it will never see the light of day resembling the thing you set out to make in the first place. That’s the real problem with adaptation, even once you decide you’re all in. It just involves too many people. When he writes a book, he makes sure it’s intact from his original vision of it. He sends it to his editor, and he either makes the changes that are suggested or he doesn’t. The thing that we then see on shelves is exactly the thing he set out to make. That might be the only way to do this. Yes, writing a novel—you alone in a room with your own thoughts—might be the only way to get a maximal kind of satisfaction from your creative efforts. All the other ways can break your heart.

To be fair, Franzen’s status is an unusual one, and even successful novelists aren’t always in the position of taking for granted the publication of “exactly the thing he set out to make.” (In practice, it’s close to all or nothing. In my experience, the novel that you see on store shelves mostly reflects what the writer wanted, while the ones in which that vision clashes with those of other stakeholders in the process generally don’t get published at all.) And I don’t think I’m alone when I say that some of the most interesting details that Brodesser-Akner provides are financial. A certain decorum still surrounds the reporting of sales figures in the literary world, so there’s an undeniable frisson in seeing them laid out like this:

And, well, sales of his novels have decreased since The Corrections was published in 2001. That book, about a Midwestern family enduring personal crises, has sold 1.6 million copies to date. Freedom, which was called a “masterpiece” in the first paragraph of its New York Times review, has sold 1.15 million since it was published in 2010. And 2015’s Purity, his novel about a young woman’s search for her father and the story of that father and the people he knew, has sold only 255,476.

For most writers, selling a quarter of a million copies of any book would exceed their wildest dreams. Having written one of the greatest outliers of the last twenty years, Franzen is simply reverting to a very exalted mean. But there’s still a lot to unpack here.

For one thing, while Purity was a commercial disappointment, it doesn’t seem to have been an unambiguous disaster. According to Publishers Weekly, its first printing—which is where you can see a publisher calibrating its expectations—came to around 350,000 copies, which wasn’t even the largest print run for that month. (That honor went to David Lagercrantz’s The Girl in the Spider’s Web, which had half a million copies, while a new novel by the likes of John Grisham can run to over a million.) I don’t know what Franzen was paid in advance, but the loss must have fallen well short of the one on a book like Tom Wolfe’s Back to Blood, for which Wolfe received $7 million and sold 62,000 copies, meaning that his publisher paid over a hundred dollars for every copy that someone actually bought. And any financial hit would have been modest compared to the prestige of keeping a major novelist on one’s list, which is unquantifiable, but no less real. If there’s one thing that I’ve learned about publishing over the last decade, it’s that it’s a lot like the movie industry, in which apparently inexplicable commercial and marketing decisions are easier to understand when you consider their true audience. In many cases, when they buy or pass on a book, editors aren’t making decisions for readers, but for other editors, and they’re very conscious of what everyone in their imprint thinks. A readership is an abstraction, except when quantified in sales, but editors have their judgment calls reflected back on them by the people they see every day. Giving up a writer like Franzen might make financial sense, but it would be devastating to Farrar, Straus and Giroux, to say nothing of the relationship that can grow between an editor and a prized author over time.

You find much the same dynamic in Hollywood, in which some decisions are utterly inexplicable until you see them as a manifestation of office politics. In theory, a film is made for moviegoers, but the reactions of the producer down the hall are far more concrete. The difference between publishing and the movies is that the latter publish their box office returns, often in real time, while book sales remain opaque even at the highest level. And it’s interesting to wonder how both industries might differ if their approaches were more similar. After years of work, the success of a movie can be determined by the Saturday morning after its release, while a book usually has a little more time. (The exception is when a highly anticipated title doesn’t make it onto the New York Times bestseller list, or falls off it with alarming speed. The list doesn’t disclose any sales figures, which means that success is relative, not absolute—and which may be a small part of the reason why writers seldom wish one another well.) In the absence of hard sales, writers establish the pecking order with awards, reviews, and the other signifiers that have allowed Franzen to assume what Brodesser-Akner calls the mantle of “the White Male Great American Literary Novelist.” But the real takeaway is how narrow a slice of the world this reflects. Even if we place the most generous interpretation imaginable onto Franzen’s numbers, it’s likely that well under one percent of the American population has bought or read any of his books. You’ll find roughly the same number on any given weeknight playing HQ Trivia. If we acknowledged this more widely, it might free writers to return to their proper cultural position, in which the difference between a bestseller and a disappointment fades rightly into irrelevance. Who knows? They might even be happier.

Written by nevalalee

June 28, 2018 at 7:49 am


The homecoming king


In my last year at college, I spent an inordinate amount of time trying to figure out how to come back from the dead. I had decided to write my senior thesis about Amphiaraus, an obscure figure from Greek literature best known for a brief appearance in the eighth Pythian ode of Pindar. (When you’re majoring in a field that has been generating articles, term papers, and dissertations with monotonous regularity for centuries, you take your subjects wherever you can find them.) Amphiaraus was the legendary king of Argos, proverbial for his wisdom, who joined the doomed assault of the Seven Against Thebes, although he knew that it would end in tragedy. Because he was beloved by the gods, at the moment that he was about to die in battle, the earth opened up beneath him, swallowing him whole. Much of my thesis was devoted to describing his afterlife as an object of cult veneration, where he appears to have persisted as a chthonic oracle, delivering dreams to pilgrims at his sanctuary as they slept on the ground. He also occasionally returned in person, at least in literature—in Pindar’s ode, he’s evidently some kind of ghost or revenant, since he appears in a speaking role at a point in the narrative at which he should have been long dead. This is striking in itself, because in the ancient Greek conception of the underworld, most men and women survive only as shades, shadowy figures without any trace of memory or personality. In technical terms, when we die, we lose our noos, which can roughly be regarded as the part of the soul responsible for conscious thought. And the remarkable thing about Amphiaraus is that he seems to retain his noos even after his death, as an oracular hero who remains fully aware and capable of returning to our world when necessary.

As I tried to define what made Amphiaraus special, I went down a linguistic rabbit hole in which I was perhaps overly influenced by a curious book titled The Myth of Return in Early Greek Epic. Its argument, presented by the linguist Douglas Frame, is that the word noos, or “mind,” is connected to nostos, or “return,” the central theme of the Odyssey. (It’s where we get the word “nostalgia,” which combines nostos with algos, or “pain.”) The quality that allows Odysseus to make his way home to Ithaca is his intelligence—which, by extension, is also the attribute that enables Amphiaraus to return from the dead. A rumor of this theory somehow reached John Updike, of all people, who wrote a story called “Cruise” that offered a portrait of a lecturer on a cruise ship who I’m still convinced was inspired by one of my professors, since he was literally the only other man in the world, besides Douglas Frame, who sounded like this:

His sallow triangular face was especially melancholy, lit from beneath by the dim lectern bulb. The end of the journey meant for him the return to his university—its rosy-cheeked students invincible in their ignorance, its demonic faculty politics, its clamorous demands for ever-higher degrees of political correctness and cultural diversity. “ΚΡΙΝΩ,” he wrote on the blackboard, pronouncing, “krino—to discern, to be able to distinguish the real from the unreal. To do this, we need noos, mind, consciousness.” He wrote, then, “ΝΟΟΣ.” His face illumined from underneath was as eerie as that of a jack-in-the-box or a prompter hissing lines to stymied thespians. “We need no-os,” he pronounced, scrabbling with his invisible chalk in a fury of insertion, “to achieve our nos-tos, our homecoming.” He stood aside to reveal the completed word: ΝΟΣΤΟΣ. In afterthought he rapidly rubbed out two of the letters, created ΠΟΝΤΟΣ, and added with a small sly smile, “After our crossing together of the sea, the pontos.”

In the end, I moved away from this line of reasoning, and I spent most of my thesis developing arguments based on readings of words like poikilos and polyplokos, which described the quality of mind—a kind of flexibility and resourcefulness—that was necessary to achieve this return, whether to Ithaca or to the world of the living. Until recently, I hadn’t thought about this for years. Over the weekend, however, I read a wonderful profile in The New York Times Magazine by Wyatt Mason of the classicist Emily Wilson, who has published a new translation of the Odyssey. Much of the article is devoted to a discussion of the word polytropos, which appears in the very first line of the poem as a description of Odysseus himself. Wilson explains:

The prefix poly means “many” or “multiple.” Tropos means “turn.” “Many” or “multiple” could suggest that he’s much turned, as if he is the one who has been put in the situation of having been to Troy, and back, and all around, gods and goddesses and monsters turning him off the straight course that, ideally, he’d like to be on. Or, it could be that he’s this untrustworthy kind of guy who is always going to get out of any situation by turning it to his advantage. It could be that he’s the turner…So the question of whether he’s the turned or the turner: I played around with that a lot in terms of how much should I be explicit about going for one versus the other. I remember that being one of the big questions I had to start off with.

And it’s precisely this notion of slipperiness and changeability that I often saw in descriptions of Amphiaraus, who, like Odysseus, has affinities with the god Hermes—the crosser of borders, the conductor of souls, the trickster.

The same qualities, of course, also tend to be present in writers, poets, scholars, and all those who, in W.H. Auden’s words, “live by their wits.” This may be why translators of the Odyssey have been so preoccupied with polytropos, which stands as a signal at the beginning of the poem of the intelligence that you need to make it all the way to the end. As Mason writes:

You might be inclined to suppose that, over the course of nearly half a millennium, we must have reached a consensus on the English equivalent for an old Greek word, polytropos. But to consult Wilson’s sixty-some predecessors, living and dead, is to find that consensus has been hard to come by. Chapman starts things off, in his version, with “many a way/Wound with his wisdom”; John Ogilby counters with the terser “prudent”; Thomas Hobbes evades the word, just calling Odysseus “the man.” Quite a range, and we’ve barely started.

Mason lists dozens of variants, including Alexander Pope’s “for wisdom’s various arts renown’d”; H.F. Cary’s “crafty”; William Sotheby’s “by long experience tried”; Theodore Buckley’s “full of resources”; the Rev. Lovelace Bigge-Wither’s “many-sided-man”; Roscoe Mongan’s “skilled in expedients”; and T.E. Lawrence’s “various-minded.” Perhaps for sentimental reasons, I’m partial to Lawrence’s version, which recalls my old favorites poikilos and polyplokos in evoking a sort of visual variety or shiftiness, like the speckled scales of a snake. And Wilson? She clearly thought long and hard on the matter. And when I read her solution, I felt a shiver of recognition, as well as a strange pang of nostalgia for the student I used to be, and to whom I still sometimes dream of returning again: “Tell me about a complicated man.”

Written by nevalalee

November 6, 2017 at 8:44 am

Writing with scissors


Over the last few years, one of my great pleasures has been reading the articles on writing that John McPhee has been contributing on an annual basis to The New Yorker. I’ve written here about my reactions to McPhee’s advice on using the dictionary, on “greening” or cutting a piece by an arbitrary length, on structure, on frames of reference. Now his full book on the subject is here, Draft No. 4, and it’s arriving in my life at an opportune time. I’m wrapping up a draft of my own book, with two months to go before deadline, and I have a daunting set of tasks ahead of me—responding to editorial comments, preparing the notes and bibliography, wrestling the whole thing down to size. McPhee’s reasonable voice is a balm at such times, although he never minimizes the difficulty of the process itself, which he calls “masochistic, mind-fracturing self-enslaved labor,” even as he speaks of the writer’s “animal sense of being hunted.” And when you read Sam Anderson’s wonderful profile on McPhee in this week’s issue of The New York Times Magazine, it’s like listening to an old soldier who has been in combat so many times that everything that he says carries the weight of long experience. (Reading it, I was reminded a little of the film editor Walter Murch, whom McPhee resembles in certain ways—they look sort of alike, they’re both obsessed with structure, and they both seem to know everything. I was curious to see whether anyone else had made this connection, so I did a search for their names together on Google. Of the first five results, three were links from this blog.)

Anderson’s article offers us the portrait of a man who, at eighty-six, has done a better job than just about anyone else of organizing his own brain: “Each of those years seems to be filed away inside of him, loaded with information, ready to access.” I would have been equally pleased to learn that McPhee was as privately untidy as his writing is intricately patterned, but it makes sense that his interest in problems of structure—to which he returns endlessly—would manifest itself in his life and conversation. He’s interested in structure in the same way that the rest of us are interested in the lives of our own children. I never tire of hearing how writers deal with structural issues, and I find passages like the following almost pornographically fascinating:

The process is hellacious. McPhee gathers every single scrap of reporting on a given project—every interview, description, stray thought and research tidbit—and types all of it into his computer. He studies that data and comes up with organizing categories: themes, set pieces, characters and so on. Each category is assigned a code. To find the structure of a piece, McPhee makes an index card for each of his codes, sets them on a large table and arranges and rearranges the cards until the sequence seems right. Then he works back through his mass of assembled data, labeling each piece with the relevant code. On the computer, a program called “Structur” arranges these scraps into organized batches, and McPhee then works sequentially, batch by batch, converting all of it into prose. (In the old days, McPhee would manually type out his notes, photocopy them, cut up everything with scissors, and sort it all into coded envelopes. His first computer, he says, was “a five-thousand-dollar pair of scissors.”)
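Since I can never resist turning a workflow into something executable, here’s a minimal sketch of what Structur seems to do, at least as Anderson describes it. This is my own reconstruction in Python, not McPhee’s actual program, and the notes, codes, and sequence below are invented for illustration:

from collections import defaultdict

# Every scrap of reporting gets labeled with a code; the codes themselves
# are arranged, like index cards on a table, until the sequence seems right.
notes = [
    ("RAPIDS", "No one spoke as the raft swung into the current."),
    ("GEOLOGY", "Schist at the river line, a billion years old."),
    ("RAPIDS", "The guide leaned on his paddle and waited."),
    ("CHARACTER", "He talked about dams the way other men talk about their children."),
]
sequence = ["GEOLOGY", "CHARACTER", "RAPIDS"]  # the arranged cards

# Sort the scraps into coded batches, then work through them in order.
batches = defaultdict(list)
for code, scrap in notes:
    batches[code].append(scrap)

for code in sequence:
    print(f"--- {code} ---")
    for scrap in batches[code]:
        print(scrap)

The appeal is obvious: the structural decision gets made once, up front, and the writing itself can then proceed one batch at a time.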

Anderson writes: “[McPhee] is one of the world’s few remaining users of a program called Kedit, which he writes about, at great length, in Draft No. 4.” The phrase “at great length” excites me tremendously—I’m at a point in my life where I’d rather hear about a writer’s favorite software program than his or her inspirational thoughts on creativity—and McPhee’s process doesn’t sound too far removed from the one that I’ve worked out for myself. As I read it, though, I found myself thinking in passing of what might be lost when you move from scissors to a computer. (Scissors appear in the toolboxes of many of the writers and artists I admire. In The Elements of Style, E.B. White advises: “Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.” In The Silent Clowns, Walter Kerr describes the narrative challenges of filmmaking in the early fifties and concludes: “The problem was solved, more or less, with a scissors.” And Paul Klee once wrote in his diary: “What I don’t like, I cut away with the scissors.”) But McPhee isn’t sentimental about the tools themselves. In Anderson’s profile, the New Yorker editor David Remnick, who took McPhee’s class at Princeton, recalls: “You were in the room with a craftsman of the art, rather than a scholar or critic—to the point where I remember him passing around the weird mechanical pencils he used to use.” Yet there’s no question in my mind that McPhee would drop that one brand of pencil if he found one that he thought was objectively better. As soon as he had Kedit, he got rid of the scissors. When you’re trying to rethink structure from the ground up, you don’t have much time for nostalgia.

And when McPhee explains the rationale behind his methods, you can hear the pragmatism of fifty years of hard experience:

If this sounds mechanical, its effect was absolutely the reverse. If the contents of the seventh folder were before me, the contents of twenty-nine other folders were out of sight. Every organizational aspect was behind me. The procedure eliminated nearly all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.

This amounts to an elaboration of what I’ve elsewhere called my favorite piece of writing advice, which David Mamet offers in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

Mamet might as well have come out of the same box as Walter Murch and McPhee, which implies that I have a definite type when it comes to looking for advice. And what they all have in common, besides the glasses and beard, is the air of having labored at a craft for decades, survived, and returned to tell the tale. Of the three, McPhee’s career may be the most enviable of all, if only because he spent it in Princeton, not Hollywood. It’s nice to be able to structure an essay. The tricky part is structuring a life.

Jokes against inanity


Yesterday, Harvard University made headlines by withdrawing acceptances for ten high school students who had posted “sexually explicit memes and messages” on a private Facebook group. Here’s how The Crimson describes the situation:

A handful of admitted students formed the messaging group—titled, at one point, “Harvard memes for horny bourgeois teens”—on Facebook in late December…In the group, students sent each other memes and other images mocking sexual assault, the Holocaust, and the deaths of children, according to screenshots of the chat obtained by The Crimson. Some of the messages joked that abusing children was sexually arousing, while others had punchlines directed at specific ethnic or racial groups. One called the hypothetical hanging of a Mexican child “piñata time.”

Not surprisingly, the decision has been a divisive one, with critics of the college making the argument—which can’t be dismissed out of hand—that Harvard overreached in policing statements that were made essentially in private. But there’s another line of reasoning that I find increasingly hard to take seriously. The Washington Post quotes Erica Goldberg, an assistant professor at Ohio Northern Law School, who compares the humor in question to the party game Cards Against Humanity:

It’s an unabashedly irreverent game whose purpose is to be as cleverly offensive as possible. The game uses cards to create inappropriate associations, on topics we are generally not socially permitted to mock—such as AIDS, the Holocaust, and dead babies. Even many good liberals love the game, precisely because the humor is so wrong, so contrary to our values. There is something appealing about the freedom to be irreverent and dark.

I might have agreed with this once, but I don’t think I do anymore. The catalyst, oddly, was a passage in Jon Ronson’s otherwise very good book So You’ve Been Publicly Shamed, which was evidently intended to make the opposite point. Ronson discusses the notorious case of Justine Sacco, the public relations executive who inspired a torrent of online outrage after tweeting before a flight to Cape Town: “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!” Sacco then switched off her phone, which meant that she spent the next eleven hours oblivious to the fact that her life had effectively been ruined. Ronson writes of the firestorm:

I could understand why some people found it offensive. Read literally, she said that white people don’t get AIDS, but it seems doubtful many interpreted it that way. More likely it was her apparently gleeful flaunting of her privilege that angered people. But after thinking about her tweet for a few seconds more, I began to suspect that it wasn’t racist but a reflexive critique of white privilege—on our tendency to naïvely imagine ourselves immune from life’s horrors.

He concludes: “Justine’s crime had been a badly worded joke mocking privilege. To see the catastrophe as her fault felt, to me, a little like ‘Don’t wear short skirts.’ It felt like victim-blaming.” And there’s no question that Sacco, who was fired from her job, paid a disproportionately harsh price for her actions. But it also feels like an overstatement to repeatedly insist, as Ronson does, that Sacco “didn’t do anything wrong.” To say that her tweet was “a badly worded joke” implies that there was an alternative wording that would have made it funny and acceptable. I have trouble imagining one. And the implicit assumption that this was a problem of phrasing or context strikes me as the slipperiest slope of all.

This brings us to Cards Against Humanity, a kind of analog computer for generating offensive jokes, which, revealingly, often evokes the specter of “white privilege” to justify itself. When asked to explain its expansion pack “Ten Days or Whatever of Kwanzaa,” one of the game’s designers told the Daily Dot: “It’s a joke that we meant to poke fun at white privilege, ignorance, and laziness.” This amounts to a defense of the entire game, in which players theoretically interrogate their privilege by forcing one another to make what Goldberg calls “irreverent and dark jokes.” In the same article, Jaya Saxena neatly sums up the company’s position:

The Cards Against Humanity team is stalled in the middle of that narrative: understanding that there is a cultural hierarchy that disenfranchises people, making it clear they’re aware of the privilege they hold, attempting to use their humor to separate themselves from those who don’t get it, and apologizing for their mistakes when they’re called out.

This raises two related issues. One is whether this kind of scrutiny is, in fact, what most players of the game think they’re doing. The other is whether this activity is worthwhile. I would argue that the answer to both questions is “probably not.” This isn’t a matter of political correctness, but of a logical and comedic inconsistency—and, frankly, of “privilege, ignorance, and laziness”—in the sort of humor involved. Let’s say that you’ve made a “transgressive” joke of the type that got these prospective Harvard freshmen in trouble. Now imagine how you’d react if it had been said by Milo Yiannopoulos or posted as a meme on the alt-right. If it bothers you, then the only conclusion is that your identity as a progressive somehow justifies statements that would be horrifyingly racist in the mouth of someone of whom you disapprove. You can make the joke because you, as a “horny bourgeois teen,” know better.

This sounds a lot like privilege to me. I won’t say that it’s worse or more insidious than other forms of racism, but that doesn’t mean that it isn’t problematic, especially if you believe that transgressive humor is something to be celebrated. As Dan Brooks writes in an excellent essay in the New York Times Magazine: “The whole architecture of the game is designed to provide the thrill of transgression with none of the responsibility—to let players feel horrible, if you will, without feeling bad.” It’s a mechanical simulation of transgression, and, like bad art that allows for an emotional release that absolves the viewer from other kinds of empathy, it can numb us to the real thing, leaving us unable to make the distinction. Just because you were smart enough to get into Harvard—and believe me, I know—doesn’t make you Michael O’Donoghue. On that level, the college made the right call. It has the right to withdraw admission if “an admitted student engages in behavior that brings into question his or her honesty, maturity, or moral character,” and even if these teenagers share their assumptions with millions of other “good liberals,” that doesn’t make them any less wrong. Max Temkin, the primary creative force behind Cards Against Humanity, has impeccably progressive credentials and has done a lot of admirable things, but he has also said “We removed all of the ‘rape’ jokes from Cards Against Humanity years ago,” as if this were commendable in itself. They cull the cards that they’ve personally outgrown, as if objective standards of acceptability have changed simply because they’re no longer in their early twenties, and I’m not even sure if this strikes them as problematic. As a profile of the company in Fusion notes:

As part of their job, [the creators] periodically pull cards that seemed funny to college seniors in their parents’ basement, but are a little less funny now…Meanwhile some [cards], like “passable transvestites” and “date rape,” were pulled when the guys realized that kind of “humor” wasn’t actually very humorous.

The reference to “the guys” speaks volumes. But this kind of culling is something that we all do, as we leave behind our adolescent selves, and it has one inevitable conclusion. Speaking of the “passable transvestites” card, Temkin said: “It’s embarrassing to me that there was a time in my life that that was funny.” And for a lot of us, that includes the game as a whole.

The A/B Test


In this week’s issue of The New York Times Magazine, there’s a profile of Mark Zuckerberg by Farhad Manjoo, who describes how the founder of Facebook is coming to terms with his role in the world in the aftermath of last year’s election. I find myself thinking about Zuckerberg a lot these days, arguably even more than I use Facebook itself. We just missed overlapping in college, and with one possible exception, which I’ll mention later, he’s the most influential figure to emerge from those ranks in the last two decades. Manjoo depicts him as an intensely private man obliged to walk a fine line in public, leading him to be absurdly cautious about what he says: “When I asked if he had chatted with Obama about the former president’s critique of Facebook, Zuckerberg paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Zuckerberg is trying to figure out what he believes—and how to act—under conditions of enormous scrutiny, but he also has more resources at his disposal than just about anyone else in history. Here’s the passage in the article that stuck with me the most:

The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition, or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth…This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?”

Reading this, I began to reflect on how rarely we actually test our intuitions. I’ve spoken a lot on this blog about the role of intuitive thinking in the arts and sciences, mostly because it doesn’t get the emphasis it deserves, but there’s also no guarantee that intuition will steer us in the right direction. The psychologist Daniel Kahneman has devoted his career to showing how we tend to overvalue our gut reactions, particularly if we’ve been fortunate enough to be right in the past, and the study of human irrationality has become a rich avenue of research in the social sciences, which are often undermined by poor hunches of their own. It may not even be a matter of right or wrong. An intuitive choice may be better or worse than the alternative, but for the most part, we’ll never know. One of the quirks of Silicon Valley culture is that it claims to base everything on raw data, but it’s often in the service of notions that are outlandish, untested, and easy to misrepresent. Facebook comes closer than any company in existence to the ideal of an endless A/B test, in which the user base is randomly divided into two or more groups to see which approaches are the most effective. It’s the best lab ever developed for testing our hunches about human behavior. (Most controversially, Facebook modified the news feeds of hundreds of thousands of users to adjust the number of positive or negative posts, in order to gauge the emotional impact, and it has conducted similar tests on voter turnout.) And it shouldn’t surprise us if many of our intuitions turn out to be mistaken. If anything, we should expect them to be right about half the time—and if we can nudge that percentage just a little bit upward, in theory, it should give us a significant competitive advantage.
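If you’ve never had a reason to run one, an A/B test is exactly as simple as it sounds. Here’s a toy sketch in Python, with invented users and an invented engagement metric; it reflects nothing about Facebook’s actual systems, only the shape of the method:

import random
from statistics import mean

random.seed(7)

# Ten thousand invented users, randomly divided into two groups.
assignments = [random.choice("AB") for _ in range(10_000)]

def engagement(variant: str) -> float:
    # Pretend that variant B nudges behavior slightly upward.
    return random.random() + (0.02 if variant == "B" else 0.0)

# Measure the outcome for each group and compare.
results = {
    variant: mean(engagement(a) for a in assignments if a == variant)
    for variant in "AB"
}
print(results)  # e.g. {'A': 0.499..., 'B': 0.521...}

A real experiment would run a significance test before trusting the difference, but the logic is all there: split at random, measure, compare, and let the data overrule your hunch.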

So what good is intuition, anyway? I like to start with William Goldman’s story about the Broadway producer George Abbott, who once passed a choreographer holding his head in his hands while the dancers stood around doing nothing. When Abbott asked what was wrong, the choreographer said that he couldn’t figure out what to do next. Abbott shot back: “Well, have them do something! That way we’ll have something to change.” Intuition, as I’ve argued before, is mostly about taking you from zero ideas to one idea, which you can then start to refine. John W. Campbell makes much the same argument in what might be his single best editorial, “The Value of Panic,” which begins with a maxim from the Harvard professor Wayne Batteau: “In total ignorance, try anything. Then you won’t be so ignorant.” Campbell argues that this provides an evolutionary rationale for panic, in which an animal acts “in a manner entirely different from the normal behavior patterns of the organism.” He continues:

Given: An organism with N characteristic behavior modes available. Given: An environmental situation which cannot be solved by any of the N available behavior modes, but which must be solved immediately if the organism is to survive. Logical conclusion: The organism will inevitably die. But…if we introduce Panic, allowing the organism to generate a purely random behavior mode not a member of the N modes characteristically available?

Campbell concludes: “When the probability of survival is zero on the basis of all known factors—it’s time to throw in an unknown.” In extreme situations, the result is panic; under less intense circumstances, it’s a blind hunch. You can even see them as points on a spectrum, the purpose of which is to provide us with a random action or idea that can then be revised into something better, assuming that we survive for long enough. But sometimes the animal just gets eaten.
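Campbell’s logic translates almost directly into what programmers would now call random exploration, and a toy sketch makes it concrete. Everything here is invented; the point is only the structure of the argument:

import random

# N characteristic behavior modes, and a situation none of them can solve.
KNOWN_MODES = ["freeze", "flee", "hide"]
NOVEL_ACTIONS = ["climb", "dig", "swim", "shout"]

def solves(action: str) -> bool:
    return action == "climb"  # the one response the repertoire lacks

def respond() -> str:
    for mode in KNOWN_MODES:
        if solves(mode):
            return mode
    # Panic: a purely random behavior mode outside the usual repertoire.
    return random.choice(NOVEL_ACTIONS)

action = respond()
print(action, "->", "survives" if solves(action) else "gets eaten")

With the known modes, the probability of survival is zero; with panic, it rises to one in four. That is the whole of Campbell’s case for throwing in an unknown.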

The idea of refinement, revision, or testing is inseparable from intuition, and Zuckerberg has been granted the most powerful tool imaginable for asking hard questions and getting quantifiable answers. What he does with it is another matter entirely. But it’s also worth looking at his only peer from college who could conceivably challenge him in terms of global influence. On paper, Mark Zuckerberg and Jared Kushner have remarkable similarities. Both are young Jewish men—although Kushner is more observant—who were born less than four years and sixty miles apart. Kushner, whose acceptance to Harvard was so manifestly the result of his family’s wealth that it became a case study in a book on the subject, was a member of the final clubs that Zuckerberg badly wanted to join, or so Aaron Sorkin would have us believe. Both ended up as unlikely media magnates of a very different kind: Kushner, like Charles Foster Kane, took over a New York newspaper from a man named Carter. Yet their approaches to their newfound positions couldn’t be more different. Kushner has been called “a shadow secretary of state” whose portfolio includes Mexico, China, the Middle East, and the reorganization of the federal government, but it feels like one long improvisation, on the apparent assumption that he can wing it and succeed where so many others have failed. As Bruce Bartlett writes in the New York Times, without a staff, Kushner “is just a dilettante meddling in matters he lacks the depth or the resources to grasp,” and we may not have a chance to recover if his intuitions are wrong. In other words, he resembles his father-in-law, as Frank Bruni notes:

I’m told by insiders that when Trump’s long-shot campaign led to victory, he and Kushner became convinced not only that they’d tapped into something that everybody was missing about America, but that they’d tapped into something that everybody was missing about the two of them.

Zuckerberg and Kushner’s lives ran roughly in parallel for a long time, but now they’re diverging at a point at which they almost seem to be offering us two alternate versions of the future, like an A/B test with only one possible outcome. Neither is wholly positive, but that doesn’t make the choice any less stark. And if you think this sounds farfetched, bookmark this post, and read it again in about six years.

Cutty Sark and the semicolon


Vladimir Nabokov

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 22, 2015.

In an interview that was first published in The Paris Review, the novelist Herbert Gold asked Vladimir Nabokov if an editor had ever offered him any useful advice. This is what Nabokov said in response:

By “editor” I suppose you mean proofreader. Among these I have known limpid creatures of limitless tact and tenderness who would discuss with me a semicolon as if it were a point of honor—which, indeed, a point of art often is. But I have also come across a few pompous avuncular brutes who would attempt to “make suggestions” which I countered with a thunderous “stet!”

I’ve always adored that thunderous stet, which tells us so much about Nabokov and his imperious resistance to being edited by anybody. Today, however, I’m more interested in the previous sentence. A semicolon, as Nabokov puts it, can indeed be a point of honor. Nabokov was perhaps the most painstaking of all modern writers, and it’s no surprise that the same perfectionism that produced such conceptual and structural marvels as Lolita and Pale Fire would filter down to the smallest details. But I imagine that even ordinary authors can relate to how a single punctuation mark in a manuscript can start to loom as large as the finger of God on the Sistine Chapel ceiling.

And there’s something about the semicolon that seems to inspire tussles between writers and their editors—or at least allows it to stand as a useful symbol of the battles that can occur during the editorial process. Here’s an excerpt from a piece by Charles McGrath in The New York Times Magazine about the relationship between Robert Caro, author of The Years of Lyndon Johnson, and his longtime editor Robert Gottlieb:

“You know that insane old expression, ‘The quality of his defect is the defect of his quality,’ or something like that?” Gottlieb asked me. “That’s really true of Bob. What makes him such a genius of research and reliability is that everything is of exactly the same importance to him. The smallest thing is as consequential as the biggest. A semicolon matters as much as, I don’t know, whether Johnson was gay. But unfortunately, when it comes to English, I have those tendencies, too, and we could go to war over a semicolon. That’s as important to me as who voted for what law.”

It’s possible that the semicolon keeps cropping up in such stories because its inherent ambiguity lends itself to disagreement. As Kurt Vonnegut once wrote: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.” And I’ve more or less eliminated semicolons from my own work for much the same reason.

Robert De Niro and Martin Scorsese on the set of Raging Bull

But the larger question here is why artists fixate on things that even the most attentive reader would pass over without noticing. On one level, you could take a fight over a semicolon as an illustration of the way that the creative act—in which the artist is immersed in the work for months on end—tends to turn molehills into mountains. Here’s one of my favorite stories about the making of Raging Bull:

One night, when the filmmakers were right up against the deadline to make their release date, they were working on a nothing little shot that takes place in a nightclub, where a minor character turns to the bartender and orders a Cutty Sark. “I can’t hear what he’s saying,” [Martin Scorsese] said. Fiddling ensued—extensive fiddling—without satisfying him. [Producer Irwin] Winkler, who was present, finally deemed one result good enough and pointed out that messengers were standing by to hand-carry release prints to the few theaters where the picture was about to premiere. At which point, Scorsese snapped. “I want my name taken off the picture,” he cried—which bespeaks his devotion to detail. It also bespeaks his exhaustion at the end of Raging Bull, not to mention the craziness that so often overtakes movies as they wind down. Needless to say, he was eventually placated. And you can more or less hear the line in the finished print.

And you could argue that this kind of microscopic attention is the only thing that can lead to a work that succeeds on the largest possible scale.

But there’s yet another story that gets closer to the truth. In Existential Errands, Norman Mailer describes a bad period in his life—shortly after he was jailed for stabbing his second wife Adele—in which he found himself descending into alcoholism and unable to work. His only source of consolation was the scraps of paper, “little crossed communications from some wistful outpost of my mind,” that he would find in his jacket pocket after a drunken night. Mailer writes of these poems:

I would go to work, however, on my scraps of paper. They were all I had for work. I would rewrite them carefully, printing in longhand and ink, and I would spend hours whenever there was time going over these little poems…And since I wasn’t doing anything else very well in those days, I worked the poems over every chance I had. Sometimes a working day would go by, and I might put a space between two lines and remove a word. Maybe I was mending.

Which just reminds us that a seemingly minuscule change can be the result of a prolonged confrontation with the work as a whole. You can’t obsess over a semicolon without immersing yourself in the words around it, and there are times when you need such a focal point to structure your engagement with the rest. It’s a little like what is called a lakshya in yoga: the tiny spot on the body or in the mind on which you concentrate while meditating. In practice, the lakshya can be anything or nothing, but without it, your attention tends to drift. In art, it can be a semicolon, a word, or a line about Cutty Sark. It may not be much in itself. But when you need to tether yourself to something, even a semicolon can be a lifeline.
