Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Daniel Kahneman’

The law of small numbers

leave a comment »

If he believes in the law of small numbers, the scientist will have exaggerated confidence in the validity of conclusions based on small samples. To illustrate, suppose he is engaged in studying which of two toys infants will prefer to play with. Of the first five infants studied, four have shown a preference for the same toy. Many a psychologist will feel some confidence at this point, that the null hypothesis of no preference is false. Fortunately, such a conviction is not a sufficient condition for journal publication, although it may do for a book.

Amos Tversky and Daniel Kahneman, “Belief in the Law of Small Numbers”
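The arithmetic behind Tversky and Kahneman’s skepticism is easy to check: under the null hypothesis of no preference, at least four of five infants will favor one particular toy nearly a fifth of the time. Here is a minimal sketch in Python, offered purely as an illustration of the numbers in the quotation:

```python
from math import comb

def tail_probability(successes: int, trials: int, p_null: float = 0.5) -> float:
    """Chance of seeing at least `successes` out of `trials` under the null."""
    return sum(comb(trials, k) * p_null**k * (1 - p_null)**(trials - k)
               for k in range(successes, trials + 1))

# Four of five infants prefer the same toy. If there is truly no preference,
# how often does a particular toy win at least four of five trials?
print(tail_probability(4, 5))  # 0.1875 -- and 0.375 if either toy may be the "winner"
```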

Written by nevalalee

March 3, 2018 at 7:30 am

The A/B Test

with 2 comments

In this week’s issue of The New York Times Magazine, there’s a profile of Mark Zuckerberg by Farhad Manjoo, who describes how the founder of Facebook is coming to terms with his role in the world in the aftermath of last year’s election. I find myself thinking about Zuckerberg a lot these days, arguably even more than I use Facebook itself. We just missed overlapping in college, and with one possible exception, which I’ll mention later, he’s the most influential figure to emerge from those ranks in the last two decades. Manjoo depicts him as an intensely private man obliged to walk a fine line in public, leading him to be absurdly cautious about what he says: “When I asked if he had chatted with Obama about the former president’s critique of Facebook, Zuckerberg paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Zuckerberg is trying to figure out what he believes—and how to act—under conditions of enormous scrutiny, but he also has more resources at his disposal than just about anyone else in history. Here’s the passage in the article that stuck with me the most:

The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition, or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth…This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?”

Reading this, I began to reflect on how rarely we actually test our intuitions. I’ve spoken a lot on this blog about the role of intuitive thinking in the arts and sciences, mostly because it doesn’t get the emphasis it deserves, but there’s also no guarantee that intuition will steer us in the right direction. The psychologist Daniel Kahneman has devoted his career to showing how we tend to overvalue our gut reactions, particularly if we’ve been fortunate enough to be right in the past, and the study of human irrationality has become a rich avenue of research in the social sciences, which are often undermined by poor hunches of their own. It may not even be a matter of right or wrong. An intuitive choice may be better or worse than the alternative, but for the most part, we’ll never know. One of the quirks of Silicon Valley culture is that it claims to base everything on raw data, but it’s often in the service of notions that are outlandish, untested, and easy to misrepresent. Facebook comes closer than any company in existence to the ideal of an endless A/B test, in which the user base is randomly divided into two or more groups to see which approaches are the most effective. It’s the best lab ever developed for testing our hunches about human behavior. (Most controversially, Facebook modified the news feeds of hundreds of thousands of users to adjust the number of positive or negative posts, in order to gauge the emotional impact, and it has conducted similar tests on voter turnout.) And it shouldn’t surprise us if many of our intuitions turn out to be mistaken. If anything, we should expect them to be right about half the time—and if we can nudge that percentage just a little bit upward, in theory, it should give us a significant competitive advantage.
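To make the idea concrete, here is a toy version of the kind of experiment the article describes, sketched in Python: assign users at random to two groups, show each group a different variant, and ask whether the difference in outcomes is larger than chance alone would produce. The conversion rates, sample size, and the two-proportion z-test below are my own illustrative choices, not anything Facebook has disclosed about how its experiments actually run.

```python
import random
from math import sqrt, erf

def ab_test(n_per_group: int, rate_a: float, rate_b: float, seed: int = 0) -> float:
    """Simulate one A/B test and return the two-sided p-value of a
    two-proportion z-test. Every number here is invented for illustration."""
    rng = random.Random(seed)
    conv_a = sum(rng.random() < rate_a for _ in range(n_per_group))
    conv_b = sum(rng.random() < rate_b for _ in range(n_per_group))
    p_a, p_b = conv_a / n_per_group, conv_b / n_per_group
    pooled = (conv_a + conv_b) / (2 * n_per_group)
    se = sqrt(2 * pooled * (1 - pooled) / n_per_group)
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# A hunch that variant B nudges engagement from 5.0% to 5.5%,
# tested on 10,000 randomly assigned users per arm:
print(ab_test(10_000, rate_a=0.050, rate_b=0.055))
```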

So what good is intuition, anyway? I like to start with William Goldman’s story about the Broadway producer George Abbott, who once passed a choreographer holding his head in his hands while the dancers stood around doing nothing. When Abbott asked what was wrong, the choreographer said that he couldn’t figure out what to do next. Abbott shot back: “Well, have them do something! That way we’ll have something to change.” Intuition, as I’ve argued before, is mostly about taking you from zero ideas to one idea, which you can then start to refine. John W. Campbell makes much the same argument in what might be his single best editorial, “The Value of Panic,” which begins with a maxim from the Harvard professor Wayne Batteau: “In total ignorance, try anything. Then you won’t be so ignorant.” Campbell argues that this provides an evolutionary rationale for panic, in which an animal acts “in a manner entirely different from the normal behavior patterns of the organism.” He continues:

Given: An organism with N characteristic behavior modes available. Given: An environmental situation which cannot be solved by any of the N available behavior modes, but which must be solved immediately if the organism is to survive. Logical conclusion: The organism will inevitably die. But…if we introduce Panic, allowing the organism to generate a purely random behavior mode not a member of the N modes characteristically available?

Campbell concludes: “When the probability of survival is zero on the basis of all known factors—it’s time to throw in an unknown.” In extreme situations, the result is panic; under less intense circumstances, it’s a blind hunch. You can even see them as points on a spectrum, the purpose of which is to provide us with a random action or idea that can then be revised into something better, assuming that we survive for long enough. But sometimes the animal just gets eaten.

The idea of refinement, revision, or testing is inseparable from intuition, and Zuckerberg has been granted the most powerful tool imaginable for asking hard questions and getting quantifiable answers. What he does with it is another matter entirely. But it’s also worth looking at his only peer from college who could conceivably challenge him in terms of global influence. On paper, Mark Zuckerberg and Jared Kushner have remarkable similarities. Both are young Jewish men—although Kushner is more observant—who were born less than four years and sixty miles apart. Kushner, whose acceptance to Harvard was so manifestly the result of his family’s wealth that it became a case study in a book on the subject, was a member of the final clubs that Zuckerberg badly wanted to join, or so Aaron Sorkin would have us believe. Both ended up as unlikely media magnates of a very different kind: Kushner, like Charles Foster Kane, took over a New York newspaper from a man named Carter. Yet their approaches to their newfound positions couldn’t be more different. Kushner has been called “a shadow secretary of state” whose portfolio includes Mexico, China, the Middle East, and the reorganization of the federal government, but it feels like one long improvisation, on the apparent assumption that he can wing it and succeed where so many others have failed. As Bruce Bartlett writes in the New York Times, without a staff, Kushner “is just a dilettante meddling in matters he lacks the depth or the resources to grasp,” and we may not have a chance to recover if his intuitions are wrong. In other words, he resembles his father-in-law, as Frank Bruni notes:

I’m told by insiders that when Trump’s long-shot campaign led to victory, he and Kushner became convinced not only that they’d tapped into something that everybody was missing about America, but that they’d tapped into something that everybody was missing about the two of them.

Zuckerberg and Kushner’s lives ran roughly in parallel for a long time, but now they’re diverging at a point at which they almost seem to be offering us two alternate versions of the future, like an A/B test with only one possible outcome. Neither is wholly positive, but that doesn’t make the choice any less stark. And if you think this sounds farfetched, bookmark this post, and read it again in about six years.

The second system effect

with 2 comments

Kevin Costner in The Postman

Note: This post originally appeared, in a slightly different form, on July 14, 2015.

Why are second novels or movies so consistently underwhelming? Even if you account for such variables as heightened pressure, compressed turnaround time, and unrealistic expectations, the track record for works of art from The Postman to the second season of True Detective suggests that the sophomore slump, whatever it reflects, is real. For the psychologist Daniel Kahneman in Thinking, Fast and Slow, it’s a case of regression to the mean: any artistic breakthrough is by definition an outlier, since only exceptional efforts survive to come to light at all, and later attempts revert to the artist’s natural level of ability. There’s also a sense in which a massive success removes many of the constraints that allowed for good work to happen in the first place. By now, it’s a cliché to note that the late installments in a popular series, from Harry Potter to A Song of Ice and Fire, feel like they haven’t been edited. It’s certainly true that authors who have sold a million copies have greater leverage when it comes to disregarding editorial notes, if they even receive them at all. Editors are as human as anyone else, and since commercial outcomes are such a crapshoot, you can’t blame them for not wanting to get in the way of a good thing. It didn’t hurt Rowling or Martin, but in the case of, say, the later novels of Thomas Harris, you could make a case that a little more editorial control might have been nice. And I’ve noted elsewhere that this may have more to do with the need to schedule blockbuster novels for a release date long in advance, whether they’re ready or not.

Yet there’s also a third, even more plausible explanation, which I first encountered in The Mythical Man-Month by Frederick P. Brooks, Jr., a seminal work on software engineering. Writing about what he calls “the second system effect,” Brooks notes:

An architect’s first work is apt to be spare and clean. He knows he doesn’t know what he’s doing, so he does it carefully and with great restraint.

As he designs the first work, frill after frill and embellishment after embellishment occur to him. These get stored away to be used “next time.” Sooner or later the first system is finished, and the architect, with firm confidence and a demonstrated mastery of that class of systems, is ready to build a second system.

The second is the most dangerous system a man ever designs. When he does his third and later ones, his prior experiences will confirm each other as to the general characteristics of such systems, and their differences will identify those parts of his experience that are particular and not generalizable.

Francis Ford Coppola

Brooks concludes: “The general tendency is to over-design the second system, using all the ideas and frills that were cautiously sidetracked on the first one.” And it’s startling how well this statement describes so many sophomore efforts in film and literature. It’s the difference between Easy Rider and The Last Movie, Sex Lies and Videotape and Kafka, Donnie Darko and Southland Tales, in which a spare, disciplined freshman work is succeeded by a movie that contains everything up to and including the kitchen sink. When you first try your hand at any kind of storytelling, you discover that the natural arc of the project tends toward removal and subtraction: you cut, pare back, and streamline, either because of your natural caution or because you don’t have the resources you need. Every edit is necessary, but it also carries a charge of regret. If your constraints are removed for your second project, this only adds fuel to an artist’s natural tendency to overindulge. And while the result may be a likable mess—a lot of us prefer Mallrats to Clerks—it rarely exhibits the qualities that first drew us to an artist’s work. (Even in movies made by committee, there’s an assumption that viewers want a bigger, louder, and busier version of what worked the first time around, which leads to so much of the narrative inflation that we see in blockbuster sequels.)

So what’s an artist to do? Brooks has some advice that everyone trying to follow up his or her first effort should keep in mind:

How does the architect avoid the second-system effect? Well, obviously he can’t skip his second system. But he can be conscious of the peculiar hazards of that system, and exert extra self-discipline to avoid functional ornamentation and to avoid extrapolation of functions that are obviated by changes in assumptions and purposes.

Translated into artistic terms, this means nothing more or less than treating a second attempt as exactly as hazardous as it really is. If anything, the track record of sophomore efforts should make writers even more aware of those risks, and even more relentless about asking the hard questions after a big success has made it possible to stop. When Francis Ford Coppola followed The Godfather with The Conversation, it was both a regathering and an act of discipline—in a movie largely about craft and constraints—that enabled the grand gestures to come. Coppola certainly wasn’t beyond insane acts of overreaching, but in this case, his instincts were sound. And I have a feeling that a lot of writers and filmmakers, in hindsight, wish that they could have skipped their second system and gone straight to their third.

Written by nevalalee

November 25, 2016 at 9:00 am

The oilman’s divorce, or the problem of luck

leave a comment »

Harold Hamm

Money may not be everything, but it’s certainly a clarifying force, especially when seventeen billion dollars are on the line. Right now, the news is filled with coverage of what may end up being the most expensive divorce in the history of the world, a courtroom battle currently being waged between Oklahoma oilman Harold Hamm and his wife Sue Ann. This kind of story possesses a certain guilty fascination of its own—if Ms. Hamm gets even a quarter of her husband’s fortune, she’ll be richer than Oprah—but it also raises some unexpected philosophical questions. At the moment, Hamm is claiming that his stake in Continental Resources, the company he founded almost five decades ago, was built entirely on luck, since the discovery of oil deposits depends on so many factors that are fundamentally beyond human control. Ms. Hamm, by contrast, says that it was all thanks to her husband’s skill and hard work. Which leaves us with the odd prospect of one of the world’s wealthiest men disclaiming all responsibility for his own success, while his soon-to-be ex-wife insists that he isn’t giving himself enough credit.

Of course, it isn’t hard to see why they’d stake out their particular positions. Under laws governing equitable division, assets that were actively earned over the course of the marriage are split between the two parties, while passive assets, or those in which chance played a role, remain undivided. The underlying presumption is that a marriage is a partnership in which both parties collectively participate, and the work of one can’t be separated from the presence of the other—an argument that breaks down when luck is involved. If Sue Ann can successfully argue that Harold’s fortune was derived from his own efforts, she’s entitled to up to half of everything; if Harold makes the case that it’s due to luck, he clings to the lion’s share. In the end, it’s likely that the decision will fall somewhere in the middle, and I’m very curious to hear Judge Howard Haralson’s final ruling on the subject. Because while Harold Hamm is clearly pressing his case beyond what he really believes about his own abilities—”He must be squirming inside,” as a psychologist notes in the NBC News story—he isn’t entirely wrong. And it took only a multibillion-dollar divorce suit for him to admit it.

Daniel Kahneman

We all tend to underrate the role that luck plays in human endeavor, whether successful or otherwise, especially when we’ve done well for ourselves. The truth is probably closer to what Daniel Kahneman sets forth in Thinking, Fast and Slow, using what he says is his favorite equation:

Success = talent + luck
Great success = a little more talent + a lot of luck

That’s particularly true of something like oil and gas, which remains a punishingly uncertain industry even with modern geological and technological expertise. There’s a reason why more than a few wildcatters used divining rods to locate petroleum deposits: even with considerable skill and experience, a lot of it comes down to luck, persistence, and time. David Mamet likes to say that everybody gets a break in Hollywood after twenty-five years, except that some get it at the beginning and others at the end, and the important thing is to be the one who stays after everyone else has gone home. And wildcatting, along with so much else, works much the same way.

So while it’s easy to dismiss Hamm’s argument as disingenuous—his success in building one of the largest energy companies in the country can’t be entirely divorced from his own actions and decisions—I’d prefer to think of it as a moment of rare candor, however little he might believe in it himself. As John Bogle likes to point out, choosing an investment fund manager based on past performance is a little like staking money on the winner of a coin-flipping contest: given enough participants, someone is bound to get ten heads in a row, but that doesn’t mean you should bet on the streak in the future. That’s as true of running a company as managing a mutual fund. What seems like a smart decision may really have been a lucky break, and it’s only in retrospect, as we construct a narrative to make sense of what happened, that we emerge with the myth of the business visionary. If Kahneman’s formula holds true, as it does in most things in life, Harold Hamm might well be entitled to keep most of what he earned. Admitting this would involve giving up many of our most cherished notions about success. But with seventeen billion dollars at stake, even one of the richest men in the world might have to concede that it’s better to be lucky than good.
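Bogle’s analogy is easy to verify with a quick simulation. The field size below is invented, but the logic is not: the chance of ten straight heads is one in 1,024, so in any reasonably large contest a few perfect records are all but guaranteed, and they tell us nothing about skill.

```python
import random

def coin_flip_contest(entrants: int, flips: int = 10, seed: int = 0) -> int:
    """Count how many entrants flip `flips` heads in a row by pure chance."""
    rng = random.Random(seed)
    return sum(all(rng.random() < 0.5 for _ in range(flips))
               for _ in range(entrants))

# With 5,000 entrants we expect roughly five streaks of ten heads (5,000 / 1,024),
# none of which says anything about next year's performance.
print(coin_flip_contest(5_000))
```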

Written by nevalalee

August 27, 2014 at 8:58 am

Transformed by print

leave a comment »

A page from my first draft

Somewhere in his useful and opinionated book Trial and Error, the legendary pulp writer Jack Woodford says that if you feel that your work isn’t as good as the fiction you see in stores, there’s a simple test to see if this is actually true. Take a page from a recent novel you admire—or one that has seen big sales or critical praise—and type the whole thing out unchanged. When you see the words in your own typewriter or on your own computer screen, stripped of their superficial prettiness, they suddenly seem a lot less impressive. There’s something about professional typesetting that elevates even the sloppiest prose: it attains a kind of dignity and solidity that can be hard to see in its unpublished form. It isn’t just a story now; it’s an art object. And restoring it to the malleable, unpolished medium of a manuscript page often reveals how arbitrary the author’s choices really were, just as we tend to be hard on our own work because our rough drafts don’t look as nice as the stories we see in print.

There’s something undeniably mysterious about how visual cues affect the way we think about the words we’re reading, whether they’re our own or someone else’s. Daniel Kahneman has written about how we tend to read texts more critically when they’re printed in unattractive fonts, and Errol Morris recently ran an online experiment to test this by asking readers for their opinions about a short written statement, without revealing that some saw it in Baskerville and others in Comic Sans. (Significantly more of those who read it in Baskerville thought the argument was persuasive, while those who saw it in Comic Sans were less impressed, presumably because they were too busy clawing out their eyes.) Kindle, as in so many other respects, is the great leveler: it strips books of their protective sheen and forces us to evaluate them on their own merits. And I’d be curious to see a study on how the average review varies between those who read a novel in print and those who saw it in electronic form.

"Arkady arrived at the museum at ten..."

This is also why I can’t bear to read my own manuscripts in anything other than Times New Roman, which is the font in which they were originally composed. When I’m writing a story, I’m primarily thinking about the content, yes, but I’m also consciously shaping how the text appears on the screen. As I’ve mentioned before, I’ve acquired a lot of odd tics and aversions from years spent staring at my own words on a computer monitor, and I’ve evolved just as many strategies for coping. I don’t like the look of a ragged right margin, for instance, so all my manuscripts are justified and hyphenated, at least until they go out to readers. I generally prefer it when the concluding line of a paragraph ends somewhere on the left half of the page, and I’ll often rewrite the text accordingly. And I like my short lines of dialogue to be exactly one page width long. All this disappears, of course, the second the manuscript is typeset, but as a way of maintaining my sanity throughout the writing process, these rituals play an important role.

And I don’t seem to mind their absence when I finally see my work in print, which introduces another level of detachment: these words don’t look like mine anymore, but someone else’s. (There are occasional happy exceptions: by sheer accident, the line widths in The Year’s Best Science Fiction Vol. 29 happen to exactly match the ones I use at home, so “The Boneless One” looks pretty much like it did on my computer, down to the shape of the paragraphs.) Last week, I finally received an advance copy of my novel Eternal Empire, hot off the presses, and I was struck by how little it felt like a book I’d written. Part of this is because it’s been almost a year since I finished the first draft, I’ve been working on unrelated projects since then, and a lot has happened in the meantime. But there’s also something about the cold permanence of the printed page that keeps me at arm’s length from my work. Once a story can no longer be changed, it ceases to be quite as alive as it once was. It’s still special. But it’s no longer a part of you.

Written by nevalalee

August 12, 2013 at 8:35 am

A few holiday thoughts on happiness

leave a comment »

Daniel Kahneman

The use of time is one of the areas of life over which people have some control. Few individuals can will themselves to have a sunnier disposition, but some may be able to arrange their lives to spend less of their day commuting, and more time doing things they enjoy with people they like…

Not surprisingly, a headache will make a person miserable, and the second best predictor of the feelings of a day is whether a person did or did not have contacts with friends or relatives. It is only a slight exaggeration to say that happiness is the experience of spending time with people you love and who love you.

Daniel Kahneman, Thinking, Fast and Slow

Written by nevalalee

December 25, 2012 at 9:50 am

Blurbed for your protection

leave a comment »

A few days ago, my wife was opening a new package of vitamins when she noticed that the familiar foil seal over the bottle’s mouth was missing. We’d bought the vitamins a while ago, and it was possible that we’d just opened them earlier without remembering it, but that didn’t seem likely—and the mouth of the bottle was perfectly clean, which implied that the seal had never been there in the first place. After a bit of discussion, we decided to throw the bottle away. The odds of there being anything wrong with it were vanishingly small, but there didn’t seem to be any point in taking the chance. That’s the function of a protective seal: when it’s there, you don’t think about it—unless you’re annoyed by how hard it is to pry open—but when it’s missing, it immediately sets off alarm bells. And while this may not seem to have much to do with writing or publishing, it actually gets at an important truth about how books are packaged and presented, and, in particular, about the misunderstood, often derided role of the humble blurb.

Recently, there’s been a lot of discussion about the problem of fake online reviews, with some authors paying “critics” outright to post flattering quotes on sites like Amazon. The blurbs that we see on most novels have often been lumped into the same category, based on the observation, which is certainly correct, that they wouldn’t be there at all if they weren’t complimentary. Yet the parallel isn’t quite exact. It’s true that most blurbs end up on the cover of a book through some kind of relationship between blurber and subject: in many cases, they’re authors whom the writer or editor knows personally, or clients of the same publishing house who can be approached through a friendly intermediary. More rarely, a wonderful blurb is obtained from a respected author through luck or tenacity alone. And in my own limited experience, such blurbs are at least sincere: writers know that they have little to gain by having their name attached to a laudatory quote on a bad book, and if they don’t care for it or aren’t interested, they’re more likely just not to respond.

That said, no one should mistake a blurb for a purely objective review—but that doesn’t mean readers should ignore it, either. A browser looking at a new release in a bookstore, especially from an unknown author, is generally operating with very limited information, which means that he or she needs to rely on a few simple heuristics. Does the book look like it’s worth reading? Is it a novel that the publisher believes in? The cover art, the jacket copy, and even the font are important clues, although they’re often most useful as negative indicators. When I see a badly designed book with an ugly typeface, it doesn’t tell me anything about the content, but it implies that if the publisher didn’t, or couldn’t, take such superficial elements seriously, the text itself isn’t likely to be any better. (Obviously, this isn’t always true. And we should be careful about drawing any conclusions from the opposite case, since Daniel Kahneman has shown that a nice font may make us less aware of an author’s sloppy writing or reasoning.)

The same principle applies to blurbs. This isn’t to say that awful books can’t receive glowing blurbs from other authors—it happens all the time. But the absence of at least one decent blurb should make us cautious, if nothing else. Blurbs may be largely meaningless, but they do take time and effort to obtain, and like cover art, design, and other cosmetic elements, they serve as a sort of rough index to the publisher’s commitment to the book itself. In some ways, they’re best understood as an element of cover design: a novel without a blurb looks a little naked, at least to my eyes, and what it says is ultimately less important than the fact that it’s there at all. There aren’t many authors, not even Nicole Krauss, whose testimonials, as far as fiction is concerned, will tempt me to buy a book, but their absence will often warn me away. That’s the nature of a blurb: it’s less a stamp of approval than a protective seal that tells us, at the very least, that there’s nothing outright toxic inside.

Written by nevalalee

November 14, 2012 at 9:44 am

What I learned from my first novel

with 8 comments

Five months ago, my novel The Icon Thief was published by Penguin, and if it seemed at the time like the end of a journey, I see clearly now that it was just the beginning of another. In many ways, the most challenging part of the past year has been adjusting my survival skills as a writer, which had been built up by years of mostly solitary work, to the realities of living with a book in actual stores. And the transition hasn’t always been easy. Daniel Kahneman, the Nobel Prize-winning author of Thinking, Fast and Slow, likes to talk about optimistic bias—the delusion that we ourselves are more likely to succeed where countless others have failed—and it’s especially endemic among aspiring writers, who are required by definition to be irrationally optimistic. Every unpublished novel is a potential bestseller, just as every unwritten page is a potential masterpiece, and learning to live with a real physical book, which won’t always live up to your expectations, is an adjustment every writer has to make. Here, then, are some lessons that the past few months have taught me:

1. Promotion is great, but placement is better. When The Icon Thief came out, I did everything I could to transform myself from an obsessive introvert, which is basically what every writer has to become in order to finish a book in the first place, to a tireless promoter who could sell his book in person, in print, and in all other media. What I’ve since learned is that while such activities can be gratifying for their own sake, and will sell books here and there, they generally don’t have a lasting effect on a novel’s success. What sells most books, aside from word of mouth, is placement: do readers see the book when they go into stores? Every instance of placement in the big national chains—whether a book is on the front table, in the new releases section, or in a display where browsers are likely to notice it—is a chance to reach that precious audience of readers who are actively looking for something to buy. It’s by far the largest factor in a debut novel’s early sales—more than advertising, more than promotion. And it’s something that is ultimately out of the writer’s hands.

2. Don’t sweat the numbers. During the first week of my novel’s release, like any writer with a pulse, I was checking my Amazon sales ranking every hour. After a while, I was down to every day, then every week, and now I look only rarely, if ever. The same goes with BookScan figures and other measures of the book’s sales: I used to dutifully look over the charts every Friday and wonder why sales were spiking in Houston but flat in Boise, Idaho. In time, though, I found that I was falling into the same trap of those who have plenty of data but not enough real information: I was reading too much into tiny fluctuations and seeing patterns that weren’t there. In the end, such noise only serves as a distraction from the real business of writing, which involves a lot of diligent labor without reference to how your book is doing in Baton Rouge. In the old days, writers would receive sales figures from their publishers on a quarterly or semiannual basis, and I’d argue that they were better off. Turn off the numbers—you’ll be happier in the end.

3. Play the long game. Last month, I learned that my longtime editor at Penguin, who had acquired The Icon Thief and its sequel almost two years ago, was leaving to take another job. At first, I was rocked by the news, but my agent wisely pointed out that the timing here—with one book already out in stores, the second locked and ready to go, and a third a few months from completion—was about as good as it could get, and that changing editors is something that happens to every writer at one point or another. And he was right. Unless you’re the kind of author who has exactly one book to write, you’re going to spend the rest of your career in the writing game, which is just like anything else in life: the same ups and downs happen to everyone, but not necessarily in the same order. When you take the long view, you find that the rules of engagement haven’t really changed from when you were first starting out: you’re still writing for yourself and a few ideal readers. And the more you keep that in mind, the better chance you have of coming out the other end alive.

Written by nevalalee

August 15, 2012 at 10:18 am

Is it better to be lucky than good?

leave a comment »

Over the past few days, I’ve been devouring the book Thinking, Fast and Slow by Nobel laureate Daniel Kahneman, which I’d mentioned here before but only recently got around to reading. It is, as promised, rife with fascinating insights and stories—my wife says that I seem to have underlined every sentence—and I’m still only halfway through. In particular, Chapter 17, “Regression to the Mean,” is one that everyone should read, even if it’s just standing up at Barnes & Noble. The chapter is only ten pages long, but it’s packed with more useful insights than a shelf of ordinary books, and I can all but guarantee that it will subtly change the way you think about a lot of things. The key passage, at least to my eyes, is one that begins with Kahneman sharing what he calls his favorite equation:

Success = talent + luck
Great success = a little more talent + a lot of luck

This is something that most of us know intuitively, but Kahneman takes it one step further. Basically, if we accept the premise that a single instance of exceptionally good performance is due largely to luck—or, more precisely, to positive factors outside the performer’s control—then our best guess about the next performance is that it won’t be quite as good, as the performer’s luck regresses to the mean. We can’t predict anything about luck except for the fact that, in general, it will be more or less average. As a result, someone who has excellent luck on one occasion, like an athlete who makes a great ski jump, will probably only have average luck the next time out—and the better the original performance, the more extreme the regression will be. And while we might be tempted to ascribe all kinds of causal factors to the change, it’s really nothing but simple mathematics.
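Kahneman’s equation can even be turned into a toy simulation, with distributions chosen arbitrarily for illustration: give every performer a score of talent plus luck, select the top scorers in round one, and watch their average fall in round two, even though their talent has not changed at all.

```python
import random
from statistics import mean

def regression_demo(performers: int = 10_000, top_n: int = 100, seed: int = 0) -> None:
    """score = talent + luck; talent carries over between rounds, luck does not."""
    rng = random.Random(seed)
    talent = [rng.gauss(0, 1) for _ in range(performers)]
    round_one = [t + rng.gauss(0, 1) for t in talent]  # talent plus this round's luck
    round_two = [t + rng.gauss(0, 1) for t in talent]  # same talent, fresh luck
    top = sorted(range(performers), key=round_one.__getitem__, reverse=True)[:top_n]
    print("round one, top performers:", round(mean(round_one[i] for i in top), 2))
    print("round two, same people:   ", round(mean(round_two[i] for i in top), 2))

regression_demo()
# The second number is reliably lower: the talent stayed put, the luck regressed.
```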

This is obviously true of sports, given the important role that luck plays in most sporting events, but it’s also fascinating to think about its implications for the arts. In particular, regression to the mean is the most likely explanation for what I call “the New Yorker feature curse” in my recent article in Salon. When we interview movie stars or directors based on a recent great success, it’s likely that we’ve caught them just before they regress to the mean, which is why their next project—the one we’ve spent the entire article extolling—often seems like a relative disappointment. And this has nothing to do with the talent of the subjects involved. The movies are such a volatile business that even successful filmmakers can only be expected to succeed perhaps half the time, so it shouldn’t be surprising when a big success is followed by a movie that seems like a failure in comparison, and vice versa. For a particularly stark example, one need look no further than the recent career of Woody Allen, who, in Match Point, had a character say:

The man who said “I’d rather be lucky than good” saw deeply into life. People are afraid to face how great a part of life is dependent on luck. It’s scary to think so much is out of one’s control. There are moments in a match when the ball hits the top of the net, and for a split second, it can either go forward or fall back. With a little luck, it goes forward, and you win. Or maybe it doesn’t, and you lose.

And this applies to literature as well. If athletes have the Sports Illustrated cover jinx and directors have the New Yorker curse, novelists have second-novel syndrome: the big debut novel followed by a sophomore slump. We like to ascribe all kinds of causal explanations to this—pressure, time constraints, authorial self-indulgence—but most often, it’s just another case of regression to the mean. Luck, as I’ve learned firsthand, plays an enormous role in a book’s publication and reception, and it’s mathematically unsound to expect lightning to strike twice. This is true, most obviously, of a book’s commercial prospects, but also, oddly, of its artistic merits. Luck plays a larger role in a novel’s quality than many of us would like to admit: like ski jumpers and golf players, we benefit from moments of serendipity and inspiration that may never return. Until, of course, we try again.

A writer’s intuition, right or wrong

with 4 comments

Intuition is getting a bad rap these days. As both the book and movie of Moneyball have made clear, the intuition of baseball scouts is about as useful as random chance, and the same might be said of stock pickers, political pundits, and all other supposed sources of insight whose usefulness is rarely put to a rigorous test. Intuition, it seems, is really just another word for blind guessing, at least as far as accuracy is concerned. The recent book Thinking, Fast and Slow, by Nobel Prize-winning psychologist Daniel Kahneman, goes even further, providing countless illustrations of how misleading our intuition can be, and how easily it can be distracted by irrelevant factors. (For example, something as simple as spinning a certain number on a rigged wheel of fortune can influence our estimates of, say, what percentage of United Nations members are African countries. Don’t ask me how or why, but Kahneman’s data speaks for itself.)

And yet it’s hard to give up on intuition entirely. For one thing, it’s faster. I believe it was Julian Jaynes who pointed out that intuition is really just another word for the acceleration of experience: after we’ve been forced to make decisions under similar circumstances a certain number of times, the intermediate logic falls away, and we’re left with what feels like an intuitive response. Play it in slow motion, and all the steps are still there, in infinitesimal form. This kind of intuition strikes me as essentially different from the sort debunked above, and it’s especially useful in the arts, when no amount of statistical analysis can take the place of the small, mysterious judgment calls that every artist makes on a daily basis. In writing, as in everything else, the fundamentals of craft are acquired with difficulty, then gradually internalized, freeing the writer’s conscious mind to deal with unique problems while intuition takes care of the rest. And without such intuitive shortcuts, a long, complex project like a novel would take forever to complete.

Every artist develops this sort of intuition sooner or later, making it possible to skip such intermediate steps. As I’ve noted before, Robert Graves has described it as proleptic or “slantwise” thinking, a form of logic that goes from A to C without pausing for B. All great creative artists have this faculty, and the greater the artist, the more pronounced it becomes. One of the most compelling descriptions of poetic intuition I’ve ever seen comes from John Gardner’s The Art of Fiction, in a brief aside about Shakespeare. Gardner points to the fact that in Hamlet, the normally indecisive prince has no trouble sending Rosencrantz and Guildenstern to their deaths offstage, and with almost no explanation, a detail that strikes some readers as inconsistent. “If pressed,” Gardner writes, “Shakespeare might say that he expects us to recognize that the fox out-foxed is an old motif in literature—he could make up the tiresome details if he had to.” Fair enough. But then Gardner continues:

But the explanation I’ve put in Shakespeare’s mouth is probably not the true one. The truth is very likely that almost without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter…Shakespeare’s instinct told him, “Get back to the business between Hamlet and Claudius,” and, sudden as lightning, he was back.

That intuition, “sudden as lightning,” is what every writer hopes to develop. And while none of us have it to the extent that Shakespeare did, it’s always satisfying to see it flash forth, even in a modest way. Earlier this week, while reading through the final version of City of Exiles, I noticed a place where the momentum of the story seemed to flag. I made a note of this, then moved on. Later that day, I was working on something else entirely when I suddenly realized how to fix the problem, which was just a matter of eliminating or tightening a couple of paragraphs. After making these changes, I read the chapter over again, but this was almost a formality: I knew the revisions would work. There’s no way of objectively measuring this, of course, and there were probably other approaches that would have worked as well or better. But intuition provided one possible solution when I needed it. And without many such moments, right or wrong, I’d never finish a novel at all.

Written by nevalalee

November 23, 2011 at 10:00 am
