Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Daniel Kahneman’

The law of small numbers


If he believes in the law of small numbers, the scientist will have exaggerated confidence in the validity of conclusions based on small samples. To illustrate, suppose he is engaged in studying which of two toys infants will prefer to play with. Of the first five infants studied, four have shown a preference for the same toy. Many a psychologist will feel some confidence at this point, that the null hypothesis of no preference is false. Fortunately, such a conviction is not a sufficient condition for journal publication, although it may do for a book.

Amos Tversky and Daniel Kahneman, “Belief in the Law of Small Numbers”
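To see just how weak that evidence actually is, you can work out the arithmetic directly: under the null hypothesis of no preference, each infant picks either toy with probability one half. Here is a quick sketch in Python (my own illustration, not anything from the paper):

    from math import comb

    # Under the null hypothesis of no preference, each of five infants
    # picks either toy with probability 0.5.
    n, k = 5, 4

    # P(at least 4 of 5 pick one particular toy) = (C(5,4) + C(5,5)) / 2^5
    p_one_toy = sum(comb(n, i) for i in range(k, n + 1)) / 2**n

    # The infants could have converged on either toy, so double it.
    p_same_toy = 2 * p_one_toy

    print(f"P(>= 4 of 5 prefer a given toy):  {p_one_toy:.3f}")   # 0.188
    print(f"P(>= 4 of 5 prefer the same toy): {p_same_toy:.3f}")  # 0.375

An agreement that striking arises by pure chance more than a third of the time, which is exactly why the confidence Tversky and Kahneman describe is misplaced.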

Written by nevalalee

March 3, 2018 at 7:30 am

The A/B Test


In this week’s issue of The New York Times Magazine, there’s a profile of Mark Zuckerberg by Farhad Manjoo, who describes how the founder of Facebook is coming to terms with his role in the world in the aftermath of last year’s election. I find myself thinking about Zuckerberg a lot these days, arguably even more than I use Facebook itself. We just missed overlapping in college, and with one possible exception, which I’ll mention later, he’s the most influential figure to emerge from those ranks in the last two decades. Manjoo depicts him as an intensely private man obliged to walk a fine line in public, leading him to be absurdly cautious about what he says: “When I asked if he had chatted with Obama about the former president’s critique of Facebook, Zuckerberg paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Zuckerberg is trying to figure out what he believes—and how to act—under conditions of enormous scrutiny, but he also has more resources at his disposal than just about anyone else in history. Here’s the passage in the article that stuck with me the most:

The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition, or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth…This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?”

Reading this, I began to reflect on how rarely we actually test our intuitions. I’ve spoken a lot on this blog about the role of intuitive thinking in the arts and sciences, mostly because it doesn’t get the emphasis it deserves, but there’s also no guarantee that intuition will steer us in the right direction. The psychologist Daniel Kahneman has devoted his career to showing how we tend to overvalue our gut reactions, particularly if we’ve been fortunate enough to be right in the past, and the study of human irrationality has become a rich avenue of research in the social sciences, which are often undermined by poor hunches of their own. It may not even be a matter of right or wrong. An intuitive choice may be better or worse than the alternative, but for the most part, we’ll never know. One of the quirks of Silicon Valley culture is that it claims to base everything on raw data, but it’s often in the service of notions that are outlandish, untested, and easy to misrepresent. Facebook comes closer than any company in existence to the ideal of an endless A/B test, in which the user base is randomly divided into two or more groups to see which approaches are the most effective. It’s the best lab ever developed for testing our hunches about human behavior. (Most controversially, Facebook modified the news feeds of hundreds of thousands of users to adjust the number of positive or negative posts, in order to gauge the emotional impact, and it has conducted similar tests on voter turnout.) And it shouldn’t surprise us if many of our intuitions turn out to be mistaken. If anything, we should expect them to be right about half the time—and if we can nudge that percentage just a little bit upward, in theory, it should give us a significant competitive advantage.
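The mechanics of such a test, at least, are simple. Here is a minimal sketch of how the bucketing might work (my own toy illustration in Python, with invented names, not Facebook’s actual system): hashing each user into a group keeps the assignment stable across visits while remaining effectively random across the population.

    import hashlib

    def assign_group(user_id: str, experiment: str, n_groups: int = 2) -> int:
        """Deterministically bucket a user into one of n_groups.

        Hashing (experiment, user_id) yields a stable but effectively
        random assignment, so each user always sees the same variant.
        """
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return int(digest, 16) % n_groups

    # Group 0 sees the control news feed; group 1 sees the variant.
    print(assign_group("user_12345", "feed_ranking_test"))

Measure the outcome you care about in each group, test the difference for significance, and the hunch either survives the data or it doesn’t.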

So what good is intuition, anyway? I like to start with William Goldman’s story about the Broadway producer George Abbott, who once passed a choreographer holding his head in his hands while the dancers stood around doing nothing. When Abbott asked what was wrong, the choreographer said that he couldn’t figure out what to do next. Abbott shot back: “Well, have them do something! That way we’ll have something to change.” Intuition, as I’ve argued before, is mostly about taking you from zero ideas to one idea, which you can then start to refine. John W. Campbell makes much the same argument in what might be his single best editorial, “The Value of Panic,” which begins with a maxim from the Harvard professor Wayne Batteau: “In total ignorance, try anything. Then you won’t be so ignorant.” Campbell argues that this provides an evolutionary rationale for panic, in which an animal acts “in a manner entirely different from the normal behavior patterns of the organism.” He continues:

Given: An organism with N characteristic behavior modes available.

Given: An environmental situation which cannot be solved by any of the N available behavior modes, but which must be solved immediately if the organism is to survive.

Logical conclusion: The organism will inevitably die. But…if we introduce Panic, allowing the organism to generate a purely random behavior mode not a member of the N modes characteristically available?

Campbell concludes: “When the probability of survival is zero on the basis of all known factors—it’s time to throw in an unknown.” In extreme situations, the result is panic; under less intense circumstances, it’s a blind hunch. You can even see them as points on a spectrum, the purpose of which is to provide us with a random action or idea that can then be revised into something better, assuming that we survive for long enough. But sometimes the animal just gets eaten.

The idea of refinement, revision, or testing is inseparable from intuition, and Zuckerberg has been granted the most powerful tool imaginable for asking hard questions and getting quantifiable answers. What he does with it is another matter entirely. But it’s also worth looking at his only peer from college who could conceivably challenge him in terms of global influence. On paper, Mark Zuckerberg and Jared Kushner have remarkable similarities. Both are young Jewish men—although Kushner is more observant—who were born less than four years and sixty miles apart. Kushner, whose acceptance to Harvard was so manifestly the result of his family’s wealth that it became a case study in a book on the subject, was a member of the final clubs that Zuckerberg badly wanted to join, or so Aaron Sorkin would have us believe. Both ended up as unlikely media magnates of a very different kind: Kushner, like Charles Foster Kane, took over a New York newspaper from a man named Carter. Yet their approaches to their newfound positions couldn’t be more different. Kushner has been called “a shadow secretary of state” whose portfolio includes Mexico, China, the Middle East, and the reorganization of the federal government, but it feels like one long improvisation, on the apparent assumption that he can wing it and succeed where so many others have failed. As Bruce Bartlett writes in the New York Times, without a staff, Kushner “is just a dilettante meddling in matters he lacks the depth or the resources to grasp,” and we may not have a chance to recover if his intuitions are wrong. In other words, he resembles his father-in-law, as Frank Bruni notes:

I’m told by insiders that when Trump’s long-shot campaign led to victory, he and Kushner became convinced not only that they’d tapped into something that everybody was missing about America, but that they’d tapped into something that everybody was missing about the two of them.

Zuckerberg and Kushner’s lives ran roughly in parallel for a long time, but now they’re diverging at a point at which they almost seem to be offering us two alternate versions of the future, like an A/B test with only one possible outcome. Neither is wholly positive, but that doesn’t make the choice any less stark. And if you think this sounds farfetched, bookmark this post, and read it again in about six years.

The second system effect


[Image: Kevin Costner in The Postman]

Note: This post originally appeared, in a slightly different form, on July 14, 2015.

Why are second novels or movies so consistently underwhelming? Even if you account for such variables as heightened pressure, compressed turnaround time, and unrealistic expectations, the track record for works of art from The Postman to the second season of True Detective suggests that the sophomore slump, whatever it reflects, is real. For the psychologist Daniel Kahneman in Thinking, Fast and Slow, it’s a case of regression to the mean: any artistic breakthrough is by definition an outlier, since only exceptional efforts survive to come to light at all, and later attempts revert to the artist’s natural level of ability. There’s also a sense in which a massive success removes many of the constraints that allowed for good work to happen in the first place. By now, it’s a cliché to note that the late installments in a popular series, from Harry Potter to A Song of Ice and Fire, feel like they haven’t been edited. It’s certainly true that authors who have sold a million copies have greater leverage when it comes to disregarding editorial notes, if they even receive them at all. Editors are as human as anyone else, and since commercial outcomes are such a crapshoot, you can’t blame them for not wanting to get in the way of a good thing. It didn’t hurt Rowling or Martin, but in the case of, say, the later novels of Thomas Harris, you could make a case that a little more editorial control might have been nice. And I’ve noted elsewhere that this may have more to do with the need to schedule blockbuster novels for a release date long in advance, whether they’re ready or not.
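Kahneman’s point is easy to demonstrate with a toy model. In the sketch below (my own illustration in Python, with invented numbers), every artist has a fixed level of talent, every work adds an independent dose of luck, and only the most successful debuts come to light:

    import random

    random.seed(0)

    # Toy model: observed success = stable talent + one-off luck.
    talents = [random.gauss(0, 1) for _ in range(10_000)]
    debuts = [t + random.gauss(0, 1) for t in talents]

    # Only exceptional efforts survive to come to light: the top 5% of debuts.
    cutoff = sorted(debuts)[-500]
    breakouts = [(t, d) for t, d in zip(talents, debuts) if d >= cutoff]

    # The same artists' second works: identical talent, fresh luck.
    seconds = [t + random.gauss(0, 1) for t, _ in breakouts]

    print(f"mean breakout debut: {sum(d for _, d in breakouts) / len(breakouts):.2f}")
    print(f"mean second effort:  {sum(seconds) / len(seconds):.2f}")  # roughly half

The second efforts land at about half the level of the debuts, not because the artists got lazy, but because the luck that inflated the breakout doesn’t repeat.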

Yet there’s also a third, even more plausible explanation, which I first encountered in The Mythical Man-Month by Frederick P. Brooks, Jr., a seminal work on software engineering. Writing about what he calls “the second system effect,” Brooks notes:

An architect’s first work is apt to be spare and clean. He knows he doesn’t know what he’s doing, so he does it carefully and with great restraint.

As he designs the first work, frill after frill and embellishment after embellishment occur to him. These get stored away to be used “next time.” Sooner or later the first system is finished, and the architect, with firm confidence and a demonstrated mastery of that class of systems, is ready to build a second system.

The second is the most dangerous system a man ever designs. When he does his third and later ones, his prior experiences will confirm each other as to the general characteristics of such systems, and their differences will identify those parts of his experience that are particular and not generalizable.

[Image: Francis Ford Coppola]

Brooks concludes: “The general tendency is to over-design the second system, using all the ideas and frills that were cautiously sidetracked on the first one.” And it’s startling how well this statement describes so many sophomore efforts in film and literature. It’s the difference between Easy Rider and The Last Movie, Sex, Lies, and Videotape and Kafka, Donnie Darko and Southland Tales, in which a spare, disciplined freshman work is succeeded by a movie that contains everything up to and including the kitchen sink. When you first try your hand at any kind of storytelling, you discover that the natural arc of the project tends toward removal and subtraction: you cut, pare back, and streamline, either because of your natural caution or because you don’t have the resources you need. Every edit is necessary, but it also carries a charge of regret. If your constraints are removed for your second project, this only adds fuel to an artist’s natural tendency to overindulge. And while the result may be a likable mess—a lot of us prefer Mallrats to Clerks—it rarely exhibits the qualities that first drew us to an artist’s work. (Even in movies made by committee, there’s an assumption that viewers want a bigger, louder, and busier version of what worked the first time around, which leads to so much of the narrative inflation that we see in blockbuster sequels.)

So what’s an artist to do? Brooks has some advice that everyone trying to follow up his or her first effort should keep in mind:

How does the architect avoid the second-system effect? Well, obviously he can’t skip his second system. But he can be conscious of the peculiar hazards of that system, and exert extra self-discipline to avoid functional ornamentation and to avoid extrapolation of functions that are obviated by changes in assumptions and purposes.

Translated into artistic terms, this means nothing more or less than treating a second attempt as exactly as hazardous as it really is. If anything, the track record of sophomore efforts should make writers even more aware of those risks, and even more relentless about asking the hard questions after a big success has made it possible to stop. When Francis Ford Coppola followed The Godfather with The Conversation, it was both a regathering and an act of discipline—in a movie largely about craft and constraints—that enabled the grand gestures to come. Coppola certainly wasn’t beyond insane acts of overreaching, but in this case, his instincts were sound. And I have a feeling that a lot of writers and filmmakers, in hindsight, wish that they could have skipped their second system and gone straight to their third.

Written by nevalalee

November 25, 2016 at 9:00 am

The oilman’s divorce, or the problem of luck


[Image: Harold Hamm]

Money may not be everything, but it’s certainly a clarifying force, especially when seventeen billion dollars are on the line. Right now, the news is filled with coverage of what may end up being the most expensive divorce in the history of the world, a courtroom battle currently being waged between Oklahoma oilman Harold Hamm and his wife Sue Ann. This kind of story possesses a certain guilty fascination of its own—if Ms. Hamm gets even a quarter of her husband’s fortune, she’ll be richer than Oprah—but it also raises some unexpected philosophical questions. At the moment, Hamm is claiming that his stake in Continental Resources, the company he founded almost five decades ago, was built entirely on luck, since the discovery of oil deposits depends on so many factors that are fundamentally beyond human control. Ms. Hamm, by contrast, says that it was all thanks to her husband’s skill and hard work. Which leaves us with the odd prospect of one of the world’s wealthiest men disclaiming all responsibility for his own success, while his soon-to-be ex-wife insists that he isn’t giving himself enough credit.

Of course, it isn’t hard to see why they’d stake out their particular positions. Under laws governing equitable division, assets that were actively earned over the course of the marriage are split between the two parties, while passive assets, or those in which chance played a role, remain undivided. The underlying presumption is that a marriage is a partnership in which both parties collectively participate, and the work of one can’t be separated from the presence of the other—an argument that breaks down when luck is involved. If Sue Ann can successfully argue that Harold’s fortune was derived from his own efforts, she’s entitled to up to half of everything; if Harold makes the case that it’s due to luck, he keeps the lion’s share. In the end, it’s likely that the decision will fall somewhere in the middle, and I’m very curious to hear Judge Howard Haralson’s final ruling on the subject. Because while Harold Hamm is clearly pressing his case beyond what he really believes about his own abilities—“He must be squirming inside,” as a psychologist notes in the NBC News story—he isn’t entirely wrong. And it took only a multibillion-dollar divorce suit for him to admit it.

[Image: Daniel Kahneman]

We all tend to underrate the role that luck plays in human endeavor, whether successful or otherwise, especially when we’ve done well for ourselves. The truth is probably closer to what Daniel Kahneman sets forth in Thinking, Fast and Slow, using what he says is his favorite equation:

Success = talent + luck
Great success = a little more talent + a lot of luck

That’s particularly true of something like oil and gas, which remains a punishingly uncertain industry even with modern geological and technological expertise. There’s a reason why more than a few wildcatters used divining rods to locate petroleum deposits: even with considerable skill and experience, a lot of it comes down to luck, persistence, and time. David Mamet likes to say that everybody gets a break in Hollywood after twenty-five years, except that some get it at the beginning and others at the end, and the important thing is to be the one who stays after everyone else has gone home. And wildcatting, along with so much else, works much the same way.

So while it’s easy to dismiss Hamm’s argument as disingenuous—his success in building one of the largest energy companies in the country can’t be entirely divorced from his own actions and decisions—I’d prefer to think of it as a moment of rare candor, however little he might believe in it himself. As John Bogle likes to point out, choosing an investment fund manager based on past performance is a little like staking money on the winner of a coin-flipping contest: given enough participants, someone is bound to get ten heads in a row, but that doesn’t mean you should bet on the streak continuing. That’s as true of running a company as of managing a mutual fund. What seems like a smart decision may really have been a lucky break, and it’s only in retrospect, as we construct a narrative to make sense of what happened, that we emerge with the myth of the business visionary. If Kahneman’s formula holds true, as it does in most things in life, Harold Hamm might well be entitled to keep most of what he earned. Admitting this would involve giving up many of our most cherished notions about success. But with seventeen billion dollars at stake, even one of the richest men in the world might have to concede that it’s better to be lucky than good.
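A postscript on Bogle’s arithmetic: a run of ten heads from a fair coin has odds of one in 1,024, so any reasonably large field of contestants will reliably produce a few streaks by chance alone. Here is a throwaway simulation, purely my own illustration:

    import random

    random.seed(1)

    # Five thousand contestants each flip a fair coin ten times.
    contestants = 5_000
    streaks = sum(
        all(random.random() < 0.5 for _ in range(10))
        for _ in range(contestants)
    )
    print(f"{streaks} contestants flipped ten heads in a row")  # expect about five

No skill is required; the streaks are baked into the size of the sample.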

Written by nevalalee

August 27, 2014 at 8:58 am

Transformed by print


[Image: A page from my first draft]

Somewhere in his useful and opinionated book Trial and Error, the legendary pulp writer Jack Woodford says that if you feel that your work isn’t as good as the fiction you see in stores, there’s a simple test to see if this is actually true. Take a page from a recent novel you admire—or one that has seen big sales or critical praise—and type the whole thing out unchanged. When you see the words in your own typewriter or on your own computer screen, stripped of their superficial prettiness, they suddenly seem a lot less impressive. There’s something about professional typesetting that elevates even the sloppiest prose: it attains a kind of dignity and solidity that can be hard to see in its unpublished form. It isn’t just a story now; it’s an art object. And restoring it to the malleable, unpolished medium of a manuscript page often reveals how arbitrary the author’s choices really were, just as we tend to be hard on our own work because our rough drafts don’t look as nice as the stories we see in print.

There’s something undeniably mysterious about how visual cues affect the way we think about the words we’re reading, whether they’re our own or others’. Daniel Kahneman has written about how we tend to read texts more critically when they’re printed in unattractive fonts, and Errol Morris recently ran an online experiment to test this by asking readers for their opinions about a short written statement, without revealing that some saw it in Baskerville and others in Comic Sans. (Significantly more of those who read it in Baskerville thought the argument was persuasive, while those who saw it in Comic Sans were less impressed, presumably because they were too busy clawing out their eyes.) Kindle, as in so many other respects, is the great leveler: it strips books of their protective sheen and forces us to evaluate them on their own merits. And I’d be curious to see a study on how the average review varies between those who read a novel in print and those who saw it in electronic form.

"Arkady arrived at the museum at ten..."

This is also why I can’t bear to read my own manuscripts in anything other than Times New Roman, which is the font in which they were originally composed. When I’m writing a story, I’m primarily thinking about the content, yes, but I’m also consciously shaping how the text appears on the screen. As I’ve mentioned before, I’ve acquired a lot of odd tics and aversions from years spent staring at my own words on a computer monitor, and I’ve evolved just as many strategies for coping. I don’t like the look of a ragged right margin, for instance, so all my manuscripts are justified and hyphenated, at least until they go out to readers. I generally prefer it when the concluding line of a paragraph ends somewhere on the left half of the page, and I’ll often rewrite the text accordingly. And I like my short lines of dialogue to be exactly one page width long. All this disappears, of course, the second the manuscript is typeset, but as a way of maintaining my sanity throughout the writing process, these rituals play an important role.

And I don’t seem to mind their absence when I finally see my work in print, which introduces another level of detachment: these words don’t look like mine anymore, but someone else’s. (There are occasional happy exceptions: by sheer accident, the line widths in The Year’s Best Science Fiction Vol. 29 happen to exactly match the ones I use at home, so “The Boneless One” looks pretty much like it did on my computer, down to the shape of the paragraphs.) Last week, I finally received an advance copy of my novel Eternal Empire, hot off the presses, and I was struck by how little it felt like a book I’d written. Part of this is because it’s been almost a year since I finished the first draft, I’ve been working on unrelated projects since then, and a lot has happened in the meantime. But there’s also something about the cold permanence of the printed page that keeps me at arm’s length from my work. Once a story can no longer be changed, it ceases to be quite as alive as it once was. It’s still special. But it’s no longer a part of you.

Written by nevalalee

August 12, 2013 at 8:35 am

A few holiday thoughts on happiness


[Image: Daniel Kahneman]

The use of time is one of the areas of life over which people have some control. Few individuals can will themselves to have a sunnier disposition, but some may be able to arrange their lives to spend less of their day commuting, and more time doing things they enjoy with people they like…

Not surprisingly, a headache will make a person miserable, and the second best predictor of the feelings of a day is whether a person did or did not have contacts with friends or relatives. It is only a slight exaggeration to say that happiness is the experience of spending time with people you love and who love you.

Daniel Kahneman, Thinking, Fast and Slow

Written by nevalalee

December 25, 2012 at 9:50 am

Blurbed for your protection


A few days ago, my wife was opening a new package of vitamins when she noticed that the familiar foil seal over the bottle’s mouth was missing. We’d bought the vitamins a while ago, and it was possible that we’d just opened them earlier without remembering it, but that didn’t seem likely—and the mouth of the bottle was perfectly clean, which implied that the seal had never been there in the first place. After a bit of discussion, we decided to throw the bottle away. The odds of there being anything wrong with it were vanishingly small, but there didn’t seem to be any point in taking the chance. That’s the function of a protective seal: when it’s there, you don’t think about it—unless you’re annoyed by how hard it is to pry open—but when it’s missing, it immediately sets off alarm bells. And while this may not seem to have much to do with writing or publishing, it actually gets at an important truth about how books are packaged and presented, and, in particular, about the misunderstood, often derided role of the humble blurb.

Recently, there’s been a lot of discussion about the problem of fake online reviews, with some authors paying “critics” outright to post flattering quotes on sites like Amazon. The blurbs that we see on most novels have often been lumped into the same category, based on the observation, which is certainly correct, that they wouldn’t be there at all if they weren’t complimentary. Yet the parallel isn’t quite exact. It’s true that most blurbs end up on the cover of a book through some kind of relationship between blurber and subject: in many cases, they’re authors whom the writer or editor knows personally, or clients of the same publishing house who can be approached through a friendly intermediary. More rarely, a wonderful blurb is obtained from a respected author through luck or tenacity alone. And in my own limited experience, such blurbs are at least sincere: writers know that they have little to gain by having their name attached to a laudatory quote on a bad book, and if they don’t care for it or aren’t interested, they’re more likely just not to respond.

That said, no one should mistake blurbs for purely objective reviews—but that doesn’t mean readers should ignore them, either. A browser looking at a new release in a bookstore, especially from an unknown author, is generally operating with very limited information, which means that he or she needs to rely on a few simple heuristics. Does the book look like it’s worth reading? Is it a novel that the publisher believes in? The cover art, the jacket copy, and even the font are important clues, although they’re often most useful as negative indicators. When I see a badly designed book with an ugly typeface, it doesn’t tell me anything about the content, but it implies that if the publisher didn’t, or couldn’t, take such superficial elements seriously, the text itself isn’t likely to be any better. (Obviously, this isn’t always true. And we should be careful about drawing any conclusions from the opposite case, since Daniel Kahneman has shown that a nice font may make us less aware of an author’s sloppy writing or reasoning.)

The same principle applies to blurbs. This isn’t to say that awful books can’t receive glowing blurbs from other authors—it happens all the time. But the absence of at least one decent blurb should make us cautious, if nothing else. Blurbs may be largely meaningless, but they do take time and effort to obtain, and like cover art, design, and other cosmetic elements, they serve as a sort of rough index to the publisher’s commitment to the book itself. In some ways, they’re best understood as an element of cover design: a novel without a blurb looks a little naked, at least to my eyes, and what it says is ultimately less important than the fact that it’s there at all. As far as fiction is concerned, there are few authors, Nicole Krauss included, whose testimonials will tempt me to buy a book on their own, but their absence will often warn me away. That’s the nature of a blurb: it’s less a stamp of approval than a protective seal that tells us, at the very least, that there’s nothing outright toxic inside.

Written by nevalalee

November 14, 2012 at 9:44 am
