Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Malcolm Gladwell’

The Potion of Circe

with one comment

Daniel Ellsberg

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 29, 2016.

In 1968, Daniel Ellsberg, the military analyst who would later become famous for leaking the Pentagon Papers, had a meeting with Henry Kissinger. At the time, Kissinger had spent most of his career as a consultant and an academic, and he was about to enter government service—as the National Security Advisor to Richard Nixon—for the first time. (Their conversation is described in Ellsberg’s memoir Secrets, and I owe my own discovery of it to a surprisingly fine article on Ellsberg and Edward Snowden by Malcolm Gladwell in The New Yorker.) Ellsberg, who had been brought in for a discussion about the Vietnam War, had a word of advice for Kissinger. He said:

Henry, there’s something I would like to tell you, for what it’s worth, something I wish I had been told years ago. You’ve been a consultant for a long time, and you’ve dealt a great deal with top secret information. But you’re about to receive a whole slew of special clearances, maybe fifteen or twenty of them, that are higher than top secret…I have a pretty good sense of what the effects of receiving these clearances are on a person who didn’t previously know they even existed. And the effects of reading the information that they will make available to you.

At first, Ellsberg said, Kissinger would feel “exhilarated” at having access to so much information. But he cautioned: 

Second, almost as fast, you will feel like a fool for having studied, written, talked about these subjects, criticized and analyzed decisions made by presidents for years without having known of the existence of all this information, which presidents and others had and you didn’t, and which must have influenced their decisions in ways you couldn’t even guess…You will feel like a fool, and that will last for about two weeks. Then, after you’ve started reading all this daily intelligence input and become used to using what amounts to whole libraries of hidden information…you will forget there ever was a time when you didn’t have it, and you’ll be aware only of the fact that you have it now and most others don’t…and that all those other people are fools.

Over a longer period of time—not too long, but a matter of two or three years—you’ll eventually become aware of the limitations of this information…In the meantime it will have become very hard for you to learn from anybody who doesn’t have these clearances. Because you’ll be thinking as you listen to them: ‘What would this man be telling me if he knew what I know?’

Henry Kissinger and Richard Nixon

After a while, Ellsberg concluded, this “mental exercise” would become so tortuous that Kissinger might cease to pay attention altogether: “The danger is, you’ll become something like a moron. You’ll become incapable of learning from most people in the world, no matter how much experience they may have in their particular areas that may be much greater than yours.” Ellsberg compared this sort of secret information to the potion of Circe, which turned Odysseus’s men into swine and left them incapable of working with other humans. And it’s a warning worth bearing in mind even for those of us who don’t have access to classified intelligence. Ellsberg’s admonition is really about distinguishing between raw information—which can be acquired with nothing but patience, money, or the right clearances—and the more elusive quality of insight. It applies to everyone who has ever wound up with more facts on a specific subject than anybody else he or she knows, which is just as true of writers of theses, research papers, and works of nonfiction as it is of government advisors. In researching my book Astounding, for instance, I’ve seen thousands of pages of letters and other documents that very few other living people have studied. They aren’t classified, but they’re hard to obtain and inconvenient to read, and I’m reasonably sure that I’m the only person in recent years who has tried to absorb them in their entirety. But a lot of other people could have done it. I didn’t have to be smart: I just had to be willing to reach out to the right librarians, sit in a chair for long periods, stare at a microfilm reader, and take decent notes.

There’s something to be said, of course, for being the one who actually goes out and does it. And there’s a sense in which this kind of drudgery is an indispensable precursor to insight: you’re more likely to come up with something worthwhile if you’ve mined the ore yourself, and there’s a big difference between taking the time to unearth it personally and having it handed to you. Reading a hundred grainy pages to discover the one fact you need isn’t the same thing as finding it on Wikipedia. It’s necessary, if not sufficient, and as Ellsberg notes, the “moron” stage is one that everyone needs to pass through in order to emerge on the other side. (A lot of us are also feeling nostalgic these days for the kind of government moron whom Ellsberg describes, who at least respected the information he had, rather than ignoring or dismissing any data that didn’t suit his political needs.) But it’s important to draw a line between the kind of expertise that accumulates steadily as a function of time—which any good drudge can acquire—and the kind that builds up erratically through thought and experience. It’s obvious in other people, but it can be hard to see it in ourselves. For long stretches, we’ll have acquired just enough knowledge to be dangerous, and we can only hope that we won’t do any lasting damage. And even if we’ve been warned, it’s a lesson that has to be learned firsthand. As Ellsberg ends the story: “Kissinger hadn’t interrupted this long warning…He seemed to understand that it was heartfelt, and he didn’t take it as patronizing, as I’d feared. But I knew it was too soon for him to appreciate fully what I was saying. He didn’t have the clearances yet.”

Written by nevalalee

March 1, 2018 at 9:00 am

Blivet or not

with 2 comments

In the June 1964 issue of Analog, which first went on sale on May 7, readers were treated to the drawing reproduced above, along with a note from editor John W. Campbell:

This outrageous piece of draftsmanship evidently escaped from the Finagle & Diddle Engineering works. If the contributor of this item—sent anonymously for some reason—will identify himself, we will happily pay $10 (ten bucks) or a two-year subscription to Analog.

A few months later, in the October issue, Campbell provided an update, although the source of the image proved frustratingly elusive:

It’s impossible to publish even a small fraction of the letters that outrageous piece of draftsmanship evoked. There were well over one hundred fifty letters on that one item alone—and while we have long been aware of the unusually high level of intelligence of Analog’s readership, the high level of honesty was a new and pleasant discovery. Not one of all those letters claimed to be the original contributor, or demanded the ten dollars!

Campbell added that readers had directed his attention to other instances of the illusion, which he said was sometimes called a “blivit” [sic], in recent issues of such publications as Road & Track, QST, The SAE Journal, “various and sundry house organs,” and textbooks on topology and psychology. And in December, he printed a letter from Edward G. Robles, Jr. of Sacramento, California, who claimed that the image had originated at the Jet Propulsion Laboratory in Pasadena.

To the best of my knowledge, the earliest verified appearance in print of the illusion most commonly known as the “blivet”—although there are anecdotal reports, as we’ll see shortly, from decades earlier—was in the March 23, 1964 issue of Aviation Week & Space Technology, in an advertisement for California Technical Industries, a company based in Belmont. The ad, a detail of which is pictured below, caught the eye of Donald Schuster, a professor of psychology at Iowa State University, who wrote in a short item in the American Journal of Psychology:

In my opinion, it is a matter of a new type of ambiguous figure. Unlike other ambiguous drawings and geometric figures…it is the shift in the optical focal point which plays a role in perception and interpretation here. If the observer focuses on the left-hand side of the figure at reading distance, he sees three legs, and the right-hand side remains blurred and fuzzy; if he focuses on the right-hand side, he sees a U-shaped object, like a chain joint/horizontal brace. Only if he looks at the middle or slowly allows his view to pass over the figure does he come to realize that he is looking at an “impossible object.”

The following year, it was featured on the March 1965 cover of Mad, which referred to it as “The Mad Poiuyt.” It inspired a flood of replies pointing out that it had previously appeared in such publications as Engineering Digest, The Airman, The Red Rag, The Society of Automotive Engineers Journal, Popular Mechanics, and the letters column of the July 1964 issue of Popular Science, from a reader who said that he first saw it in The Circulator, published by the Honeywell Regulator Company in Minneapolis. (I owe most of this information to David Singmaster’s Sources in Recreational Mathematics, an archived version of which can be found here.) It also made one last appearance in Analog, in February 1969, in which Campbell discussed the phenomenon of endlessly ascending tones, illustrated by a picture of a blivet in the form of a tuning fork.

The invention of the blivet has been convincingly attributed to the Swedish graphic artist Oscar Reutersvärd, the originator of many other impossible figures, who asserted in a letter quoted in Bruno Ernst’s The Eye Beguiled that he had drawn “figures of the devil’s fork type” in Stockholm in the thirties. For its explosion in popularity in the sixties, however, we can look a little closer to home. In the October issue of Analog that I mentioned above, Campbell printed a letter from James E. Tunnell of Industrial Camera, based in Oakland, California, which featured a blivet on its company letterhead. Tunnell wrote:

While a student in grade school some twenty years ago, I saw for the very first time, in my old red mathematics book, a drawing much as that shown in the upper-left hand corner of this letter, and very much like that in your publication.

In 1952, when we started business, this design was undertaken to serve as a logo. We have used it on our letterhead, on the back of business cards…and on the automobiles we use as you can see on close inspection of the attached photograph.

In the field of higher mathematics, this model is known as a Two-Slot, Mark 4, Blivit—origin unknown—and during the past ten years has gotten into the hands of many organizations through our business dealings with them.

For reasons that I’ll explain in a moment, the italics are mine. And I’ll just note for now that it’s only half an hour by car from Oakland to Belmont, where California Technical Industries was based.

When you put all this information together, an intriguing pattern emerges. The blivet can plausibly be said to have first been drawn in the thirties by Reutersvärd. From there, it migrated into at least one textbook, until it ended up as the logo of Industrial Camera. Various individuals and groups were thereby exposed to it over the next decade until, suddenly, it seemed to be everywhere at once—it showed up in Analog just six weeks after its appearance in Aviation Week, which seems too soon for one instance to have directly inspired the other. In other words, it went viral. And the evidence, while limited, implies that it owed its overnight emergence to many of the criteria that have been proposed for other kinds of social epidemics. It was a “sticky” image that couldn’t be forgotten after the viewer had seen it. After a long gestational period, it took root in an existing community of scientists and engineers with a network of small publications and newsletters in which it could be easily shared. There was also a geographical factor involved, since many of these organizations were based in the Bay Area. (Social epidemics have a curious way of starting in my home state. Malcolm Gladwell’s classic case study in The Tipping Point is Rebecca Wells’s novel Divine Secrets of the Ya-Ya Sisterhood, which first took hold in independent bookstores in Northern California, and I’ve elsewhere pointed to dianetics as an equally quintessential example.) The tipping point here may well have been its appearance in Analog, which was read both by a core audience of professionals and by a larger popular audience, and from there, it moved by a series of logical gradations to Mad, which, in the sixties, was an important gateway through which nerd culture passed invisibly into mass culture. It began as an inside joke, or even a meme, and before long, it became so ubiquitous that it seemed like it had always been there.

If you’re looking for instances of virality, the blivet is a nice one, since it’s such a distinctive image that it doesn’t seem likely to have spread except by contagion. I don’t have the time to dig into it properly, but I offer it up to any academic who wants to trace its origins and dissemination more systematically. After all, it didn’t have just one tipping point, but two. Or maybe three. I guess it depends on how you look at it.

The dianetics epidemic

with 6 comments

Dianetics: The Modern Science of Mental Health

In his bestselling book The Tipping Point, Malcolm Gladwell devotes several pages to a discussion of the breakout success of the novel Divine Secrets of the Ya-Ya Sisterhood. After its initial release in 1996, it sold reasonably well in hardcover, receiving “a smattering of reviews,” but it became an explosive phenomenon in paperback, thanks primarily to what Gladwell calls “the critical role that groups play in social epidemics.” He writes:

The first bestseller list on which Ya-Ya Sisterhood appeared was the Northern California Independent Bookseller’s list. Northern California…was where seven hundred and eight hundred people first began showing up at [Rebecca Wells’s] readings. It was where the Ya-Ya epidemic began. Why? Because…the San Francisco area is home to one of the country’s strongest book club cultures, and from the beginning Ya-Ya was what publishers refer to as a “book club book.” It was the kind of emotionally sophisticated, character-driven, multilayered novel that invites reflection and discussion, and book groups were flocking to it. The groups of women who were coming to Wells’s readings were members of reading groups, and they were buying extra copies not just for family and friends but for other members of the group. And because Ya-Ya was being talked about and read in groups, the book itself became that much stickier. It’s easier to remember and appreciate something, after all, if you discuss it for two hours with your best friends. It becomes a social experience, an object of conversation. Ya-Ya’s roots in book group culture tipped it into a larger word-of-mouth epidemic.

You could say much the same thing about a very different book that became popular in California nearly five decades earlier. Scientology has exhibited an unexpected degree of staying power among a relatively small number of followers, but Dianetics: The Modern Science of Mental Health, the work that made L. Ron Hubbard famous, was a textbook case of a viral phenomenon. Just three months elapsed between the book’s publication on May 9, 1950 and Hubbard’s climactic rally at the Shrine Auditorium on August 10, and its greatest impact on the wider culture occurred over a period of less than a year. And its dramatic spread and decline had all the hallmarks of virality. In the definitive Hubbard biography Bare-Faced Messiah, Russell Miller writes:

For the first few days after publication of Dianetics: The Modern Science of Mental Health, it appeared as if the publisher’s caution about the book’s prospects had been entirely justified. Early indications were that it had aroused little interest; certainly it was ignored by most reviewers. But suddenly, towards the end of May, the line on the sales graph at the New York offices of Hermitage House took a steep upturn.

By midsummer, it was selling a thousand copies a day, and by late fall, over seven hundred dianetics clubs had been established across the country. As Miller writes: “Dianetics became, virtually overnight, a national ‘craze’ somewhat akin to the canasta marathons and pyramid clubs that had briefly flourished in the hysteria of postwar America.”

Divine Secrets of the Ya-Ya Sisterhood

The result was a quintessential social epidemic, and I’m a little surprised that Gladwell, who is so hungry for case studies, has never mentioned it. The book itself was “sticky,” with its promise of a new science of mental health that could be used by anyone and that got results every time. Like Ya-Ya, it took root in an existing group—in this case, the science fiction community, which was the natural audience for its debut in the pages of Astounding. Just as the ideal book club selection is one that inspires conversations, dianetics was a shared experience: in order to be audited, you needed to involve at least one other person. Auditing, as the therapy was originally presented, seemed so easy that anyone could try it, and many saw it as a kind of parlor game. (In his biography of Robert A. Heinlein, William H. Patterson shrewdly compares it to the “Freuding parties” that became popular in Greenwich Village in the twenties.) Even if you didn’t want to be audited yourself, dianetics became such a topic of discussion among fans that summer that you had to read the book to be a part of it. It also benefited from the presence of what Gladwell calls mavens, connectors, and salesmen. John W. Campbell was the ultimate maven, an information broker who, as one of Gladwell’s sources puts it, “wants to solve other people’s problems, generally by solving his own.” The connectors included prominent members of the fan community, notably A.E. van Vogt, who ended up running the Los Angeles foundation, and Forrest Ackerman, Hubbard’s agent and “the number one fan.” And the salesman was Hubbard himself, who threw himself into the book’s promotion on the West Coast. As Campbell wrote admiringly to Heinlein: “When Ron wants to, he can put on a personality that would be a confidence man’s delight—persuasive, gentle, intimately friendly. The perfect bedside manner, actually.”

In all epidemics, geography plays a crucial role, and in the case of dianetics, it had profound consequences on individual careers. One of Campbell’s priorities was to sell the therapy to his top writers, much as the Church of Scientology later reached out to movie stars, and the single greatest predictor of how an author would respond was his proximity to the centers of fan culture. Two of the most important converts were van Vogt, who was in Los Angeles, and Theodore Sturgeon, who lived in New York, where he was audited by Campbell himself. Isaac Asimov, by contrast, had moved from Manhattan to Boston just the year before, and Heinlein, fascinatingly, had left Hollywood, where he had been working on the film Destination Moon, in February of 1950. Heinlein was intrigued by dianetics, but because he was in Colorado Springs with his wife Ginny, who refused to have anything to do with it, he was unable to find an auditing partner. And it’s worth wondering what might have ensued if he had remained in Southern California for another six months. (Such accidents of place and time can have significant aftereffects. Van Vogt had moved from the Ottawa area to Los Angeles in 1944, and his involvement with dianetics took him out of writing for the better part of a decade, at the very moment when science fiction was breaking into the culture as a whole. His absence during this critical period, which made celebrities out of Heinlein and Asimov, feels like a big part of the reason why van Vogt has mostly disappeared from the popular consciousness. And it might never have happened if he had stayed in Canada.) The following year, dianetics as a movement fizzled out, due largely to Hubbard’s own behavior—although he might also have sensed that it wouldn’t last. But it soon mutated into another form. And before long, Hubbard would begin to spread a few divine secrets of his own.


Malcolm in the Middle

with 2 comments

Malcolm Gladwell

Last week, the journalism blog Our Bad Media accused the author Malcolm Gladwell of lapses in reporting that it alleged fell just short of plagiarism. In multiple instances, Gladwell took details in his pieces for The New Yorker, without attribution, from sources that were the only possible places where such information could have been obtained. For instance, an anecdote about the construction of the Troy-Greenfield railroad was based closely on an academic article by the historian John Sawyer, which isn’t readily available online, and which includes facts that appear nowhere else. Gladwell doesn’t mention Sawyer anywhere. And while it’s hard to make a case that any of this amounts to plagiarism in the strictest sense, it’s undeniably sloppy, as well as a disservice to readers who might want to learn more. In a statement responding to the allegations, New Yorker editor David Remnick wrote:

The issue is not really about Malcolm. And, to be clear, it isn’t about plagiarism. The issue is an ongoing editorial challenge known to writers and editors everywhere—to what extent should a piece of journalism, which doesn’t have the apparatus of academic footnotes, credit secondary sources? It’s an issue that can get complicated when there are many sources with overlapping information. There are cases where the details of an episode have passed into history and are widespread in the literature. There are cases that involve a unique source. We try to make judgments about source attribution with fairness and in good faith. But we don’t always get it right…We sometimes fall short, but our hope is always to give readers and sources the consideration they deserve.

Remnick’s response is interesting on a number of levels, but I’d like to focus on one aspect: the idea that after a certain point, details “have passed into history,” or, to quote Peter Canby, The New Yorker‘s own director of fact checking, a quote or idea can “escape its authorship” after it has been disseminated widely enough. In some cases, there’s no ambiguity over whether a fact has the status of public information; if we want to share a famous story about Immanuel Kant’s work habits, for instance, we don’t necessarily need to trace the quote back to where it first appeared. On the opposite end of the spectrum, we have something like a quotation from a particular interview with a living person, which ought to be attributed to its original source, and which Gladwell has occasionally failed to do. And in the middle, we have a wild gray area of factual information that might be considered common property, but which has only appeared in a limited number of places. Evidently, there’s a threshold—or, if you like, a tipping point—at which a fact or quote has been cited enough to take on a life of its own, and the real question is when that moment takes place.

Ian McEwan

It’s especially complicated in genres like fiction and narrative nonfiction, which, as Remnick notes, lack the scholarly apparatus of more academic writing. A few years ago, Ian McEwan fell into an absurd controversy over details in Atonement that were largely derived from a memoir by the wartime nurse Lucilla Andrews. McEwan credits Andrews in his acknowledgments, and his use of such materials inspired a ringing defense from none other than Thomas Pynchon:

Unless we were actually there, we must turn to people who were, or to letters, contemporary reporting, the encyclopedia, the Internet, until, with luck, at some point, we can begin to make a few things of our own up. To discover in the course of research some engaging detail we know can be put into a story where it will do some good can hardly be classed as a felonious act—it is simply what we do.

You could argue, on a similar level, that assimilating information and presenting it in a readable form is simply what Gladwell does, too. Little if anything that Gladwell writes is based on original research; he’s a popularizer, and a brilliant one, who compiles ideas from other sources and presents them in an attractive package. The result shades into a form of creative writing, rather than straight journalism, and at that point, the attribution of sources indeed starts to feel like a judgment call.

But it also points to a limitation in the kind of writing that Gladwell does so well. As I’ve pointed out in my own discussion of the case of Jonah Lehrer, whose transgressions were significantly more troubling, there’s tremendous pressure on writers like Gladwell—a public figure and a brand name as much as a writer—to produce big ideas on a regular basis. At times, this leads him to spread himself a little too thin; a lot of his recent work consists of him reading a single book and delivering its insights with a Gladwellian twist. At his best, he adds real value as a synthesizer and interpreter, but he’s also been guilty of distorting the underlying material in his efforts to make it digestible. And a great deal of what makes his pieces so seductive lies in the fact that so much of the process has been erased: they come to us as seamless products, ready for a TED talk, that elide the messy work of consolidation and selection. If Gladwell were more open about his sources, he’d be more useful, but also less convincing. Which may be why the tension between disclosure and readability that Remnick describes is so problematic in his case. Gladwell really ought to show his work, but he’s made it this far precisely because he doesn’t.

How is a writer like an entrepreneur?


Ron Popeil

Like most people, I’ll occasionally come up with what I’m convinced is a great invention or business idea. Now that I have a newborn daughter in the house, my brainstorms tend to center around products for babies: for instance, a reliable baby glove that won’t slip off within seconds of being pulled on, leaving my daughter’s sharp nails free to claw at her little face. I’m not particularly tempted to follow up on these ideas, of course, partially because products for babies can be hard to test and market—as an acquaintance of mine recently pointed out, the best idea in the world isn’t worth much after it sends one kid to the emergency room—and because I lack the skills and inclination to develop a business idea into something more. I’ve spent all my time learning how to write, and I suspect that a real entrepreneur would respond to my ideas for a killer baby app with the same impatience with which I regard people who insist that they’re full of great ideas for novels, if only they had the time to write them down.

Writers and entrepreneurs have at least one thing in common: success in either field rarely comes down to one magic idea. Rather, in the latter case, it’s the habit of entrepreneurship itself, and the ability to develop ideas and bring them to completion, that typifies the best startups. Paul Graham, the programmer, essayist, and venture capitalist I’ve quoted here before, likes to say that he’s investing in the personalities of the founders, not in the product they’re currently selling. A recent Vanity Fair piece by Randall Stross on Graham’s Y Combinator, a sort of startup boot camp where teams of young entrepreneurs pitch ideas for funding, points out that a team’s initial concept will often change between the time their application is accepted and the day of their actual interview. “We liked you guys more than the idea,” Graham tells one group at the start of a meeting, and he cautions that a company’s goals will often change as the founders figure out what problem they’re trying to solve. As the article notes:

Graham is much more interested in the founders than in the proposed business idea. When he sees a strong team of founders with the qualities that he believes favor success, he will overlook a weak idea.

Paul Graham

And if a writer often resembles a kind of serial entrepreneur, it's less because his ideas are better than anyone else's—good ideas, as we all know, are cheap—than because he's relentlessly resourceful, and knows what to do with an idea when he sees it. In short, he's like Ron Popeil, the pitchman behind the Ronco Food Dehydrator, Mr. Microphone, and the Pocket Fisherman. As Malcolm Gladwell observes in a famous New Yorker piece, Popeil isn't just a salesman, but an inveterate tinkerer, the kind of man who might "lay awake at night thinking of a way to chop an onion so that the only tears you shed were tears of joy." And because he has the skills not just to come up with an idea, but to package and market it, he's done so again and again. Similarly, as a writer, I have plenty of room for improvement, but if there's one thing I've learned, it's that given a decent idea and sufficient time—a few weeks for a short story, nine months to a year for a novel—I can turn it into a finished manuscript. Whether anyone else will want to buy it is another matter. But as Stephen Sondheim says in another context, it will be a proper song.

Of course, the fact that I had to quit my job to figure out writing to my own satisfaction hints at another point of similarity. As Graham notes elsewhere:

Statistically, if you want to avoid failure, it would seem like the most important thing is to quit your day job. Most founders of failed startups don’t quit their day jobs, and most founders of successful ones do. If startup failure were a disease, the CDC would be issuing bulletins warning people to avoid day jobs.

Yet I wouldn’t say that quitting one’s day job is what makes a successful entrepreneur. More likely, it’s the other way around: true entrepreneurs, like writers, tend to be people who just aren’t happy doing anything else, so it’s only a matter of time before they decide to devote all of their energies to it. I probably could have learned how to write a decent novel while holding down another job on the side—and plenty of other authors have done so—but quitting my job, while not the cause, was certainly an effect of where my life ended up taking me. And although a writer’s life, like an entrepreneur’s, is hardly an easy one, it allows me to say, in Popeil’s enticing words: “But wait, there’s more…”

Written by nevalalee

March 11, 2013 at 9:50 am

The road to mastery

with 5 comments

“Ten thousand hours,” writes Malcolm Gladwell in Outliers, “is the magic number of greatness.” That is, ten thousand hours of hard practice, at minimum, is a necessary prerequisite for success in any field, whether it’s chess, the violin, or even, dare I say it, writing. There’s also the variously attributed but widely accepted rule that a writer needs to crank out a million words, over roughly ten years, before achieving a basic level of technical competence. Both of these numbers are, obviously, sort of bogus—many people will require more time, a few much less. But they’re also useful. Ultimately, the underlying message in both cases is the same: mastery in any field takes years of commitment. And if you need some kind of number to guide you on your way, like Dumbo’s magic feather, that’s fine.

Because the only real path to mastery is staying in the game. Terry Rossio, on his very useful Wordplay site, makes a similar point, noting that when he was just starting out as a writer, he realized that anyone who spent ten years at a job—”grocery clerk, college professor, machinist, airline pilot”—had no choice but to become an expert at it. He concludes:

This insight freed me from the fear of picking a so-called “impossible” job. I could pick any field I wanted, free of intimidation, because it was guaranteed I would become an expert…if I was willing to stick to it for ten years. So I picked the job I really wanted deep in my heart: writing for movies.

The concept of a necessary amount of time to achieve expertise is what inspired the old master/apprentice relationship, in which, for instance, a focus puller would spend ten years observing what a cinematographer did, and at the end, be ready to shoot a movie himself. Writing doesn’t offer such neat arrangements, but it still requires the same investment of time, along with an occasional push in the right direction.

In fact, the best argument for writing full-time is that it allows you to accelerate this process. In the nearly four years I spent at my first job in New York, I wrote perhaps 30,000 words of fiction, only a fraction of which was published. After quitting my job, in the five years since, I’ve written about 600,000 words, not to mention another 100,000 words for this blog—a number that gives even me pause. While not all these words were great, they’re getting better, and close to half are going to end up in print. The number of hours is harder to quantify, but it’s probably something like 7,500, which, combined with the untold hours I spent writing bad fiction earlier in my life, has brought me close to Gladwell’s number. And if I hadn’t spent the past five years doing little else, I wouldn’t even be a third of the way there.

Of course, time by itself isn’t enough. The road to mastery is paved with well-intentioned grinders who work diligently on the same story or comic for years without showing any sign of improving. (The cartoonist Missy Pena memorably described this type to Todd VanDerWerff of the A.V. Club at this year’s Comic-Con. VanDerWerff writes: “Plenty of people who get—and deserve—bad reviews come back year after year after year, never quite getting what it is they could do better, treating the whole thing as a kind of weird theater.”) But even if time isn’t a sufficient condition, it’s at least a necessary one. Every great writer has served an apprenticeship, even if he or she doesn’t like to admit it, and if you haven’t rushed into print, you can always deny it when the time comes. As Hemingway said, when a suitcase filled with his old unpublished stories was lost: “It’s none of their business that you have to learn how to write. Let them think you were born that way.”

Written by nevalalee

September 21, 2011 at 9:05 am
