Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Jonah Lehrer’

Subterranean fact check blues


In Jon Ronson’s uneven but worthwhile book So You’ve Been Publicly Shamed, there’s a fascinating interview with Jonah Lehrer, the superstar science writer who was famously hung out to dry for a variety of scholarly misdeeds. His troubles began when a journalist named Michael C. Moynihan noticed that six quotes attributed to Bob Dylan in Lehrer’s Imagine appeared to have been fabricated. Looking back on this unhappy period, Lehrer blames “a toxic mixture of insecurity and ambition” that led him to take shortcuts—a possibility that occurred to many of us at the time—and concludes:

And then one day you get an email saying that there’s these…Dylan quotes, and they can’t be explained, and they can’t be found anywhere else, and you were too lazy, too stupid, to ever check. I can only wish, and I wish this profoundly, I’d had the temerity, the courage, to do a fact check on my last book. But as anyone who does a fact check knows, they’re not particularly fun things to go through. Your story gets a little flatter. You’re forced to grapple with all your mistakes, conscious and unconscious.

There are at least two striking points about this moment of introspection. One is that the decision whether or not to fact-check a book was left to the author himself, which feels like it’s the wrong way around, although it’s distressingly common. (“Temerity” also seems like exactly the wrong word here, but that’s another story.) The other is that Lehrer avoided asking someone to check his facts because he saw it as a painful, protracted process that obliged him to confront all the places where he had gone wrong.

He’s probably right. A fact check is useful in direct proportion to how much it hurts, and having recently endured one for my article on L. Ron Hubbard—a subject on whom no amount of factual caution is excessive—I can testify that, as Lehrer says, it isn’t “particularly fun.” You’re asked to provide sources for countless tiny statements, and if you can’t find one in your notes, you just have to let it go, even if it kills you. (As far as I can recall, I had to omit exactly one sentence from the Hubbard piece, on a very minor point, and it still rankles me.) But there’s no doubt in my mind that it made the article better. Not only did it catch small errors that otherwise might have slipped into print, but it forced me to go back over every sentence from another angle, critically evaluating my argument and asking whether I was ready to stand by it. It wasn’t fun, but neither are most stages of writing, if you’re doing it right. In a couple of months, I’ll undergo much the same process with my book, as I prepare the endnotes and a bibliography, which is the equivalent of my present self performing a fact check on my past. This sort of scholarly apparatus might seem like a courtesy to the reader, and it is, but it’s also good for the book itself. Even Lehrer seems to recognize this, stating in his attempt at an apology in a keynote speech for the Knight Foundation:

If I’m lucky enough to write again, I won’t write a thing that isn’t fact-checked and fully footnoted. Because here is what I’ve learned: unless I’m willing to continually grapple with my failings—until I’m forced to fix my first draft, and deal with criticism of the second, and submit the final for a good, independent scrubbing—I won’t create anything worth keeping around.

For a writer whose entire brand is built around counterintuitive, surprising insights, this realization might seem bluntly obvious, but it only speaks to how resistant most writers, including me, are to any kind of criticism. We might take it better if we approached it with the notion that it isn’t simply for the sake of our readers, or our hypothetical critics, or even the integrity of the subject matter, but for ourselves. A footnote lurking in the back of the book makes for a better sentence on the page, if only because of the additional pass that it requires. It would help if we saw such standards—the avoidance of plagiarism, the proper citation of sources—not as guidelines imposed by authority from above, but as a set of best practices that well up from inside the work itself. A few days ago, there was yet another plagiarism controversy, which, in what Darin Morgan once called “one of those coincidences found only in real life and great fiction,” also involved Bob Dylan. As Andrea Pitzer of Slate recounts it:

During his official [Nobel] lecture recorded on June 4, laureate Bob Dylan described the influence on him of three literary works from his childhood: The Odyssey, All Quiet on the Western Front, and Moby-Dick. Soon after, writer Ben Greenman noted that in his lecture Dylan seemed to have invented a quote from Moby-Dick…I soon discovered that the Moby-Dick line Dylan dreamed up last week seems to be cobbled together out of phrases on the website SparkNotes, the online equivalent of CliffsNotes…Across the seventy-eight sentences in the lecture that Dylan spends describing Moby-Dick, even a cursory inspection reveals that more than a dozen of them appear to closely resemble lines from the SparkNotes site.

Without drilling into it too deeply, I’ll venture to say that if this all seems weird, it’s because Bob Dylan, of all people, after receiving the Nobel Prize for Literature, might have cribbed statements from an online study guide written by and for college students. But isn’t that how it always goes? Anecdotally speaking, plagiarists seem to draw from secondary or even tertiary sources, like encyclopedias, since the sort of careless or hurried writer vulnerable to indulging in it in the first place isn’t likely to grapple with the originals. The result is an inevitable degradation of information, like a copy of a copy. As Edward Tufte memorably observes in Visual Explanations: “Incomplete plagiarism leads to dequantification.” In context, he’s talking about the way in which illustrations and statistical graphics tend to lose data the more often they get copied. (In The Visual Display of Quantitative Information, he cites a particularly egregious example, in which a reproduction of a scatterplot “forgot to plot the points and simply retraced the grid lines from the original…The resulting figure achieves a graphical absolute zero, a null data-ink ratio.”) But it applies to all kinds of plagiarism, and it makes for a more compelling argument, I think, than the equally valid point that the author is cheating the source and the reader. In art or literature, it’s better to argue from aesthetics than ethics. If fact-checking strengthens a piece of writing, then plagiarism, with its effacing of sources and obfuscation of detail, can only weaken it. One is the opposite of the other, and it’s no surprise that the sins of plagiarism and fabrication tend to go together. They’re symptoms of the same underlying sloppiness, and this is why writers owe it to themselves—not to hypothetical readers or critics—to weed them out. A writer who is sloppy on small matters of fact can hardly avoid doing the same on the higher levels of an argument, and policing the one is a way of keeping an eye on the other. It isn’t always fun. But if you’re going to be a writer, as Dylan himself once said: “Now you’re gonna have to get used to it.”

Malcolm in the Middle



Last week, the journalism blog Our Bad Media accused the author Malcolm Gladwell of lapses in reporting that it alleged fell just short of plagiarism. In multiple instances, Gladwell took details in his pieces for The New Yorker, without attribution, from sources that were the only possible places where such information could have been obtained. For instance, an anecdote about the construction of the Troy-Greenfield railroad was based closely on an academic article by the historian John Sawyer, which isn’t readily available online, and which includes facts that appear nowhere else. Gladwell doesn’t mention Sawyer anywhere. And while it’s hard to make a case that any of this amounts to plagiarism in the strictest sense, it’s undeniably sloppy, as well as a disservice to readers who might want to learn more. In a statement responding to the allegations, New Yorker editor David Remnick wrote:

The issue is not really about Malcolm. And, to be clear, it isn’t about plagiarism. The issue is an ongoing editorial challenge known to writers and editors everywhere—to what extent should a piece of journalism, which doesn’t have the apparatus of academic footnotes, credit secondary sources? It’s an issue that can get complicated when there are many sources with overlapping information. There are cases where the details of an episode have passed into history and are widespread in the literature. There are cases that involve a unique source. We try to make judgments about source attribution with fairness and in good faith. But we don’t always get it right…We sometimes fall short, but our hope is always to give readers and sources the consideration they deserve.

Remnick’s response is interesting on a number of levels, but I’d like to focus on one aspect: the idea that after a certain point, details “have passed into history,” or, to quote Peter Canby, The New Yorker‘s own director of fact checking, a quote or idea can “escape its authorship” after it has been disseminated widely enough. In some cases, there’s no ambiguity over whether a fact has the status of public information; if we want to share a famous story about Immanuel Kant’s work habits, for instance, we don’t necessarily need to trace the quote back to where it first appeared. On the opposite end of the spectrum, we have something like a quotation from a particular interview with a living person, which ought to be attributed to its original source, something that Gladwell has occasionally failed to do. And in the middle, we have a wild gray area of factual information that might be considered common property, but which has only appeared in a limited number of places. Evidently, there’s a threshold—or, if you like, a tipping point—at which a fact or quote has been cited enough to take on a life of its own, and the real question is when that moment takes place.


It’s especially complicated in genres like fiction and narrative nonfiction, which, as Remnick notes, lack the scholarly apparatus of more academic writing. A few years ago, Ian McEwan fell into an absurd controversy over details in Atonement that were largely derived from a memoir by the wartime nurse Lucilla Andrews. McEwan credits Andrews in his acknowledgments, and his use of such materials inspired a ringing defense from none other than Thomas Pynchon:

Unless we were actually there, we must turn to people who were, or to letters, contemporary reporting, the encyclopedia, the Internet, until, with luck, at some point, we can begin to make a few things of our own up. To discover in the course of research some engaging detail we know can be put into a story where it will do some good can hardly be classed as a felonious act—it is simply what we do.

You could argue, on a similar level, that assimilating information and presenting it in a readable form is simply what Gladwell does, too. Little if anything that Gladwell writes is based on original research; he’s a popularizer, and a brilliant one, who compiles ideas from other sources and presents them in an attractive package. The result shades into a form of creative writing, rather than straight journalism, and at that point, the attribution of sources indeed starts to feel like a judgment call.

But it also points to a limitation in the kind of writing that Gladwell does so well. As I’ve pointed out in my own discussion of the case of Jonah Lehrer, whose transgressions were significantly more troubling, there’s tremendous pressure on writers like Gladwell—a public figure and a brand name as much as a writer—to produce big ideas on a regular basis. At times, this leads him to spread himself a little too thin; a lot of his recent work consists of him reading a single book and delivering its insights with a Gladwellian twist. At his best, he adds real value as a synthesizer and interpreter, but he’s also been guilty of distorting the underlying material in his efforts to make it digestible. And a great deal of what makes his pieces so seductive lies in the fact that so much of the process has been erased: they come to us as seamless products, ready for a TED talk, that elide the messy work of consolidation and selection. If Gladwell were more open about his sources, he’d be more useful, but also less convincing. Which may be why the tension between disclosure and readability that Remnick describes is so problematic in his case. Gladwell really ought to show his work, but he’s made it this far precisely because he doesn’t.

The Silver Standard


It’s never easy to predict the future, and on this particular blog, I try to avoid such prognostications, but just this once, I’m going to go out on a limb: I think there’s a good chance that Nate Silver will be Time‘s Person of the Year. Silver wasn’t the only talented analyst tracking the statistical side of the presidential race, but he’s by far the most visible, and he serves as the public face for the single most important story of this election: the triumph of information. Ultimately, the polls, at least in the aggregate, were right. Silver predicted the results correctly in all fifty states, and although he admits that his call in Florida could have gone either way, it’s still impressive—especially when you consider that his forecasts of the vote in individual swing states were startlingly accurate, differing from the final tally overall by less than 1.5%. At the moment, Silver is in an enviable position: he’s a public intellectual whose word, at least for now, carries immense weight among countless informed readers, regardless of the subject. And the real question is what he intends to do with this power.

I’ve been reading Silver for years, but after seeing him deliver a talk last week at the Chicago Humanities Festival, I emerged feeling even more encouraged by his newfound public stature. Silver isn’t a great public speaker: his presentation consisted mostly of slides drawn from his new book, The Signal and the Noise, and he sometimes comes across as a guy who spent the last six months alone in a darkened room, only to be thrust suddenly, blinking, into the light. Yet there’s something oddly reassuring about his nerdy, somewhat awkward presence. This isn’t someone like Jonah Lehrer, whose polished presentations at TED tend to obscure the fact that he doesn’t have many original ideas of his own, as was recently made distressingly clear. Silver is the real thing, a creature of statistics and spreadsheets who claims, convincingly, that if Excel were an Olympic sport, he’d be competing on the U.S. team. In person, he’s more candid and profane than in his lucid, often technical blog posts, but the impression one gets is of a man who has far more ideas in his head than he’s able to express in a short talk.

And his example is an instructive one, even to those of us who pay attention to politics only every couple of years, and who don’t have much of an interest in poker or baseball, his two other great obsessions. Silver is a heroic figure in an age of information. In his talk, he pointed out that ninety percent of the information in the world was created over the last two years, which makes it all the more important to find ways of navigating it effectively. With all the data at our disposal, it’s easy to find evidence for any argument we want to make: as the presidential debates made clear, there’s always a poll or study to cite in our favor. Silver may have found his niche in politics, but he’s really an exemplar of how to intelligently read any body of publicly available information. We all have access to the same numbers: the question is how to interpret them, and, even more crucially, how to deal with information that doesn’t support our own beliefs. (Silver admits that he’s generally left of center in his own politics, but I almost wish that he were a closet conservative who was simply reporting the numbers as objectively as he could.)

But the most important thing about Silver is that he isn’t a witch. He predicted the election results better than almost anyone else, but he wasn’t alone: all of the major poll aggregators called the presidential race correctly, often using nothing more complicated than a simple average of polls, which implies that what Silver did was relatively simple, once you’ve made the decision to follow the data wherever it goes. And unlike most pundits, Silver has an enormous incentive to be painstaking in his methods. He knows that his reputation is based entirely on his accuracy, which made the conservative accusation that he was skewing the results seem ludicrous even at the time: he had much more to lose, over the long term, by being wrong. And it makes me very curious about his next move. At his talk, Silver pointed out that politics, unlike finance, was an easy target for statistical rigor: “You can look smart just by being pretty good.” Whether he can move beyond the polls into other fields remains to be seen, but I suspect that he’ll be both smart and cautious. And I can’t wait to see what he does next.

Written by nevalalee

November 12, 2012 at 10:18 am

Jonah Lehrer’s blues


Back in June, when it was first revealed that Jonah Lehrer had reused some of his own work without attribution on the New Yorker blog, an editor for whom I’d written articles in the past sent me an email with the subject line: “Mike Daisey…Jonah Lehrer?” When he asked if I’d be interested in writing a piece about it, I said I’d give it a shot, although I also noted: “I don’t think I’d lump Lehrer in with Daisey just yet.” And in fact, I’ve found myself writing about Lehrer surprisingly often, in pieces for The Daily Beast, The Rumpus, and this blog. If I’ve returned to Lehrer more than once, it’s because I enjoyed a lot of his early work, was mystified by his recent problems, and took a personal interest in his case because we’re about the same age and preoccupied with similar issues of creativity and imagination. But with the revelation that he fabricated quotes in his book and lied about it, as uncovered by Michael C. Moynihan of Tablet, it seems that we may end up lumping Lehrer in with Mike Daisey after all. And this makes me very sad.

What strikes me now is the fact that most of Lehrer’s problems seem to have been the product of haste. He evidently repurposed material on his blog from previously published works because he wasn’t able to produce new content at the necessary rate. The same factor seems to have motivated his uncredited reuse of material in Imagine. And the Bob Dylan quotes he’s accused of fabricating in the same book are so uninteresting (“It’s a hard thing to describe. It’s just this sense that you got something to say”) that it’s difficult to attribute them to calculated fraud. Rather, I suspect that it was just carelessness: the original quotes were garbled in editing, compression, or revision, with Lehrer forgetting where Dylan’s quote left off and his own paraphrase began. A mistake entered one draft and persisted into the next until it wound up in the finished book. And if there’s one set of errors like this, there are likely to be others—Lehrer’s mistakes just happened to be caught by an obsessive Dylan fan and a very good journalist.

Such errors are embarrassing, but they aren’t hard to understand. I’ve learned from experience that if I quote something in an article, I’d better check it against the source at least twice, because all kinds of gremlins can get their claws into it in the meantime. What sets Lehrer’s example apart is that the error survived until the book was in print, which implies an exceptional amount of sloppiness, and when the mistake was revealed, Lehrer only made it worse by lying. As Daisey recently found out, it isn’t the initial mistake that kills you, but the coverup. If Lehrer had simply granted that he couldn’t source the quote and blamed it on an editing error, it would have been humiliating, but not catastrophic. Instead, he spun a comically elaborate series of lies about having access to unreleased documentary footage and being in contact with Bob Dylan’s management, fabrications that fell apart at once. And while I’ve done my best to interpret his previous lapses as generously as possible, I don’t know if I can do that anymore.

In my piece on The Rumpus, I said that Lehrer’s earlier mistakes were venial sins, not mortal ones. Now that he’s slid into the area of mortal sin—not so much for the initial mistake, but for the lies that followed—it’s unclear what comes next. At the time, I wrote:

Lehrer, who has written so often about human irrationality, can only benefit from this reminder of his own fallibility, and if he’s as smart as he seems, he’ll use it in his work, which until now has reflected wide reading and curiosity, but not experience.

Unfortunately, this is no longer true. I don’t think this is the end of Lehrer’s story: he’s undeniably talented, and if James Frey, of all people, can reinvent himself, Lehrer should be able to do so as well. And yet I’m afraid that there are certain elements of his previous career that will be closed off forever. I don’t think we can take his thoughts on the creative process seriously any longer, now that we’ve seen how his own process was so fatally flawed. There is a world elsewhere, of course. And Lehrer is still so young. But where he goes from here is hard to imagine.

Written by nevalalee

July 31, 2012 at 10:01 am

The greatest stories ever sold


Last week, The Rumpus published an essay I’d written about Jonah Lehrer, the prolific young writer on science and creativity who had been caught reusing portions of previously published articles on his blog at The New Yorker. I defended Lehrer from some of the more extreme charges—for one thing, I dislike the label “self-plagiarism,” which misrepresents what he actually did—and tried my best to understand the reasons behind this very public lapse of judgment. And while only Lehrer really knows what he was thinking, I think it’s fair to conclude, as I do in my essay, that his case is inseparable from the predicament of many contemporary writers, who are essentially required to become nonstop marketers of themselves. The acceleration of all media has produced a ravenous appetite for content, especially online, forcing authors to run a Red Queen’s race to keep up with demand. And when a writer is expected to blog, publish articles, give talks, and produce new books on a regular basis, it’s no surprise if the work starts to suffer.

The irony, of course, is that I’m just as guilty of this as anyone else. I think of myself primarily as a novelist, but over the past couple of years, I’ve found myself wearing a lot of different hats. I blog every day. I work as hard as possible to get interviews, panel discussions, and radio appearances to talk about my work. I’ve been known to use Twitter and Facebook. And I publish a lot of nonfiction, up to and including my essay at The Rumpus itself. I do it mostly because I like it—and I like getting paid for it when I can—but I also do it to get my name out there, along with, hopefully, the title of my book. I suspect that a lot of other writers would say the same thing, and that few guest reviews, essays, or opinion pieces are ever published without some ulterior motive on the part of the author, especially if that author happens to have a novel in stores. And while I think that most readers are aware of this, and adjust their perceptions accordingly, it’s also worth asking what this does to the writer’s own work.

The process of marketing puts any decent writer in a bind. To become a good novelist, you need to develop a skill set centered on solitude and introversion: you have to be physically and emotionally capable of sitting at a desk, alone, without distraction, for weeks or months at a time. The instant your novel comes out, however, you’re suddenly expected to develop the opposite set of skills, becoming extroverted, gregarious, and willing to invest huge amounts of energy into selling yourself in public. Very few writers, aside from the occasional outlier like Gore Vidal or Norman Mailer, have ever seemed comfortable in both roles, which create a real tension in a writer’s life. As I note in my article on Lehrer, the kind of routine required of most mainstream authors these days is antithetical to the kind of solitary, unrewarding activity needed for real creative work. Creativity requires uninterrupted time, silence, and the ability to concentrate on one problem to the exclusion of everything else. Marketing yourself at the same time is more like juggling, or, even better, like spinning plates, with different parts of your life receiving more or less attention until they need a nudge to keep them going.

When an author lets one of the plates fall, as Lehrer has done so publicly, it’s reasonable to ask whether the costs of this kind of career outweigh the rewards. I’ve often wondered about this myself. And the only answer I can give is that none of this is worth doing unless the different parts give you satisfaction for their own sake. There’s no guarantee that any of the work you do will pay off in a tangible way, so if you spend your time on something only for its perceived marketing benefits, the result will be cynical or worse. And my own attitudes about this have changed over time. This blog began, frankly, as an attempt to build an online audience in advance of The Icon Thief, but after blogging every day for almost two years, it’s become something much more—a huge part of my identity as a writer. The same is true, I hope, of my essays and short fiction. No one piece counts for much, but when I stand back and take them all together, I start to dimly glimpse the shape of my career. I wouldn’t have done half of this without the imperatives of the market. And for that, weirdly, I’m grateful.

Written by nevalalee

July 18, 2012 at 10:12 am

The secret of creativity


On Tuesday, in an article in The Daily Beast, I sampled some of the recent wave of books on consciousness and creativity, including Imagine by Jonah Lehrer and The Power of Habit by Charles Duhigg, and concluded that while such books might make us feel smarter, they aren’t likely to make us more creative or rational than we already were. As far as creativity is concerned, I note, there are no easy answers: even the greatest creative geniuses, like Bach, tend to have the same ratio of hits to misses as their forgotten contemporaries, which means that the best way to have a good idea is simply to have as many ideas, good or bad, as possible. And I close my essay with some genuinely useful advice from Dean Simonton, whom I’ve quoted on this blog before: “The best a creative genius can do is to be as prolific as possible in generating products in hope that at least some subset will survive the test of time.”

So does that mean that all other advice on creativity is worthless? I hope not, because otherwise, I’ve been wasting a lot of time on this blog. I’ve devoted countless posts to discussing creativity tools like intentional randomness and mind maps, talking about various methods of increasing serendipity, and arguing for the importance of thinking in odd moments, like washing the dishes or shaving. For my own part, I still have superstitious habits about creativity that I follow every day. I never write a chapter or essay without doing a mind map, for instance—I did the one below before writing the article in the Beast—and I still generate a random quote from Shakespeare whenever I’m stuck on a problem. And these tricks seem to work, at least for me: I always end up with something that wouldn’t have occurred to me if I hadn’t taken the time.

Yet the crucial word is that last one. Because the more I think about it, the more convinced I am that every useful creativity tool really boils down to just one thing—increasing the amount of time, and the kinds of time, I spend thinking about a problem. When I do a mind map, for instance, I follow a fixed, almost ritualistic set of steps: I take out a pad of paper, write a keyword or two at the center in marker, and let my pen wander across the page. All these steps take time. Which means that making a mind map generates a blank space of forty minutes or so in which I’m just thinking about the problem at hand. And it’s become increasingly clear to me that it isn’t the mind map that matters; it’s the forty minutes. The mind map is just an excuse for me to sit at my desk and think. (This is one reason why I still make my mind maps by hand, rather than with a software program—it extends the length of the process.)

In the end, the only thing that can generate ideas is time spent thinking about them. (Even apparently random moments of insight are the result of long conscious preparation.) I’ve addressed this topic before in my post about Blinn’s Law, in which I speculate that every work of art—a novel, a movie, a work of nonfiction—requires a certain amount of time to be fully realized, no matter how far technology advances, and that much of what we do as artists consists of finding excuses to sit alone at our desks for the necessary year or so. Nearly every creativity tool amounts to a way of tricking my brain into spending time on a problem, either by giving it a pleasant and relatively undemanding task, like drawing a mind map, or seducing it with a novel image or idea that makes its train of thought momentarily more interesting. But the magic isn’t in the trick itself; it’s in the time that follows. And that’s the secret of creativity.

Written by nevalalee

June 7, 2012 at 9:52 am

The right kind of randomness


Yesterday, while talking about my search for serendipity in the New York Times, I wrote: “What the [Times‘s] recommendation engine thought I might like to see was far less interesting than what other people unlike me were reading at the same time.” The second I typed that sentence, I knew it wasn’t entirely true, and the more I thought about it, the more questions it seemed to raise. Because, really, most readers of the Times aren’t that much unlike me. The site attracts a wide range of visitors, but its ideal audience, the one it targets and the one that embodies how most of its readers probably like to think of themselves, is fairly consistent: educated, interested in politics and the arts, more likely to watch Mad Men than Two and a Half Men, and rather more liberal than otherwise. The “Most Emailed” list isn’t exactly a random sampling of interesting stories, then, but a sort of idealized picture of what the perfect Times subscriber, with equal access to all parts of the paper, is reading at that particular moment.

As a result, the “serendipity” we find there tends to be skewed in predictable ways. For instance, you’re much more likely to see a column by Paul Krugman than by my conservative college classmate Ross Douthat, who may be a good writer making useful points, but you’d never know it based on how often his columns are shared. (I don’t have any hard numbers to back this up, but I’d guess that Douthat’s columns make the “Most Emailed” list only a fraction of the time.) If I were really in search of true serendipity—that is, to quote George Steiner, if I were trying to find what I wasn’t looking for—I’d read the most viewed or commented articles on, say, the National Review, or, better yet, the National Enquirer, the favorite paper of both Victor Niederhoffer and Nassim Nicholas Taleb. But I don’t. What I really want as a reader, it seems, isn’t pure randomness, but the right kind of randomness. It’s serendipity as curated by the writers and readers of the New York Times, which, while interesting, is only a single slice of the universe of randomness at my disposal.

Is this wrong? Not necessarily. In fact, I’d say there are at least two good reasons to stick to a certain subset of randomness, at least on a daily basis. The first reason has something in common with Brian Uzzi’s fascinating research on the collaborative process behind hit Broadway shows, as described in Jonah Lehrer’s Imagine. What Uzzi discovered is that the most successful shows tended to be the work of teams of artists who weren’t frequent collaborators, but weren’t strangers, either. An intermediate level of social intimacy—not too close, but not too far away—seemed to generate the best results, since strangers struggled to find ways of working together, while those who worked together all the time tended to fall into stale, repetitive patterns. And this strikes me as being generally true of the world of ideas as well. Ideas that are too similar don’t combine in interesting ways, but those that are too far apart tend to uselessly collide. What you want, ideally, is to live in a world of good ideas that want to cohere and set off chains of associations, and for this, an intermediate level of unfamiliarity seems to work the best.

And the second reason is even more important: it’s that randomness alone isn’t enough. It’s good, of course, to seek out new sources of inspiration and ideas, but if done indiscriminately, the result is likely to be nothing but static. Twitter, for instance, is as pure a slice of randomness as you could possibly want, but we very properly try to manage our feeds to include those people we like and find interesting, rather than exposing ourselves to the full noise of the Twitterverse. (That way lies madness.) Even the most enthusiastic proponent of intentional randomness, like me, has to admit that not all sources of information are created equal, and that it’s sometimes necessary to use a trusted home base for our excursions into the unknown. When people engage in bibliomancy—that is, in telling the future by opening a book to a random page—there’s a reason why they’ve historically used books like Virgil or the Bible, rather than a Harlequin romance: any book would generate the necessary level of randomness, but you need a basic level of richness and meaning as well. What I’m saying, I guess, is that if you’re going to be random, you may as well be systematic about it. And the New York Times isn’t a bad place to start.

Written by nevalalee

May 23, 2012 at 10:42 am

Yo-Yo Ma, detective


I always look at a piece of music like a detective novel. Maybe the novel is about a murder. Well, who committed the murder? Why did he do it? My job is to retrace the story so that the audience feels the suspense. So that when the climax comes, they’re right there with me, listening to my beautiful detective story. It’s all about making people care about what happens next.

Yo-Yo Ma, quoted by Jonah Lehrer in Imagine

Written by nevalalee

May 19, 2012 at 9:00 am

Posted in Quote of the Day


Charles Darwin and the triumph of literary genius


Last week, while browsing at Open Books in Chicago, I made one of those serendipitous discoveries that are the main reason I love used bookstores: a vintage copy of The Tangled Bank by Stanley Edgar Hyman, which I picked up for less than seven dollars. Both the author and his work are mostly forgotten these days—Hyman is remembered, if at all, for his marriage to Shirley Jackson—but this book caught my attention right away. It’s an ambitious attempt to consider Darwin, Marx, Frazer, and Freud as imaginative writers who made their arguments using the strategies of narrative artists and storytellers, and as such, it’s a great bedside book, if not completely successful. Hyman obsessively details how books like Das Kapital mimic the tropes of narrative art (“The dramatic movement of Capital consists of four descents into suffering and horror, which we might see as four acts of a drama”) while neglecting the main point: if authors like this are storytellers, it’s because they’ve turned themselves into the protagonists of their own books, with their attempts to impose order on reality as their most enduring literary monuments.

And we may never see such protagonists again. If Darwin or Freud are literary characters as memorable as Pickwick or Hamlet, it’s in the tradition of the solitary man of genius considering the world through the lens of his own experience, a figure who has, of necessity, gone out of fashion in the sciences. As Jonah Lehrer recently pointed out in the New Yorker, the era of the lone genius is over:

Today…science papers by multiple authors contain more than twice as many citations as those by individuals. This trend was even more apparent when it came to so-called “home-run papers”—publications with at least a hundred citations. These were more than six times as likely to come from a team of scientists.

The explanation for this is easy enough to understand: most remaining scientific problems are far too hard for any one person to solve. Scientists are increasingly forced to specialize, and tackling important problems requires a greater degree of collaboration than ever before. This leads to its own kind of creative exhilaration, and perhaps to a different model of genius: the visionary who can guide and direct a diverse team of talents, like Steve Jobs or Robert Oppenheimer. But it’s unlikely that we’ll ever get to know such thinkers as living men and women, at least not as well as the ones profiled in The Tangled Bank.

Of these four, the one who interests me the most these days is Darwin, whose birthday was this past Sunday. (And while I’m on the subject, if you haven’t picked up a copy of Darwin Slept Here, by my good friend Eric Simons, you really should.) Darwin emerges in his own works as a fascinating figure, a ceaseless experimenter whose work is inseparable from the image of the man himself. One of the pleasures of The Tangled Bank lies in its reminder of how ingenious a scientist Darwin was. To compare the areas of geological formations on a topographical map, he cut them out and weighed the paper. He tickled aphids with a fine hair and made artificial leaves for earthworms by rubbing triangular pieces of paper with raw fat. And this impression of Sherlockian thoroughness, of leaving no experimental stone unturned, is more than just a literary delight: it’s an integral part of the persuasiveness of The Origin of Species, which is convincing as an argument largely because we’re so charmed by the author’s voice.

As Daniel C. Dennett has famously argued, Darwinian evolution is probably the best idea of all time, but it’s also impossible to separate the idea from the man, who survives in his own work as one of the great literary characters of the nineteenth century. It’s true that if Darwin, or Alfred Russel Wallace, hadn’t arrived at the principle of natural selection, somebody else would have done so eventually: it’s one of those ideas that seem obvious in retrospect. (After reading The Origin of Species, Thomas Huxley is supposed to have said: “How extremely stupid not to have thought of that!”) But there’s no denying that the force and appeal of the book itself, which Darwin worked on quietly for years, deserve a great deal of the credit for the theory’s rapid acceptance, at least among reasonable readers. Without that presentation, and the author’s personality, the history of the world might have been very different. And for that, we have literary genius to thank.

Written by nevalalee

February 13, 2012 at 11:30 am

Thinking in groups, thinking alone


Where do good ideas come from? A recent issue of the New Yorker offers up a few answers, in a fascinating article on the science of groupthink by Jonah Lehrer, who debunks some widely cherished notions about creative collaboration. Lehrer suggests that brainstorming—narrowly defined as a group activity in which a roomful of people generates as many ideas as possible without pausing to evaluate or criticize—is essentially useless, or at least less effective than spirited group debate or working alone. The best kind of collaboration, he says, occurs when people from diverse backgrounds are thrown together in an environment where they can argue, share ideas, or simply meet by chance, and he backs this up with an impressive array of data, ranging from studies of the genesis of Broadway musicals to the legendary Building 20 at MIT, where individuals as different as Amar Bose and Noam Chomsky thrived in an environment in which the walls between disciplines could literally be torn down.

What I love about Lehrer’s article is that its vision of productive group thinking isn’t that far removed from my sense of what writers and other creative artists need to do on their own. The idea of subjecting the ideas in brainstorming sessions to a rigorous winnowing process has close parallels to Dean Simonton’s Darwinian model of creativity: quality, he notes, is a probabilistic function of quantity, so the more ideas you have, the better—but only if they’re subjected to the discipline of natural selection. This selection can occur in the writer’s mind, in a group, or in the larger marketplace, but the crucial thing is that it take place at all. Free association or productivity isn’t enough without that extra step of revision, or rendering, which in most cases requires a strong external point of view. Hence the importance of outside readers and editors to every writer, no matter how successful.

The premise that creativity flowers most readily from interactions between people from different backgrounds has parallels in one’s inner life as well. In The Act of Creation, Arthur Koestler concludes that bisociation, or the intersection of two unrelated areas of knowledge in unexpected ways, is the ultimate source of creativity. On the highest plane, the most profound innovations in science and the arts often occur when an individual of genius changes fields. On a more personal level, nearly every good story idea I’ve ever had came from the juxtaposition of two previously unrelated concepts, either done on purpose—as in my focused daydreaming with science magazines, which led to stories like “Kawataro,” “The Boneless One,” and “Ernesto”—or by accident. Even accidents, however, can benefit from careful planning, as in the design of the Pixar campus, as conceived by Steve Jobs, in which members of different departments have no choice but to cross paths on their way to the bathroom or cafeteria.

Every creative artist needs to find ways of maximizing this sort of serendipity in his or her own life. My favorite personal example is my own home library: partially out of laziness, my bookshelves have always been a wild jumble of volumes in no particular order, an arrangement that sometimes makes it hard to find a specific book when I need it, but also leads to serendipitous arrangements of ideas. I’ll often be looking for one book when another catches my eye, even if I haven’t read it in years, which takes me, in turn, in unexpected directions. Even more relevant to Lehrer’s article is the importance of talking to people from different fields: writers benefit enormously from working around people who aren’t writers, which is why college tends to be a more creatively fertile period than graduate school. “It is the human friction,” Lehrer concludes, “that makes the sparks.” And we should all arrange our lives accordingly.

Written by nevalalee

February 1, 2012 at 10:26 am
