Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Jonah Lehrer’

Malcolm in the Middle


Malcolm Gladwell

Last week, the journalism blog Our Bad Media accused the author Malcolm Gladwell of reporting lapses that, it alleged, fell just short of plagiarism. In multiple instances, Gladwell took details in his pieces for The New Yorker, without attribution, from sources that were the only possible places where such information could have been obtained. For instance, an anecdote about the construction of the Troy-Greenfield railroad was based closely on an academic article by the historian John Sawyer, which isn’t readily available online, and which includes facts that appear nowhere else. Gladwell doesn’t mention Sawyer anywhere. And while it’s hard to make a case that any of this amounts to plagiarism in the strictest sense, it’s undeniably sloppy, as well as a disservice to readers who might want to learn more. In a statement responding to the allegations, New Yorker editor David Remnick wrote:

The issue is not really about Malcolm. And, to be clear, it isn’t about plagiarism. The issue is an ongoing editorial challenge known to writers and editors everywhere—to what extent should a piece of journalism, which doesn’t have the apparatus of academic footnotes, credit secondary sources? It’s an issue that can get complicated when there are many sources with overlapping information. There are cases where the details of an episode have passed into history and are widespread in the literature. There are cases that involve a unique source. We try to make judgments about source attribution with fairness and in good faith. But we don’t always get it right…We sometimes fall short, but our hope is always to give readers and sources the consideration they deserve.

Remnick’s response is interesting on a number of levels, but I’d like to focus on one aspect: the idea that after a certain point, details “have passed into history,” or, to quote Peter Canby, The New Yorker‘s own director of fact checking, a quote or idea can “escape its authorship” after it has been disseminated widely enough. In some cases, there’s no ambiguity over whether a fact has the status of public information; if we want to share a famous story about Immanuel Kant’s work habits, for instance, we don’t necessarily need to trace the quote back to where it first appeared. On the opposite end of the spectrum, we have something like a quotation from a particular interview with a living person, which ought to be attributed to its original source, and which Gladwell has occasionally failed to do. And in the middle, we have a wild gray area of factual information that might be considered common property, but which has only appeared in a limited number of places. Evidently, there’s a threshold—or, if you like, a tipping point—at which a fact or quote has been cited enough to take on a life of its own, and the real question is when that moment takes place.

Ian McEwan

It’s especially complicated in genres like fiction and narrative nonfiction, which, as Remnick notes, lack the scholarly apparatus of more academic writing. A few years ago, Ian McEwan fell into an absurd controversy over details in Atonement that were largely derived from a memoir by the wartime nurse Lucilla Andrews. McEwan credits Andrews in his acknowledgments, and his use of such materials inspired a ringing defense from none other than Thomas Pynchon:

Unless we were actually there, we must turn to people who were, or to letters, contemporary reporting, the encyclopedia, the Internet, until, with luck, at some point, we can begin to make a few things of our own up. To discover in the course of research some engaging detail we know can be put into a story where it will do some good can hardly be classed as a felonious act—it is simply what we do.

You could argue, on a similar level, that assimilating information and presenting it in a readable form is simply what Gladwell does, too. Little if anything that Gladwell writes is based on original research; he’s a popularizer, and a brilliant one, who compiles ideas from other sources and presents them in an attractive package. The result shades into a form of creative writing, rather than straight journalism, and at that point, the attribution of sources indeed starts to feel like a judgment call.

But it also points to a limitation in the kind of writing that Gladwell does so well. As I’ve pointed out in my own discussion of the case of Jonah Lehrer, whose transgressions were significantly more troubling, there’s tremendous pressure on writers like Gladwell—a public figure and a brand name as much as a writer—to produce big ideas on a regular basis. At times, this leads him to spread himself a little too thin; a lot of his recent work consists of him reading a single book and delivering its insights with a Gladwellian twist. At his best, he adds real value as a synthesizer and interpreter, but he’s also been guilty of distorting the underlying material in his efforts to make it digestible. And a great deal of what makes his pieces so seductive lies in the fact that so much of the process has been erased: they come to us as seamless products, ready for a TED talk, that elide the messy work of consolidation and selection. If Gladwell were more open about his sources, he’d be more useful, but also less convincing. Which may be why the tension between disclosure and readability that Remnick describes is so problematic in his case. Gladwell really ought to show his work, but he’s made it this far precisely because he doesn’t.

The Silver Standard


It’s never easy to predict the future, and on this particular blog, I try to avoid such prognostications, but just this once, I’m going to go out on a limb: I think there’s a good chance that Nate Silver will be Time’s Person of the Year. Silver wasn’t the only talented analyst tracking the statistical side of the presidential race, but he’s by far the most visible, and he serves as the public face for the single most important story of this election: the triumph of information. Ultimately, the polls, at least in the aggregate, were right. Silver predicted the results correctly in all fifty states, and although he admits that his call in Florida could have gone either way, it’s still impressive—especially when you consider that his forecasts of the vote in individual swing states were startlingly accurate, differing from the final tally overall by less than 1.5%. At the moment, Silver is in an enviable position: he’s a public intellectual whose word, at least for now, carries immense weight among countless informed readers, regardless of the subject. And the real question is what he intends to do with this power.

I’ve been reading Silver for years, but after seeing him deliver a talk last week at the Chicago Humanities Festival, I emerged feeling even more encouraged by his newfound public stature. Silver isn’t a great public speaker: his presentation consisted mostly of slides drawn from his new book, The Signal and the Noise, and he sometimes comes across as a guy who spent the last six months alone in a darkened room, only to be thrust suddenly, blinking, into the light. Yet there’s something oddly reassuring about his nerdy, somewhat awkward presence. This isn’t someone like Jonah Lehrer, whose polished presentations at TED tend to obscure the fact that he doesn’t have many original ideas of his own, as was recently made distressingly clear. Silver is the real thing, a creature of statistics and spreadsheets who claims, convincingly, that if Excel were an Olympic sport, he’d be competing on the U.S. team. In person, he’s more candid and profane than in his lucid, often technical blog posts, but the impression one gets is of a man who has far more ideas in his head than he’s able to express in a short talk.

And his example is an instructive one, even to those of us who pay attention to politics only every couple of years, and who don’t have much of an interest in poker or baseball, his two other great obsessions. Silver is a heroic figure in an age of information. In his talk, he pointed out that ninety percent of the information in the world was created over the last two years, which makes it all the more important to find ways of navigating it effectively. With all the data at our disposal, it’s easy to find evidence for any argument we want to make: as the presidential debates made clear, there’s always a favorable poll or study to cite. Silver may have found his niche in politics, but he’s really an exemplar of how to intelligently read any body of publicly available information. We all have access to the same numbers: the question is how to interpret them, and, even more crucially, how to deal with information that doesn’t support our own beliefs. (Silver admits that he’s generally left of center in his own politics, but I almost wish that he were a closet conservative who was simply reporting the numbers as objectively as he could.)

But the most important thing about Silver is that he isn’t a witch. He predicted the election results better than almost anyone else, but he wasn’t alone: all of the major poll aggregators called the presidential race correctly, often using nothing more complicated than a simple average of polls, which implies that what Silver did was relatively simple, once you’ve made the decision to follow the data wherever it goes. And unlike most pundits, Silver has an enormous incentive to be painstaking in his methods. He knows that his reputation is based entirely on his accuracy, which made the conservative accusation that he was skewing the results seem ludicrous even at the time: he had much more to lose, over the long term, by being wrong. And it makes me very curious about his next move. At his talk, Silver pointed out that politics, unlike finance, was an easy target for statistical rigor: “You can look smart just by being pretty good.” Whether he can move beyond the polls into other fields remains to be seen, but I suspect that he’ll be both smart and cautious. And I can’t wait to see what he does next.

Written by nevalalee

November 12, 2012 at 10:18 am

Jonah Lehrer’s blues


Back in June, when it was first revealed that Jonah Lehrer had reused some of his own work without attribution on the New Yorker blog, an editor for whom I’d written articles in the past sent me an email with the subject line: “Mike Daisey…Jonah Lehrer?” When he asked if I’d be interested in writing a piece about it, I said I’d give it a shot, although I also noted: “I don’t think I’d lump Lehrer in with Daisey just yet.” And in fact, I’ve found myself writing about Lehrer surprisingly often, in pieces for The Daily Beast, The Rumpus, and this blog. If I’ve returned to Lehrer more than once, it’s because I enjoyed a lot of his early work, was mystified by his recent problems, and took a personal interest in his case because we’re about the same age and preoccupied with similar issues of creativity and imagination. But with the revelation that he fabricated quotes in his book and lied about it, as uncovered by Michael C. Moynihan of Tablet, it seems that we may end up lumping Lehrer in with Mike Daisey after all. And this makes me very sad.

What strikes me now is the fact that most of Lehrer’s problems seem to have been the product of haste. He evidently repurposed material on his blog from previously published works because he wasn’t able to produce new content at the necessary rate. The same factor seems to have motivated his uncredited reuse of material in Imagine. And the Bob Dylan quotes he’s accused of fabricating in the same book are so uninteresting (“It’s a hard thing to describe. It’s just this sense that you got something to say”) that it’s difficult to attribute them to calculated fraud. Rather, I suspect that it was just carelessness: the original quotes were garbled in editing, compression, or revision, with Lehrer forgetting where Dylan’s quote left off and his own paraphrase began. A mistake entered one draft and persisted into the next until it wound up in the finished book. And if there’s one set of errors like this, there are likely to be others—Lehrer’s mistakes just happened to be caught by an obsessive Dylan fan and a very good journalist.

Such errors are embarrassing, but they aren’t hard to understand. I’ve learned from experience that if I quote something in an article, I’d better check it against the source at least twice, because all kinds of gremlins can get their claws into it in the meantime. What sets Lehrer’s example apart is that the error survived until the book was in print, which implies an exceptional amount of sloppiness, and when the mistake was revealed, Lehrer only made it worse by lying. As Daisey recently found out, it isn’t the initial mistake that kills you, but the coverup. If Lehrer had simply granted that he couldn’t source the quote and blamed it on an editing error, it would have been humiliating, but not catastrophic. Instead, he spun a comically elaborate series of lies about having access to unreleased documentary footage and being in contact with Bob Dylan’s management, fabrications that fell apart at once. And while I’ve done my best to interpret his previous lapses as generously as possible, I don’t know if I can do that anymore.

In my piece on The Rumpus, I said that Lehrer’s earlier mistakes were venial sins, not mortal ones. Now that he’s slid into the area of mortal sin—not so much for the initial mistake, but for the lies that followed—it’s unclear what comes next. At the time, I wrote:

Lehrer, who has written so often about human irrationality, can only benefit from this reminder of his own fallibility, and if he’s as smart as he seems, he’ll use it in his work, which until now has reflected wide reading and curiosity, but not experience.

Unfortunately, this is no longer true. I don’t think this is the end of Lehrer’s story: he’s undeniably talented, and if James Frey, of all people, can reinvent himself, Lehrer should be able to do so as well. And yet I’m afraid that there are certain elements of his previous career that will be closed off forever. I don’t think we can take his thoughts on the creative process seriously any longer, now that we’ve seen how his own process was so fatally flawed. There is a world elsewhere, of course. And Lehrer is still so young. But where he goes from here is hard to imagine.

Written by nevalalee

July 31, 2012 at 10:01 am

The greatest stories ever sold


Last week, The Rumpus published an essay I’d written about Jonah Lehrer, the prolific young writer on science and creativity who had been caught reusing portions of previously published articles on his blog at The New Yorker. I defended Lehrer from some of the more extreme charges—for one thing, I dislike the label “self-plagiarism,” which misrepresents what he actually did—and tried my best to understand the reasons behind this very public lapse of judgment. And while only Lehrer really knows what he was thinking, I think it’s fair to conclude, as I do in my essay, that his case is inseparable from the predicament of many contemporary writers, who are essentially required to become nonstop marketers of themselves. The acceleration of all media has produced a ravenous appetite for content, especially online, forcing authors to run a Red Queen’s race to keep up with demand. And when a writer is expected to blog, publish articles, give talks, and produce new books on a regular basis, it’s no surprise if the work starts to suffer.

The irony, of course, is that I’m just as guilty of this as anyone else. I think of myself primarily as a novelist, but over the past couple of years, I’ve found myself wearing a lot of different hats. I blog every day. I work as hard as possible to get interviews, panel discussions, and radio appearances to talk about my work. I’ve been known to use Twitter and Facebook. And I publish a lot of nonfiction, up to and including my essay at The Rumpus itself. I do it mostly because I like it—and I like getting paid for it when I can—but I also do it to get my name out there, along with, hopefully, the title of my book. I suspect that a lot of other writers would say the same thing, and that few guest reviews, essays, or opinion pieces are ever published without some ulterior motive on the part of the author, especially if that author happens to have a novel in stores. And while I think that most readers are aware of this, and adjust their perceptions accordingly, it’s also worth asking what this does to the writer’s own work.

The process of marketing puts any decent writer in a bind. To become a good novelist, you need to develop a skill set centered on solitude and introversion: you have to be physically and emotionally capable of sitting at a desk, alone, without distraction, for weeks or months at a time. The instant your novel comes out, however, you’re suddenly expected to develop the opposite set of skills, becoming extroverted, gregarious, and willing to invest huge amounts of energy into selling yourself in public. Very few writers, aside from the occasional outlier like Gore Vidal or Norman Mailer, have ever seemed comfortable in both roles, which create a real tension in a writer’s life. As I note in my article on Lehrer, the kind of routine required of most mainstream authors these days is antithetical to the kind of solitary, unrewarding activity needed for real creative work. Creativity requires uninterrupted time, silence, and the ability to concentrate on one problem to the exclusion of everything else. Marketing yourself at the same time is more like juggling, or, even better, like spinning plates, with different parts of your life receiving more or less attention until they need a nudge to keep them going.

When an author lets one of the plates fall, as Lehrer has done so publicly, it’s reasonable to ask whether the costs of this kind of career outweigh the rewards. I’ve often wondered about this myself. And the only answer I can give is that none of this is worth doing unless the different parts give you satisfaction for their own sake. There’s no guarantee that any of the work you do will pay off in a tangible way, so if you spend your time on something only for its perceived marketing benefits, the result will be cynical or worse. And my own attitudes about this have changed over time. This blog began, frankly, as an attempt to build an online audience in advance of The Icon Thief, but after blogging every day for almost two years, it’s become something much more—a huge part of my identity as a writer. The same is true, I hope, of my essays and short fiction. No one piece counts for much, but when I stand back and take them all together, I start to dimly glimpse the shape of my career. I wouldn’t have done half of this without the imperatives of the market. And for that, weirdly, I’m grateful.

Written by nevalalee

July 18, 2012 at 10:12 am

The secret of creativity


On Tuesday, in an article in The Daily Beast, I sampled some of the recent wave of books on consciousness and creativity, including Imagine by Jonah Lehrer and The Power of Habit by Charles Duhigg, and concluded that while such books might make us feel smarter, they aren’t likely to make us more creative or rational than we already were. As far as creativity is concerned, I note, there are no easy answers: even the greatest creative geniuses, like Bach, tend to have the same ratio of hits to misses as their forgotten contemporaries, which means that the best way to have a good idea is simply to have as many ideas, good or bad, as possible. And I close my essay with some genuinely useful advice from Dean Simonton, whom I’ve quoted on this blog before: “The best a creative genius can do is to be as prolific as possible in generating products in hope that at least some subset will survive the test of time.”

So does that mean that all other advice on creativity is worthless? I hope not, because otherwise, I’ve been wasting a lot of time on this blog. I’ve devoted countless posts to discussing creativity tools like intentional randomness and mind maps, talking about various methods of increasing serendipity, and arguing for the importance of thinking in odd moments, like washing the dishes or shaving. For my own part, I still have superstitious habits about creativity that I follow every day. I never write a chapter or essay without doing a mind map, for instance—I did the one below before writing the article in the Beast—and I still generate a random quote from Shakespeare whenever I’m stuck on a problem. And these tricks seem to work, at least for me: I always end up with something that would never have occurred to me if I hadn’t taken the time.

Yet the crucial word is that last one. Because the more I think about it, the more convinced I am that every useful creativity tool really boils down to just one thing—increasing the amount of time, and the kinds of time, I spend thinking about a problem. When I do a mind map, for instance, I follow a fixed, almost ritualistic set of steps: I take out a pad of paper, write a keyword or two at the center in marker, and let my pen wander across the page. All these steps take time. Which means that making a mind map generates a blank space of forty minutes or so in which I’m just thinking about the problem at hand. And it’s become increasingly clear to me that it isn’t the mind map that matters; it’s the forty minutes. The mind map is just an excuse for me to sit at my desk and think. (This is one reason why I still make my mind maps by hand, rather than with a software program—it extends the length of the process.)

In the end, the only thing that can generate ideas is time spent thinking about them. (Even apparently random moments of insight are the result of long conscious preparation.) I’ve addressed this topic before in my post about Blinn’s Law, in which I speculate that every work of art—a novel, a movie, a work of nonfiction—requires a certain amount of time to be fully realized, no matter how far technology advances, and that much of what we do as artists consists of finding excuses to sit alone at our desks for the necessary year or so. Nearly every creativity tool amounts to a way of tricking my brain into spending time on a problem, either by giving it a pleasant and relatively undemanding task, like drawing a mind map, or seducing it with a novel image or idea that makes its train of thought momentarily more interesting. But the magic isn’t in the trick itself; it’s in the time that follows. And that’s the secret of creativity.

Written by nevalalee

June 7, 2012 at 9:52 am

The right kind of randomness


Yesterday, while talking about my search for serendipity in the New York Times, I wrote: “What the [Times’s] recommendation engine thought I might like to see was far less interesting than what other people unlike me were reading at the same time.” The second I typed that sentence, I knew it wasn’t entirely true, and the more I thought about it, the more questions it seemed to raise. Because, really, most readers of the Times aren’t that much unlike me. The site attracts a wide range of visitors, but its ideal audience, the one it targets and the one that embodies how most of its readers probably like to think of themselves, is fairly consistent: educated, interested in politics and the arts, more likely to watch Mad Men than Two and a Half Men, and rather more liberal than otherwise. The “Most Emailed” list isn’t exactly a random sampling of interesting stories, then, but a sort of idealized picture of what the perfect Times subscriber, with equal access to all parts of the paper, is reading at that particular moment.

As a result, the “serendipity” we find there tends to be skewed in predictable ways. For instance, you’re much more likely to see a column by Paul Krugman than by my conservative college classmate Ross Douthat, who may be a good writer with useful points, but you’d never know it based on how often his columns are shared. (I don’t have any hard numbers to back this up, but I’d guess that Douthat’s columns make the “Most Emailed” list only a fraction of the time.) If I were really in search of true serendipity—that is, to quote George Steiner, if I were trying to find what I wasn’t looking for—I’d read the most viewed or commented articles on, say, the National Review, or, better yet, the National Enquirer, the favorite paper of both Victor Niederhoffer and Nassim Nicholas Taleb. But I don’t. What I really want as a reader, it seems, isn’t pure randomness, but the right kind of randomness. It’s serendipity as curated by the writers and readers of the New York Times, which, while interesting, is only a single slice of the universe of randomness at my disposal.

Is this wrong? Not necessarily. In fact, I’d say there are at least two good reasons to stick to a certain subset of randomness, at least on a daily basis. The first reason has something in common with Brian Uzzi’s fascinating research on the collaborative process behind hit Broadway shows, as described in Jonah Lehrer’s Imagine. What Uzzi discovered is that the most successful shows tended to be the work of teams of artists who weren’t frequent collaborators, but weren’t strangers, either. An intermediate level of social intimacy—not too close, but not too far away—seemed to generate the best results, since strangers struggled to find ways of working together, while those who worked together all the time tended to fall into stale, repetitive patterns. And this strikes me as being generally true of the world of ideas as well. Ideas that are too similar don’t combine in interesting ways, but those that are too far apart tend to uselessly collide. What you want, ideally, is to live in a world of good ideas that want to cohere and set off chains of associations, and for this, an intermediate level of unfamiliarity seems to work the best.

And the second reason is even more important: it’s that randomness alone isn’t enough. It’s good, of course, to seek out new sources of inspiration and ideas, but if done indiscriminately, the result is likely to be nothing but static. Twitter, for instance, is as pure a slice of randomness as you could possibly want, but we very properly try to manage our feeds to include those people we like and find interesting, rather than exposing ourselves to the full noise of the Twitterverse. (That way lies madness.) Even the most enthusiastic proponent of intentional randomness, like me, has to admit that not all sources of information are created equal, and that it’s sometimes necessary to use a trusted home base for our excursions into the unknown. When people engage in bibliomancy—that is, in telling the future by opening a book to a random page—there’s a reason why they’ve historically used books like Virgil or the Bible, rather than a Harlequin romance: any book would generate the necessary level of randomness, but you need a basic level of richness and meaning as well. What I’m saying, I guess, is that if you’re going to be random, you may as well be systematic about it. And the New York Times isn’t a bad place to start.

Written by nevalalee

May 23, 2012 at 10:42 am

Yo-Yo Ma, detective


I always look at a piece of music like a detective novel. Maybe the novel is about a murder. Well, who committed the murder? Why did he do it? My job is to retrace the story so that the audience feels the suspense. So that when the climax comes, they’re right there with me, listening to my beautiful detective story. It’s all about making people care about what happens next.

Yo-Yo Ma, quoted by Jonah Lehrer in Imagine

Written by nevalalee

May 19, 2012 at 9:00 am

Posted in Quote of the Day


