Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Imagine’

Subterranean fact check blues


In Jon Ronson’s uneven but worthwhile book So You’ve Been Publicly Shamed, there’s a fascinating interview with Jonah Lehrer, the superstar science writer who was famously hung out to dry for a variety of scholarly misdeeds. His troubles began when a journalist named Michael C. Moynihan noticed that six quotes attributed to Bob Dylan in Lehrer’s Imagine appeared to have been fabricated. Looking back on this unhappy period, Lehrer blames “a toxic mixture of insecurity and ambition” that led him to take shortcuts—a possibility that occurred to many of us at the time—and concludes:

And then one day you get an email saying that there’s these…Dylan quotes, and they can’t be explained, and they can’t be found anywhere else, and you were too lazy, too stupid, to ever check. I can only wish, and I wish this profoundly, I’d had the temerity, the courage, to do a fact check on my last book. But as anyone who does a fact check knows, they’re not particularly fun things to go through. Your story gets a little flatter. You’re forced to grapple with all your mistakes, conscious and unconscious.

There are at least two striking points about this moment of introspection. One is that the decision whether or not to fact-check a book was left to the author himself, which feels like it’s the wrong way around, although it’s distressingly common. (“Temerity” also seems like exactly the wrong word here, but that’s another story.) The other is that Lehrer avoided asking someone to check his facts because he saw it as a painful, protracted process that obliged him to confront all the places where he had gone wrong.

He’s probably right. A fact check is useful in direct proportion to how much it hurts, and having just endured one recently for my article on L. Ron Hubbard—a subject on whom no amount of factual caution is excessive—I can testify that, as Lehrer says, it isn’t “particularly fun.” You’re asked to provide sources for countless tiny statements, and if you can’t find one in your notes, you just have to let it go, even if it kills you. (As far as I can recall, I had to omit exactly one sentence from the Hubbard piece, on a very minor point, and it still rankles me.) But there’s no doubt in my mind that it made the article better. Not only did it catch small errors that otherwise might have slipped into print, but it forced me to go back over every sentence from another angle, critically evaluating my argument and asking whether I was ready to stand by it. It wasn’t fun, but neither are most stages of writing, if you’re doing it right. In a couple of months, I’ll undergo much the same process with my book, as I prepare the endnotes and a bibliography, which is the equivalent of my present self performing a fact check on my past. This sort of scholarly apparatus might seem like a courtesy to the reader, and it is, but it’s also good for the book itself. Even Lehrer seems to recognize this, stating in his attempt at an apology in a keynote speech for the Knight Foundation:

If I’m lucky enough to write again, I won’t write a thing that isn’t fact-checked and fully footnoted. Because here is what I’ve learned: unless I’m willing to continually grapple with my failings—until I’m forced to fix my first draft, and deal with criticism of the second, and submit the final for a good, independent scrubbing—I won’t create anything worth keeping around.

For a writer whose entire brand is built around counterintuitive, surprising insights, this realization might seem bluntly obvious, but it only speaks to how resistant most writers, including me, are to any kind of criticism. We might take it better if we approached it with the notion that it isn’t simply for the sake of our readers, or our hypothetical critics, or even the integrity of the subject matter, but for ourselves. A footnote lurking in the back of the book makes for a better sentence on the page, if only because of the additional pass that it requires. It would help if we saw such standards—the avoidance of plagiarism, the proper citation of sources—not as guidelines imposed by authority from above, but as a set of best practices that well up from inside the work itself. A few days ago, there was yet another plagiarism controversy, which, in what Darin Morgan once called “one of those coincidences found only in real life and great fiction,” also involved Bob Dylan. As Andrea Pitzer of Slate recounts it:

During his official [Nobel] lecture recorded on June 4, laureate Bob Dylan described the influence on him of three literary works from his childhood: The Odyssey, All Quiet on the Western Front, and Moby-Dick. Soon after, writer Ben Greenman noted that in his lecture Dylan seemed to have invented a quote from Moby-Dick…I soon discovered that the Moby-Dick line Dylan dreamed up last week seems to be cobbled together out of phrases on the website SparkNotes, the online equivalent of CliffsNotes…Across the seventy-eight sentences in the lecture that Dylan spends describing Moby-Dick, even a cursory inspection reveals that more than a dozen of them appear to closely resemble lines from the SparkNotes site.

Without drilling into it too deeply, I’ll venture to say that if this all seems weird, it’s because Bob Dylan, of all people, after receiving the Nobel Prize for Literature, might have cribbed statements from an online study guide written by and for college students. But isn’t that how it always goes? Anecdotally speaking, plagiarists seem to draw from secondary or even tertiary sources, like encyclopedias, since the sort of careless or hurried writer vulnerable to indulging in it in the first place isn’t likely to grapple with the originals. The result is an inevitable degradation of information, like a copy of a copy. As Edward Tufte memorably observes in Visual Explanations: “Incomplete plagiarism leads to dequantification.” In context, he’s talking about the way in which illustrations and statistical graphics tend to lose data the more often they get copied. (In The Visual Display of Quantitative Information, he cites a particularly egregious example, in which a reproduction of a scatterplot “forgot to plot the points and simply retraced the grid lines from the original…The resulting figure achieves a graphical absolute zero, a null data-ink ratio.”) But it applies to all kinds of plagiarism, and it makes for a more compelling argument, I think, than the equally valid point that the author is cheating the source and the reader. In art or literature, it’s better to argue from aesthetics than ethics. If fact-checking strengthens a piece of writing, then plagiarism, with its effacing of sources and obfuscation of detail, can only weaken it. One is the opposite of the other, and it’s no surprise that the sins of plagiarism and fabrication tend to go together. They’re symptoms of the same underlying sloppiness, and this is why writers owe it to themselves—not to hypothetical readers or critics—to weed them out. A writer who is sloppy on small matters of fact can hardly avoid doing the same on the higher levels of an argument, and policing the one is a way of keeping an eye on the other. It isn’t always fun. But if you’re going to be a writer, as Dylan himself once said: “Now you’re gonna have to get used to it.”

Jonah Lehrer’s blues


Back in June, when it was first revealed that Jonah Lehrer had reused some of his own work without attribution on the New Yorker blog, an editor for whom I’d written articles in the past sent me an email with the subject line: “Mike Daisey…Jonah Lehrer?” When he asked if I’d be interested in writing a piece about it, I said I’d give it a shot, although I also noted: “I don’t think I’d lump Lehrer in with Daisey just yet.” And in fact, I’ve found myself writing about Lehrer surprisingly often, in pieces for The Daily Beast, The Rumpus, and this blog. If I’ve returned to Lehrer more than once, it’s because I enjoyed a lot of his early work, was mystified by his recent problems, and took a personal interest in his case because we’re about the same age and preoccupied with similar issues of creativity and imagination. But with the revelation that he fabricated quotes in his book and lied about it, as uncovered by Michael C. Moynihan of Tablet, it seems that we may end up lumping Lehrer in with Mike Daisey after all. And this makes me very sad.

What strikes me now is the fact that most of Lehrer’s problems seem to have been the product of haste. He evidently repurposed material on his blog from previously published works because he wasn’t able to produce new content at the necessary rate. The same factor seems to have motivated his uncredited reuse of material in Imagine. And the Bob Dylan quotes he’s accused of fabricating in the same book are so uninteresting (“It’s a hard thing to describe. It’s just this sense that you got something to say”) that it’s difficult to attribute them to calculated fraud. Rather, I suspect that it was just carelessness: the original quotes were garbled in editing, compression, or revision, with Lehrer forgetting where Dylan’s quote left off and his own paraphrase began. A mistake entered one draft and persisted into the next until it wound up in the finished book. And if there’s one set of errors like this, there are likely to be others—Lehrer’s mistakes just happened to be caught by an obsessive Dylan fan and a very good journalist.

Such errors are embarrassing, but they aren’t hard to understand. I’ve learned from experience that if I quote something in an article, I’d better check it against the source at least twice, because all kinds of gremlins can get their claws into it in the meantime. What sets Lehrer’s example apart is that the error survived until the book was in print, which implies an exceptional amount of sloppiness, and when the mistake was revealed, Lehrer only made it worse by lying. As Daisey recently found out, it isn’t the initial mistake that kills you, but the coverup. If Lehrer had simply granted that he couldn’t source the quote and blamed it on an editing error, it would have been humiliating, but not catastrophic. Instead, he spun a comically elaborate series of lies about having access to unreleased documentary footage and being in contact with Bob Dylan’s management, fabrications that fell apart at once. And while I’ve done my best to interpret his previous lapses as generously as possible, I don’t know if I can do that anymore.

In my piece on The Rumpus, I said that Lehrer’s earlier mistakes were venial sins, not mortal ones. Now that he’s slid into the area of mortal sin—not so much for the initial mistake, but for the lies that followed—it’s unclear what comes next. At the time, I wrote:

Lehrer, who has written so often about human irrationality, can only benefit from this reminder of his own fallibility, and if he’s as smart as he seems, he’ll use it in his work, which until now has reflected wide reading and curiosity, but not experience.

Unfortunately, this is no longer true. I don’t think this is the end of Lehrer’s story: he’s undeniably talented, and if James Frey, of all people, can reinvent himself, Lehrer should be able to do so as well. And yet I’m afraid that there are certain elements of his previous career that will be closed off forever. I don’t think we can take his thoughts on the creative process seriously any longer, now that we’ve seen how his own process was so fatally flawed. There is a world elsewhere, of course. And Lehrer is still so young. But where he goes from here is hard to imagine.

Written by nevalalee

July 31, 2012 at 10:01 am

The secret of creativity


On Tuesday, in an article in The Daily Beast, I sampled some of the recent wave of books on consciousness and creativity, including Imagine by Jonah Lehrer and The Power of Habit by Charles Duhigg, and concluded that while such books might make us feel smarter, they aren’t likely to make us more creative or rational than we already were. As far as creativity is concerned, I note, there are no easy answers: even the greatest creative geniuses, like Bach, tend to have the same ratio of hits to misses as their forgotten contemporaries, which means that the best way to have a good idea is simply to have as many ideas, good or bad, as possible. And I close my essay with some genuinely useful advice from Dean Simonton, whom I’ve quoted on this blog before: “The best a creative genius can do is to be as prolific as possible in generating products in hope that at least some subset will survive the test of time.”

So does that mean that all other advice on creativity is worthless? I hope not, because otherwise, I’ve been wasting a lot of time on this blog. I’ve devoted countless posts to discussing creativity tools like intentional randomness and mind maps, talking about various methods of increasing serendipity, and arguing for the importance of thinking in odd moments, like washing the dishes or shaving. For my own part, I still have superstitious habits about creativity that I follow every day. I never write a chapter or essay without doing a mind map, for instance—I did the one below before writing the article in the Beast—and I still generate a random quote from Shakespeare whenever I’m stuck on a problem. And these tricks seem to work, at least for me: I always end up with something that wouldn’t have occurred to me if I hadn’t taken the time.

Yet the crucial word is that last one. Because the more I think about it, the more convinced I am that every useful creativity tool really boils down to just one thing—increasing the amount of time, and the kinds of time, I spend thinking about a problem. When I do a mind map, for instance, I follow a fixed, almost ritualistic set of steps: I take out a pad of paper, write a keyword or two at the center in marker, and let my pen wander across the page. All these steps take time. Which means that making a mind map generates a blank space of forty minutes or so in which I’m just thinking about the problem at hand. And it’s become increasingly clear to me that it isn’t the mind map that matters; it’s the forty minutes. The mind map is just an excuse for me to sit at my desk and think. (This is one reason why I still make my mind maps by hand, rather than with a software program—it extends the length of the process.)

In the end, the only thing that can generate ideas is time spent thinking about them. (Even apparently random moments of insight are the result of long conscious preparation.) I’ve addressed this topic before in my post about Blinn’s Law, in which I speculate that every work of art—a novel, a movie, a work of nonfiction—requires a certain amount of time to be fully realized, no matter how far technology advances, and that much of what we do as artists consists of finding excuses to sit alone at our desks for the necessary year or so. Nearly every creativity tool amounts to a way of tricking my brain into spending time on a problem, either by giving it a pleasant and relatively undemanding task, like drawing a mind map, or seducing it with a novel image or idea that makes its train of thought momentarily more interesting. But the magic isn’t in the trick itself; it’s in the time that follows. And that’s the secret of creativity.

Written by nevalalee

June 7, 2012 at 9:52 am

The right kind of randomness


Yesterday, while talking about my search for serendipity in the New York Times, I wrote: “What the [Times’s] recommendation engine thought I might like to see was far less interesting than what other people unlike me were reading at the same time.” The second I typed that sentence, I knew it wasn’t entirely true, and the more I thought about it, the more questions it seemed to raise. Because, really, most readers of the Times aren’t that much unlike me. The site attracts a wide range of visitors, but its ideal audience, the one it targets and the one that embodies how most of its readers probably like to think of themselves, is fairly consistent: educated, interested in politics and the arts, more likely to watch Mad Men than Two and a Half Men, and rather more liberal than otherwise. The “Most Emailed” list isn’t exactly a random sampling of interesting stories, then, but a sort of idealized picture of what the perfect Times subscriber, with equal access to all parts of the paper, is reading at that particular moment.

As a result, the “serendipity” we find there tends to be skewed in predictable ways. For instance, you’re much more likely to see a column by Paul Krugman than by my conservative college classmate Ross Douthat, who may be a good writer making useful points, but you’d never know it based on how often his columns are shared. (I don’t have any hard numbers to back this up, but I’d guess that Douthat’s columns make the “Most Emailed” list only a fraction of the time.) If I were really in search of true serendipity—that is, to quote George Steiner, if I were trying to find what I wasn’t looking for—I’d read the most viewed or commented articles on, say, the National Review, or, better yet, the National Enquirer, the favorite paper of both Victor Niederhoffer and Nassim Nicholas Taleb. But I don’t. What I really want as a reader, it seems, isn’t pure randomness, but the right kind of randomness. It’s serendipity as curated by the writers and readers of the New York Times, which, while interesting, is only a single slice of the universe of randomness at my disposal.

Is this wrong? Not necessarily. In fact, I’d say there are at least two good reasons to stick to a certain subset of randomness, at least on a daily basis. The first reason has something in common with Brian Uzzi’s fascinating research on the collaborative process behind hit Broadway shows, as described in Jonah Lehrer’s Imagine. What Uzzi discovered is that the most successful shows tended to be the work of teams of artists who weren’t frequent collaborators, but weren’t strangers, either. An intermediate level of social intimacy—not too close, but not too far away—seemed to generate the best results, since strangers struggled to find ways of working together, while those who worked together all the time tended to fall into stale, repetitive patterns. And this strikes me as being generally true of the world of ideas as well. Ideas that are too similar don’t combine in interesting ways, but those that are too far apart tend to uselessly collide. What you want, ideally, is to live in a world of good ideas that want to cohere and set off chains of associations, and for this, an intermediate level of unfamiliarity seems to work the best.

And the second reason is even more important: it’s that randomness alone isn’t enough. It’s good, of course, to seek out new sources of inspiration and ideas, but if done indiscriminately, the result is likely to be nothing but static. Twitter, for instance, is as pure a slice of randomness as you could possibly want, but we very properly try to manage our feeds to include those people we like and find interesting, rather than exposing ourselves to the full noise of the Twitterverse. (That way lies madness.) Even the most enthusiastic proponent of intentional randomness, like me, has to admit that not all sources of information are created equal, and that it’s sometimes necessary to use a trusted home base for our excursions into the unknown. When people engage in bibliomancy—that is, in telling the future by opening a book to a random page—there’s a reason why they’ve historically used books like Virgil or the Bible, rather than a Harlequin romance: any book would generate the necessary level of randomness, but you need a basic level of richness and meaning as well. What I’m saying, I guess, is that if you’re going to be random, you may as well be systematic about it. And the New York Times isn’t a bad place to start.

Written by nevalalee

May 23, 2012 at 10:42 am

Yo-Yo Ma, detective


I always look at a piece of music like a detective novel. Maybe the novel is about a murder. Well, who committed the murder? Why did he do it? My job is to retrace the story so that the audience feels the suspense. So that when the climax comes, they’re right there with me, listening to my beautiful detective story. It’s all about making people care about what happens next.

Yo-Yo Ma, quoted by Jonah Lehrer in Imagine

Written by nevalalee

May 19, 2012 at 9:00 am

Posted in Quote of the Day

