Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Bob Dylan’

Subterranean fact check blues


In Jon Ronson’s uneven but worthwhile book So You’ve Been Publicly Shamed, there’s a fascinating interview with Jonah Lehrer, the superstar science writer who was famously hung out to dry for a variety of scholarly misdeeds. His troubles began when a journalist named Michael C. Moynihan noticed that six quotes attributed to Bob Dylan in Lehrer’s Imagine appeared to have been fabricated. Looking back on this unhappy period, Lehrer blames “a toxic mixture of insecurity and ambition” that led him to take shortcuts—a possibility that occurred to many of us at the time—and concludes:

And then one day you get an email saying that there’s these…Dylan quotes, and they can’t be explained, and they can’t be found anywhere else, and you were too lazy, too stupid, to ever check. I can only wish, and I wish this profoundly, I’d had the temerity, the courage, to do a fact check on my last book. But as anyone who does a fact check knows, they’re not particularly fun things to go through. Your story gets a little flatter. You’re forced to grapple with all your mistakes, conscious and unconscious.

There are at least two striking points about this moment of introspection. One is that the decision whether or not to fact-check a book was left to the author himself, which feels like it’s the wrong way around, although it’s distressingly common. (“Temerity” also seems like exactly the wrong word here, but that’s another story.) The other is that Lehrer avoided asking someone to check his facts because he saw it as a painful, protracted process that obliged him to confront all the places where he had gone wrong.

He’s probably right. A fact check is useful in direct proportion to how much it hurts, and having just endured one recently for my article on L. Ron Hubbard—a subject on whom no amount of factual caution is excessive—I can testify that, as Lehrer says, it isn’t “particularly fun.” You’re asked to provide sources for countless tiny statements, and if you can’t find a source in your notes, you just have to let the point go, even if it kills you. (As far as I can recall, I had to omit exactly one sentence from the Hubbard piece, on a very minor point, and it still rankles me.) But there’s no doubt in my mind that it made the article better. Not only did it catch small errors that otherwise might have slipped into print, but it forced me to go back over every sentence from another angle, critically evaluating my argument and asking whether I was ready to stand by it. It wasn’t fun, but neither are most stages of writing, if you’re doing it right. In a couple of months, I’ll undergo much the same process with my book, as I prepare the endnotes and a bibliography, which is the equivalent of my present self performing a fact check on my past. This sort of scholarly apparatus might seem like a courtesy to the reader, and it is, but it’s also good for the book itself. Even Lehrer seems to recognize this, stating in his attempt at an apology in a keynote speech for the Knight Foundation:

If I’m lucky enough to write again, I won’t write a thing that isn’t fact-checked and fully footnoted. Because here is what I’ve learned: unless I’m willing to continually grapple with my failings—until I’m forced to fix my first draft, and deal with criticism of the second, and submit the final for a good, independent scrubbing—I won’t create anything worth keeping around.

For a writer whose entire brand is built around counterintuitive, surprising insights, this realization might seem bluntly obvious, but it only speaks to how resistant most writers, including me, are to any kind of criticism. We might take it better if we approached it with the notion that it isn’t simply for the sake of our readers, or our hypothetical critics, or even the integrity of the subject matter, but for ourselves. A footnote lurking in the back of the book makes for a better sentence on the page, if only because of the additional pass that it requires. It would help if we saw such standards—the avoidance of plagiarism, the proper citation of sources—not as guidelines imposed by authority from above, but as a set of best practices that well up from inside the work itself. A few days ago, there was yet another plagiarism controversy, which, in what Darin Morgan once called “one of those coincidences found only in real life and great fiction,” also involved Bob Dylan. As Andrea Pitzer of Slate recounts it:

During his official [Nobel] lecture recorded on June 4, laureate Bob Dylan described the influence on him of three literary works from his childhood: The Odyssey, All Quiet on the Western Front, and Moby-Dick. Soon after, writer Ben Greenman noted that in his lecture Dylan seemed to have invented a quote from Moby-Dick…I soon discovered that the Moby-Dick line Dylan dreamed up last week seems to be cobbled together out of phrases on the website SparkNotes, the online equivalent of CliffsNotes…Across the seventy-eight sentences in the lecture that Dylan spends describing Moby-Dick, even a cursory inspection reveals that more than a dozen of them appear to closely resemble lines from the SparkNotes site.

Without drilling into it too deeply, I’ll venture to say that if this all seems weird, it’s because Bob Dylan, of all people, after receiving the Nobel Prize for Literature, might have cribbed statements from an online study guide written by and for college students. But isn’t that how it always goes? Anecdotally speaking, plagiarists seem to draw from secondary or even tertiary sources, like encyclopedias, since the sort of careless or hurried writer vulnerable to indulging in it in the first place isn’t likely to grapple with the originals. The result is an inevitable degradation of information, like a copy of a copy. As Edward Tufte memorably observes in Visual Explanations: “Incomplete plagiarism leads to dequantification.” In context, he’s talking about the way in which illustrations and statistical graphics tend to lose data the more often they get copied. (In The Visual Display of Quantitative Information, he cites a particularly egregious example, in which a reproduction of a scatterplot “forgot to plot the points and simply retraced the grid lines from the original…The resulting figure achieves a graphical absolute zero, a null data-ink ratio.”) But it applies to all kinds of plagiarism, and it makes for a more compelling argument, I think, than the equally valid point that the author is cheating the source and the reader. In art or literature, it’s better to argue from aesthetics than ethics. If fact-checking strengthens a piece of writing, then plagiarism, with its effacing of sources and obfuscation of detail, can only weaken it. One is the opposite of the other, and it’s no surprise that the sins of plagiarism and fabrication tend to go together. They’re symptoms of the same underlying sloppiness, and this is why writers owe it to themselves—not to hypothetical readers or critics—to weed them out. A writer who is sloppy on small matters of fact can hardly avoid doing the same on the higher levels of an argument, and policing the one is a way of keeping an eye on the other. It isn’t always fun. But if you’re going to be a writer, as Dylan himself once said: “Now you’re gonna have to get used to it.”

The stress test


Bob Dylan in Don't Look Back

When you’re trying to figure out how the world works, you can often learn a lot from extreme cases. If you’re putting together a computer simulation, for instance, you can test it for degeneracy by entering zeroes—or another lower or upper bound—for all the parameters and watching how the model responds. In engineering, it’s the principle behind the stress test, in which you subject a system or a machine to unrealistic conditions in order to find its breaking point. Semiconductor manufacturers, for example, talk about process corners, which are the extremes of the parameters within which an integrated circuit is supposed to keep working. As part of the design process, they’ll make corner lots, which are essentially batches of chips that have been deliberately fabricated with these extreme values, and test them against various conditions to see how they hold up. The result can be graphed on a chart called a shmoo plot, which allows you to visualize the operating range of the device that you’re developing. Even if these conditions seem unlikely to come up in practice, they can provide you with valuable data that wouldn’t be obvious using more moderate or conservative assumptions. They can show you the limits of the design. And they can allow you to prepare for “black swan” events that occur more often than experience itself would imply.
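The corner-testing idea described above can be sketched in a few lines of Python. This is a hypothetical illustration, not actual EDA tooling: the model and parameter ranges are invented, and the sketch simply enumerates every combination of each parameter’s lower and upper bound—the “process corners”—and reports which corners make the model break down:

```python
from itertools import product

def corner_cases(param_ranges):
    """Yield every combination of each parameter's lower and upper
    bound -- the 'process corners' of the design space."""
    names = list(param_ranges)
    bounds = [param_ranges[n] for n in names]  # each value is a (lo, hi) pair
    for combo in product(*bounds):
        yield dict(zip(names, combo))

def stress_test(model, param_ranges):
    """Run the model at every corner and collect the corners that fail."""
    failures = []
    for params in corner_cases(param_ranges):
        try:
            result = model(**params)
            ok = result == result  # NaN != NaN, so this rejects degenerate output
        except (ZeroDivisionError, OverflowError, ValueError):
            ok = False
        if not ok:
            failures.append(params)
    return failures

# A toy model that breaks down when gain is zero -- exactly the kind of
# degeneracy that entering zeroes for the parameters is meant to expose.
def toy_model(gain, load):
    return load / gain

failures = stress_test(toy_model, {"gain": (0.0, 10.0), "load": (0.0, 100.0)})
```

Running this flags the two corners where gain is zero, even though a “moderate” test with mid-range values would never have hit them—which is the whole point of driving the parameters to their bounds.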

Over the last month, we’ve experienced two unforgettable examples of such extreme values in the real world. The first, obviously, is the strange case of Donald Trump, who sometimes behaves as if someone had created a political candidate using an avatar editor in a video game and turned all the knobs to their lowest setting. Trump isn’t qualified to hold office. He isn’t a likable human being. You can’t even say that he appeals to the ideologues, since his ideas are either nonexistent, repulsive, or so unreliable as to be meaningless. He isn’t a good debater; he’s at war with the establishment within his own party; he’s gone out of his way to alienate entire groups of voters; and these days, he doesn’t even seem all that interested in campaigning. Yet his support has held more or less steady at forty percent. It’s alarming, but it’s also an immensely important piece of information. Trump’s share of the popular vote, whatever it turns out to be, represents the effective floor for a Republican nominee in this country. It’s hard to imagine what he possibly could have done to make it harder on himself. As a result, he’s established a baseline for candidates in the future, and he’s taught us that the marginal difference between the worst and the best conservative candidate amounts to something like ten percentage points. If this were a simulation, we’d have trouble believing it.

Donald Trump

But we recently saw another test case, at the opposite end of the spectrum, when Bob Dylan was awarded the Nobel Prize in Literature. Dylan has failed to even acknowledge the honor, which has led at least one member of the Swedish Academy to call his behavior “impolite and arrogant.” Yet his response only underlines what many of us subconsciously realized when the award was first announced. It’s going to be harder to take the Nobel Prize seriously in the future, not because Dylan isn’t a deserving recipient, but because when you put the prize next to him, it looks small. Dylan, like Trump, is an extreme case: he’s already acquired all the wealth, critical acclaim, and popular success that any artist could desire. This means that his selection gives us valuable insight into the real worth of a Nobel Prize, when you’ve stripped away all of the usual benefits that it confers. The answer isn’t all that flattering to the prize itself. In fact, it starts to look like it doesn’t mean anything. When you give the most prestigious award in existence to one of the world’s most famous men, it’s a stress test, not just for the Nobel Prize, but for all prizes whatsoever. The committee presumably hoped to make a statement by picking a popular artist, but it would have been better off continuing to award European poets and playwrights who are virtually unknown outside their native countries. By presenting it to Dylan, they’ve inadvertently exposed their own irrelevance.

And such examples are interesting primarily because of the light that they shed on more routine cases. Trump is less illuminating in isolation, since I doubt we’ll see a candidate like him ever again, than in the perspective he affords on all the little Trumps with whom he surrounds himself. He tells us how large a proportion of the Republican base is utterly indifferent to its candidate’s strengths or weaknesses, which is a data point that needs to be taken into account in every future election. Bob Dylan’s lesson is less obvious, but even more instructive. There’s only one Dylan, but he’s just an extreme instance of what every artist ought to be: you stick to your principles, you don’t sell out, you follow your own intuitions rather than those of your audience, and you find satisfaction in the work itself. We should all be little Dylans. If the Nobel Prize doesn’t make a difference to him, then maybe any material reward whatsoever shouldn’t matter to any working artist. (And yes, this includes the money, which few artists would turn down, but which ultimately seems unnecessary, or at least beside the point.) From now on, whenever we hear that someone has won an award, we should ask ourselves: “How would this change Bob Dylan’s life?” The answer is that it wouldn’t, which should serve as a reminder to those who strive to embody his virtues without his fame. The Nobel committee couldn’t add a cubit to Dylan’s stature, any more than Trump could lower the bottom any further. And we’ve learned a lot from them both—which doesn’t make it any less stressful.

Written by nevalalee

October 27, 2016 at 9:13 am

The power of three


Bob Dylan

I didn’t invent this style. It had been shown to me in the early sixties by Lonnie Johnson…Lonnie took me aside one night and showed me a style of playing based on an odd- instead of even-number system. He had me play chords and he demonstrated how to do it. This was just something he knew about, not necessarily something he used because he did so many different kinds of songs. He said, “This might help you,” and I had the idea that he was showing me something secretive, though it didn’t make sense to me at that time because I needed to strum the guitar in order to get my ideas across. It’s a highly controlled system of playing and relates to the notes of a scale, how they combine numerically, how they form melodies out of triplets and are axiomatic to the rhythm and the chord changes…

The system works in a cyclical way. Because you’re thinking in odd numbers instead of even numbers, you’re playing with a different value system. Popular music is usually based on the number two and then filled in with fabrics, colors, effects, and technical wizardry to make a point. But the total effect is usually depressing and oppressive and a dead end which at the most can only last in a nostalgic way. If you’re using an odd numerical system, things that strengthen a performance automatically begin to happen and make it memorable for the ages. You don’t have to plan or think ahead…There’s no mystery to it and it’s not a technical trick. The scheme is for real.

For me, this style would be most advantageous, like a delicate design that would arrange the structure of whatever piece I was performing. The listener would recognize and feel the dynamics immediately. Things could explode or retreat back at any time and there would be no way to predict the consciousness of any song. And because this works on its own mathematical formula, it can’t miss. I’m not a numerologist. I don’t know why the number three is more metaphysically powerful than the number two, but it is. Passion and enthusiasm, which sometimes can be enough to sway a crowd, aren’t even necessary. You can manufacture faith out of nothing and there are an infinite number of patterns and lines that connect from key to key—all deceptively simple. You gain power with the least amount of effort, trust that the listeners make their own connections, and it’s very seldom that they don’t. Miscalculations can also cause no serious harm. As long as you recognize it, you can turn the dynamic around architecturally in a second.

Bob Dylan, Chronicles

Written by nevalalee

October 15, 2016 at 7:30 am

Altered states of conscientiousness


Bob Dylan in Don't Look Back

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What pop culture is best consumed in an altered state?”

When Bob Dylan first met the Beatles, the story goes, he was astonished to learn that they’d never used drugs. (Apparently, the confusion was all caused by a mondegreen: Dylan misheard a crucial lyric from “I Want to Hold Your Hand” as “I get high” instead of “I can’t hide.”) This was back in the early days, of course, and later, the Beatles would become part of the psychedelic culture in ways that can’t be separated from their greatest achievements. Still, it’s revealing that their initial triumphs emerged from a period of clean living. Drugs can encourage certain qualities, but musicianship and disciplined invention aren’t among them, and I find it hard to believe that Lennon and McCartney would have gained much, if anything, from controlled substances without that essential foundation—certainly not to the point where Dylan would have wanted to meet them in the first place. For artists, drugs are a kind of force multiplier, an ingredient that can enhance elements that are already there, but can’t generate something from nothing. As Norman Mailer, who was notably ambivalent about his own drug use, liked to say, drugs are a way of borrowing on the future, but those seeds can wither and die if they don’t fall on soil that has been prepared beforehand.

Over the years, I’ve read a lot written by or about figures in the drug culture, from Carlos Castaneda to Daniel Pinchbeck to The Electric Kool-Aid Acid Test, and I’m struck by a common pattern: if drugs lead to a state of perceived insight, it usually takes the form of little more than a conviction that everyone should try drugs. Drug use has been a transformative experience for exceptional individuals as different as Aldous Huxley, Robert Anton Wilson, and Steve Jobs, but it tends to be private, subjective, and uncommunicable. As such, it doesn’t have much to do with art, which is founded on its functional objectivity—that is, on its capacity to be conveyed more or less intact from one mind to the next. And it creates a lack of critical discrimination that can be dangerous to artists when extended over time. If marijuana, as South Park memorably pointed out, makes you fine with being bored, it’s the last thing artists need, since art boils down to nothing but a series of deliberate strategies for dealing with, confronting, or eradicating boredom. When you’re high, you’re easily amused, which makes you less likely to produce anything that can sustain the interest of someone who isn’t in the same state of chemical receptivity.

2001: A Space Odyssey

And the same principle applies to the artistic experience from the opposite direction. When someone says that 2001 is better on pot, that isn’t saying much, since every movie seems better on pot. Again, however, this has a way of smoothing out and trivializing a movie’s real merits. Kubrick’s film comes as close as any ever made to encouraging a transcendent state without the need of mind-altering substances, and his own thoughts on the subject are worth remembering:

[Drug use] tranquilizes the creative personality, which thrives on conflict and on the clash and ferment of ideas…One of the things that’s turned me against LSD is that all the people I know who use it have a peculiar inability to distinguish between things that are really interesting and stimulating and things that appear so in the state of universal bliss the drug induces on a good trip. They seem to completely lose their critical faculties and disengage themselves from some of the most stimulating areas of life.

Which isn’t to say that a temporary relaxation of the faculties doesn’t have its place. I’ll often have a beer while watching a movie or television show, and my philosophy here is similar to that of chef David Chang, who explains his preference for “the lightest, crappiest beer”:

Let me make one ironclad argument for shitty beer: It pairs really well with food. All food. Think about how well champagne pairs with almost anything. Champagne is not a flavor bomb! It’s bubbly and has a little hint of acid and is cool and crisp and refreshing. Cheap beer is, no joke, the champagne of beers.

And a Miller Lite—which I’m not embarrassed to proclaim as my beer of choice—pairs well with almost any kind of entertainment, since it both gives and demands so little. At minimum, it makes me the tiniest bit more receptive to whatever I’m being shown, not enough to forgive its flaws, but enough to encourage me to meet it halfway. For much the same reason, I no longer drink while working: even that little extra nudge can be fatal when it comes to evaluating whether something I’ve written is any good. Because Kubrick, as usual, deserves the last word: “Perhaps when everything is beautiful, nothing is beautiful.”

Written by nevalalee

March 20, 2015 at 9:16 am

Jonah Lehrer’s blues


Back in June, when it was first revealed that Jonah Lehrer had reused some of his own work without attribution on the New Yorker blog, an editor for whom I’d written articles in the past sent me an email with the subject line: “Mike Daisey…Jonah Lehrer?” When he asked if I’d be interested in writing a piece about it, I said I’d give it a shot, although I also noted: “I don’t think I’d lump Lehrer in with Daisey just yet.” And in fact, I’ve found myself writing about Lehrer surprisingly often, in pieces for The Daily Beast, The Rumpus, and this blog. If I’ve returned to Lehrer more than once, it’s because I enjoyed a lot of his early work, was mystified by his recent problems, and took a personal interest in his case because we’re about the same age and preoccupied with similar issues of creativity and imagination. But with the revelation that he fabricated quotes in his book and lied about it, as uncovered by Michael C. Moynihan of Tablet, it seems that we may end up lumping Lehrer in with Mike Daisey after all. And this makes me very sad.

What strikes me now is the fact that most of Lehrer’s problems seem to have been the product of haste. He evidently repurposed material on his blog from previously published works because he wasn’t able to produce new content at the necessary rate. The same factor seems to have motivated his uncredited reuse of material in Imagine. And the Bob Dylan quotes he’s accused of fabricating in the same book are so uninteresting (“It’s a hard thing to describe. It’s just this sense that you got something to say”) that it’s difficult to attribute them to calculated fraud. Rather, I suspect that it was just carelessness: the original quotes were garbled in editing, compression, or revision, with Lehrer forgetting where Dylan’s quote left off and his own paraphrase began. A mistake entered one draft and persisted into the next until it wound up in the finished book. And if there’s one set of errors like this, there are likely to be others—Lehrer’s mistakes just happened to be caught by an obsessive Dylan fan and a very good journalist.

Such errors are embarrassing, but they aren’t hard to understand. I’ve learned from experience that if I quote something in an article, I’d better check it against the source at least twice, because all kinds of gremlins can get their claws into it in the meantime. What sets Lehrer’s example apart is that the error survived until the book was in print, which implies an exceptional amount of sloppiness, and when the mistake was revealed, Lehrer only made it worse by lying. As Daisey recently found out, it isn’t the initial mistake that kills you, but the coverup. If Lehrer had simply granted that he couldn’t source the quote and blamed it on an editing error, it would have been humiliating, but not catastrophic. Instead, he spun a comically elaborate series of lies about having access to unreleased documentary footage and being in contact with Bob Dylan’s management, fabrications that fell apart at once. And while I’ve done my best to interpret his previous lapses as generously as possible, I don’t know if I can do that anymore.

In my piece on The Rumpus, I said that Lehrer’s earlier mistakes were venial sins, not mortal ones. Now that he’s slid into the area of mortal sin—not so much for the initial mistake, but for the lies that followed—it’s unclear what comes next. At the time, I wrote:

Lehrer, who has written so often about human irrationality, can only benefit from this reminder of his own fallibility, and if he’s as smart as he seems, he’ll use it in his work, which until now has reflected wide reading and curiosity, but not experience.

Unfortunately, this is no longer true. I don’t think this is the end of Lehrer’s story: he’s undeniably talented, and if James Frey, of all people, can reinvent himself, Lehrer should be able to do so as well. And yet I’m afraid that there are certain elements of his previous career that will be closed off forever. I don’t think we can take his thoughts on the creative process seriously any longer, now that we’ve seen how his own process was so fatally flawed. There is a world elsewhere, of course. And Lehrer is still so young. But where he goes from here is hard to imagine.

Written by nevalalee

July 31, 2012 at 10:01 am

“You stitched and pressed and packed and drove”


Sometimes you say things in songs even if there’s a small chance of them being true. And sometimes you say things that have nothing to do with the truth of what you want to say and sometimes you say things that everyone knows to be true. Then again, at the same time, you’re thinking that the only truth on earth is that there is no truth on it. Whatever you are saying, you’re saying in a ricky-tick way. There’s never time to reflect. You stitched and pressed and packed and drove, is what you did.

Bob Dylan, Chronicles, Volume One

Written by nevalalee

December 10, 2011 at 8:00 am

Posted in Quote of the Day


Quote of the Day


Creativity is not like a freight train going down the tracks. It’s something that has to be caressed and treated with a great deal of respect…You’ve got to program your brain not to think too much.

Bob Dylan

Written by nevalalee

May 25, 2011 at 7:52 am

Posted in Quote of the Day

