Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Edward Tufte’

Subterranean fact check blues


In Jon Ronson’s uneven but worthwhile book So You’ve Been Publicly Shamed, there’s a fascinating interview with Jonah Lehrer, the superstar science writer who was famously hung out to dry for a variety of scholarly misdeeds. His troubles began when a journalist named Michael C. Moynihan noticed that six quotes attributed to Bob Dylan in Lehrer’s Imagine appeared to have been fabricated. Looking back on this unhappy period, Lehrer blames “a toxic mixture of insecurity and ambition” that led him to take shortcuts—a possibility that occurred to many of us at the time—and concludes:

And then one day you get an email saying that there’s these…Dylan quotes, and they can’t be explained, and they can’t be found anywhere else, and you were too lazy, too stupid, to ever check. I can only wish, and I wish this profoundly, I’d had the temerity, the courage, to do a fact check on my last book. But as anyone who does a fact check knows, they’re not particularly fun things to go through. Your story gets a little flatter. You’re forced to grapple with all your mistakes, conscious and unconscious.

There are at least two striking points about this moment of introspection. One is that the decision whether or not to fact-check a book was left to the author himself, which feels like it’s the wrong way around, although it’s distressingly common. (“Temerity” also seems like exactly the wrong word here, but that’s another story.) The other is that Lehrer avoided asking someone to check his facts because he saw it as a painful, protracted process that obliged him to confront all the places where he had gone wrong.

He’s probably right. A fact check is useful in direct proportion to how much it hurts, and having just endured one recently for my article on L. Ron Hubbard—a subject on whom no amount of factual caution is excessive—I can testify that, as Lehrer says, it isn’t “particularly fun.” You’re asked to provide sources for countless tiny statements, and if you can’t find one in your notes, you just have to let it go, even if it kills you. (As far as I can recall, I had to omit exactly one sentence from the Hubbard piece, on a very minor point, and it still rankles me.) But there’s no doubt in my mind that it made the article better. Not only did it catch small errors that otherwise might have slipped into print, but it forced me to go back over every sentence from another angle, critically evaluating my argument and asking whether I was ready to stand by it. It wasn’t fun, but neither are most stages of writing, if you’re doing it right. In a couple of months, I’ll undergo much the same process with my book, as I prepare the endnotes and a bibliography, which is the equivalent of my present self performing a fact check on my past. This sort of scholarly apparatus might seem like a courtesy to the reader, and it is, but it’s also good for the book itself. Even Lehrer seems to recognize this, stating in his attempt at an apology in a keynote speech for the Knight Foundation:

If I’m lucky enough to write again, I won’t write a thing that isn’t fact-checked and fully footnoted. Because here is what I’ve learned: unless I’m willing to continually grapple with my failings—until I’m forced to fix my first draft, and deal with criticism of the second, and submit the final for a good, independent scrubbing—I won’t create anything worth keeping around.

For a writer whose entire brand is built around counterintuitive, surprising insights, this realization might seem bluntly obvious, but it only speaks to how resistant most writers, including me, are to any kind of criticism. We might take it better if we approached it with the notion that it isn’t simply for the sake of our readers, or our hypothetical critics, or even the integrity of the subject matter, but for ourselves. A footnote lurking in the back of the book makes for a better sentence on the page, if only because of the additional pass that it requires. It would help if we saw such standards—the avoidance of plagiarism, the proper citation of sources—not as guidelines imposed by authority from above, but as a set of best practices that well up from inside the work itself. A few days ago, there was yet another plagiarism controversy, which, in what Darin Morgan once called “one of those coincidences found only in real life and great fiction,” also involved Bob Dylan. As Andrea Pitzer of Slate recounts it:

During his official [Nobel] lecture recorded on June 4, laureate Bob Dylan described the influence on him of three literary works from his childhood: The Odyssey, All Quiet on the Western Front, and Moby-Dick. Soon after, writer Ben Greenman noted that in his lecture Dylan seemed to have invented a quote from Moby-Dick…I soon discovered that the Moby-Dick line Dylan dreamed up last week seems to be cobbled together out of phrases on the website SparkNotes, the online equivalent of CliffsNotes…Across the seventy-eight sentences in the lecture that Dylan spends describing Moby-Dick, even a cursory inspection reveals that more than a dozen of them appear to closely resemble lines from the SparkNotes site.

Without drilling into it too deeply, I’ll venture to say that if this all seems weird, it’s because Bob Dylan, of all people, after receiving the Nobel Prize for Literature, might have cribbed statements from an online study guide written by and for college students. But isn’t that how it always goes? Anecdotally speaking, plagiarists seem to draw from secondary or even tertiary sources, like encyclopedias, since the sort of careless or hurried writer vulnerable to indulging in it in the first place isn’t likely to grapple with the originals. The result is an inevitable degradation of information, like a copy of a copy. As Edward Tufte memorably observes in Visual Explanations: “Incomplete plagiarism leads to dequantification.” In context, he’s talking about the way in which illustrations and statistical graphics tend to lose data the more often they get copied. (In The Visual Display of Quantitative Information, he cites a particularly egregious example, in which a reproduction of a scatterplot “forgot to plot the points and simply retraced the grid lines from the original…The resulting figure achieves a graphical absolute zero, a null data-ink ratio.”) But it applies to all kinds of plagiarism, and it makes for a more compelling argument, I think, than the equally valid point that the author is cheating the source and the reader. In art or literature, it’s better to argue from aesthetics than ethics. If fact-checking strengthens a piece of writing, then plagiarism, with its effacing of sources and obfuscation of detail, can only weaken it. One is the opposite of the other, and it’s no surprise that the sins of plagiarism and fabrication tend to go together. They’re symptoms of the same underlying sloppiness, and this is why writers owe it to themselves—not to hypothetical readers or critics—to weed them out. A writer who is sloppy on small matters of fact can hardly avoid doing the same on the higher levels of an argument, and policing the one is a way of keeping an eye on the other. It isn’t always fun. But if you’re going to be a writer, as Dylan himself once said: “Now you’re gonna have to get used to it.”

The AutoContent Wizard, Part 1



A few days ago, while helping my wife prepare a presentation for a class she’s teaching as an adjunct lecturer at the Medill School of Journalism, I was reminded of how much I hate PowerPoint. This isn’t a knock at my wife, who is taking her role there very seriously, or even at the slideshow format itself, which can be useful, as she intends to employ it, in presenting exhibits or examples for a classroom discussion. It’s more a feeling of creeping terror at the slideshow state of mind, the one that links both the stodgiest of corporations and the startups that like to think they have nothing in common with the old guard. Every tech pundit or entrepreneur is expected to have a deck, a set of slides—presumably stored in a handy flash drive that can be kept on one’s person at all times—with which he or she can distill a life’s work into a few images, ready to be shown at an impromptu TED talk. And it horrifies me, not just as a former office worker who has spent countless hours suffering through interminable slide presentations, but as someone who cares about the future of ideas. I’m not the first person to say this, of course, and at this point, it would probably be more interesting to mount a legitimate defense of the slideshow mentality than to make fun of it yet again. But the more I look around at our media landscape, the more it seems like a PowerPoint world, crafted and marketed to us by people who don’t want to think in terms that can’t be boiled down to a slick presentation.

The most useful critic on the subject, not surprisingly, is the legendary statistician and information designer Edward Tufte, whose book Beautiful Evidence includes a savage takedown of PowerPoint and the sloppy thinking it encourages. Tufte includes the usual samples of incomprehensible slides, but he also gets at a crucial point that explains how we got here:

Yet PowerPoint is entirely presenter-oriented, and not content-oriented, not audience-oriented. The claims of [PowerPoint] marketing are addressed to speakers: “A cure for the presentation jitters.” “Get yourself organized.” “Use the AutoContent Wizard to figure out what you want to say.” And fans of PowerPoint are presenters, rarely audience members.

This, I think, is the crux of the matter. Any form of communication that is designed more for the convenience of the creator than for its audience is inherently corrupt, and it tends to corrupt serious thought—or even simple clarity—the more frequently it gets used. We’re often told that people fear public speaking more than death, but the solution turns into a kind of living death for its listeners. And its most obvious manifestation is the dreaded bullet point, or, more accurately, the nested list, a visual cliché intended to create the impression of clear, logical, sequential thinking where none actually exists.


Bullet points aren’t anything new, as Richard Feynman learned back when he was wading through government documents as part of his investigation of the Challenger explosion: “Then we learned about ‘bullets’—little black circles in front of phrases that were supposed to summarize things. There was one after another of these little goddamn bullets in our briefing books and on slides.” But the way PowerPoint not only encourages but essentially forces users to structure their presentations as nested bullet points, no matter how incoherent the underlying argument, points to something truly insidious: the assumption that presentation counts for more than content, and that it’s fine to settle for shoddy, disorganized thinking as long as it follows the same set of stereotyped beats. As Ian Parker wrote in an excellent New Yorker piece on the subject from more than a decade ago:

But PowerPoint also has a private, interior influence. It edits ideas. It is, almost surreptitiously, a business manual as well as a business suit, with an opinion—an oddly pedantic, prescriptive opinion—about the way we should think. It helps you make a case, but it also makes its own case about how to organize information, how much information to organize, how to look at the world.

Parker cites the “Motivating a Team” template, which invites the anxious presenter to fill in a series of blanks—”Generate possible solutions with green light, nonjudgmental thinking”—and cheerfully advises: “Have an inspirational close.” A tool supposedly made for presentations, in short, shades imperceptibly into a formula for thought, based on the implicit premise that ideas themselves are interchangeable.

The templates that Parker is mocking here used to appear in a feature known as the AutoContent Wizard, which sounds a little like what a prolific blogger might want to be called in bed. In fact, it was a set of templates—including my favorite, "Communicating Bad News"—that amounted to a gesture of contempt toward everyone who creates or is subjected to such presentations, as Parker’s account of its creation makes clear:

AutoContent was added in the mid-nineties, when Microsoft learned that some would-be presenters were uncomfortable with a blank PowerPoint page—it was hard to get started. “We said, ‘What we need is some automatic content!'” a former Microsoft developer recalls, laughing. “‘Punch the button and you’ll have a presentation.'” The idea, he thought, was “crazy.” And the name was meant as a joke. But Microsoft took the idea and kept the name—a rare example of a product named in outright mockery of its target customers.

The AutoContent Wizard was phased out a few years ago, but even if we’re lucky enough not to work at a company at which we’re subjected to its successors, the attitudes behind it have expanded to cover much of the content that we consume on a daily basis. I’m speaking of the slideshow and the listicle, which have long since become the models for how online content can be profitably packaged and delivered. Tomorrow, I’ll be talking more about how PowerPoint gave birth to these secret children, how the assumptions they reflect come from the very top, and what it means to live in a world in which AutoContent, not content, is king.

Written by nevalalee

September 21, 2015 at 9:47 am

The Travolta moment



There’s a moment halfway through Jonathan Franzen’s The Corrections when Enid Lambert, the matriarch of the novel’s dysfunctional Midwestern family, visits a doctor on a cruise ship. It’s an important scene—Enid leaves with a handful of antidepressants that will play a big role later in the story—and Franzen lavishes his usual amount of care on the sequence, which runs for a full nine pages. But here’s how he chooses to describe the doctor on his first appearance:

He had a large, somewhat coarse-skinned face like the face of the Italian-American actor people loved, the one who once starred as an angel and another time as a disco dancer.

I adore The Corrections, but this is an embarrassing sentence—one of the worst I’ve ever seen pass the pen of a major novelist. It’s particularly surprising coming from Franzen, who has thought as urgently and probingly as any writer alive about the problem of voice. But it’s also the kind of lapse that turns out to be unexpectedly instructive, precisely because it comes from an author who really ought to know better.

So why does this sentence grate so much? Let’s break down the reasons one at a time:

  1. Franzen clearly wants to tell us that the doctor looks like John Travolta, but he’s too shy to come out and say so, so he uses more than twenty words to convey what could have easily been expressed in two.
  2. In the process, he’s false to his character. No woman of Enid’s generation and background would have any trouble coming up with Travolta’s name, especially if she were familiar with his role in Michael, of all movies. It’s not like she’s trying to remember, say, Richard Jenkins.
  3. Worst of all, it takes us out of the story. Instead of focusing on the moment—which happens to be a crucial turning point for Enid’s character—we’re distracted by Franzen’s failure of style.

And the punchline here is that a lesser novelist would simply have said that the doctor looked like Travolta and been done with it. Franzen, an agonizingly smart writer, senses how lazy this is, so he backs away, but not nearly far enough. And the result reads like nothing a recognizable human being would feel or say.


I got to thinking about this after reading John McPhee’s recent New Yorker piece about frames of reference. McPhee’s pet peeve is when authors describe a person’s appearance by leaning on a perceived resemblance to a famous face, as in this example from Ian Frazier: “She looks enough like the late Bea Arthur, the star of the nineteen-seventies sitcom Maude, that it would be negligent not to say so.” Clearly, if you don’t remember how Bea Arthur looks, this description isn’t very useful. And while any such discussion tends to turn into a personal referendum on which references are obvious and which aren’t—McPhee claims he doesn’t know who Gene Wilder is, for instance—his point is a valid one:

If you say someone looks like Tom Cruise—and you let it go at that—you are asking Tom Cruise to do your writing for you. Your description will fail when your reader doesn’t know who Tom Cruise is.

And references that seem obvious now may not feel that way in twenty years. McPhee concludes, reasonably, that if you’re going to compare a character to a celebrity, you need to pay back that borrowed vividness by amplifying it with a line of description of your own, as when Joel Achenbach follows up his reference to Gene Wilder by referring to the subject’s “manic energy.”

When we evaluate Franzen’s Travolta moment in this light, it starts to look even worse. It reminds me a little of the statistician Edward Tufte, who famously declared that graphical excellence gives the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space. In his classic The Visual Display of Quantitative Information, he introduces the concept of the data-ink ratio, which consists of the amount of data ink divided by the total ink used to print a statistical graphic. (“Data ink” is the ink in a graph or chart that can’t be erased without a real loss of information.) Ideally, as large a proportion of the ink as possible should be devoted to the presentation of the data, rather than to redundant material. As an example of the ratio at its worst, Tufte reprints a graph from a textbook that erased all the data points while retaining the grid lines, noting drily: “The resulting figure achieves a graphical absolute zero, a null data-ink ratio.” And that’s what Franzen gives us here. In twenty words, he offers no information that the reader isn’t asked to supply on his or her own. To be fair, Franzen is usually better than this. But here, it’s like giving us a female character and saying that she looks like Adele Dazeem.
