Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Slate’

Subterranean fact check blues


In Jon Ronson’s uneven but worthwhile book So You’ve Been Publicly Shamed, there’s a fascinating interview with Jonah Lehrer, the superstar science writer who was famously hung out to dry for a variety of scholarly misdeeds. His troubles began when a journalist named Michael C. Moynihan noticed that six quotes attributed to Bob Dylan in Lehrer’s Imagine appeared to have been fabricated. Looking back on this unhappy period, Lehrer blames “a toxic mixture of insecurity and ambition” that led him to take shortcuts—a possibility that occurred to many of us at the time—and concludes:

And then one day you get an email saying that there’s these…Dylan quotes, and they can’t be explained, and they can’t be found anywhere else, and you were too lazy, too stupid, to ever check. I can only wish, and I wish this profoundly, I’d had the temerity, the courage, to do a fact check on my last book. But as anyone who does a fact check knows, they’re not particularly fun things to go through. Your story gets a little flatter. You’re forced to grapple with all your mistakes, conscious and unconscious.

There are at least two striking points about this moment of introspection. One is that the decision whether or not to fact-check a book was left to the author himself, which feels like it’s the wrong way around, although it’s distressingly common. (“Temerity” also seems like exactly the wrong word here, but that’s another story.) The other is that Lehrer avoided asking someone to check his facts because he saw it as a painful, protracted process that obliged him to confront all the places where he had gone wrong.

He’s probably right. A fact check is useful in direct proportion to how much it hurts, and having just endured one for my article on L. Ron Hubbard—a subject on whom no amount of factual caution is excessive—I can testify that, as Lehrer says, it isn’t “particularly fun.” You’re asked to provide sources for countless tiny statements, and if you can’t find one in your notes, you just have to let it go, even if it kills you. (As far as I can recall, I had to omit exactly one sentence from the Hubbard piece, on a very minor point, and it still rankles me.) But there’s no doubt in my mind that it made the article better. Not only did it catch small errors that otherwise might have slipped into print, but it forced me to go back over every sentence from another angle, critically evaluating my argument and asking whether I was ready to stand by it. It wasn’t fun, but neither are most stages of writing, if you’re doing it right. In a couple of months, I’ll undergo much the same process with my book, as I prepare the endnotes and a bibliography, which is the equivalent of my present self performing a fact check on my past. This sort of scholarly apparatus might seem like a courtesy to the reader, and it is, but it’s also good for the book itself. Even Lehrer seems to recognize this, stating in his attempt at an apology in a keynote speech for the Knight Foundation:

If I’m lucky enough to write again, I won’t write a thing that isn’t fact-checked and fully footnoted. Because here is what I’ve learned: unless I’m willing to continually grapple with my failings—until I’m forced to fix my first draft, and deal with criticism of the second, and submit the final for a good, independent scrubbing—I won’t create anything worth keeping around.

For a writer whose entire brand is built around counterintuitive, surprising insights, this realization might seem bluntly obvious, but it only speaks to how resistant most writers, including me, are to any kind of criticism. We might take it better if we approached it with the notion that it isn’t simply for the sake of our readers, or our hypothetical critics, or even the integrity of the subject matter, but for ourselves. A footnote lurking in the back of the book makes for a better sentence on the page, if only because of the additional pass that it requires. It would help if we saw such standards—the avoidance of plagiarism, the proper citation of sources—not as guidelines imposed by authority from above, but as a set of best practices that well up from inside the work itself. A few days ago, there was yet another plagiarism controversy, which, in what Darin Morgan once called “one of those coincidences found only in real life and great fiction,” also involved Bob Dylan. As Andrea Pitzer of Slate recounts it:

During his official [Nobel] lecture recorded on June 4, laureate Bob Dylan described the influence on him of three literary works from his childhood: The Odyssey, All Quiet on the Western Front, and Moby-Dick. Soon after, writer Ben Greenman noted that in his lecture Dylan seemed to have invented a quote from Moby-Dick…I soon discovered that the Moby-Dick line Dylan dreamed up last week seems to be cobbled together out of phrases on the website SparkNotes, the online equivalent of CliffsNotes…Across the seventy-eight sentences in the lecture that Dylan spends describing Moby-Dick, even a cursory inspection reveals that more than a dozen of them appear to closely resemble lines from the SparkNotes site.

Without drilling into it too deeply, I’ll venture to say that if this all seems weird, it’s because Bob Dylan, of all people, after receiving the Nobel Prize for Literature, might have cribbed statements from an online study guide written by and for college students. But isn’t that how it always goes? Anecdotally speaking, plagiarists seem to draw from secondary or even tertiary sources, like encyclopedias, since the sort of careless or hurried writer vulnerable to indulging in it in the first place isn’t likely to grapple with the originals. The result is an inevitable degradation of information, like a copy of a copy. As Edward Tufte memorably observes in Visual Explanations: “Incomplete plagiarism leads to dequantification.” In context, he’s talking about the way in which illustrations and statistical graphics tend to lose data the more often they get copied. (In The Visual Display of Quantitative Information, he cites a particularly egregious example, in which a reproduction of a scatterplot “forgot to plot the points and simply retraced the grid lines from the original…The resulting figure achieves a graphical absolute zero, a null data-ink ratio.”) But it applies to all kinds of plagiarism, and it makes for a more compelling argument, I think, than the equally valid point that the author is cheating the source and the reader. In art or literature, it’s better to argue from aesthetics than ethics. If fact-checking strengthens a piece of writing, then plagiarism, with its effacing of sources and obfuscation of detail, can only weaken it. One is the opposite of the other, and it’s no surprise that the sins of plagiarism and fabrication tend to go together. They’re symptoms of the same underlying sloppiness, and this is why writers owe it to themselves—not to hypothetical readers or critics—to weed them out. A writer who is sloppy on small matters of fact can hardly avoid doing the same on the higher levels of an argument, and policing the one is a way of keeping an eye on the other. It isn’t always fun. But if you’re going to be a writer, as Dylan himself once said: “Now you’re gonna have to get used to it.”

Avocado’s number


Earlier this month, you may have noticed a sudden flurry of online discussion around avocado toast. It was inspired by a remark by a property developer named Tim Gurner, who said to the Australian version of 60 Minutes: “When I was trying to buy my first home, I wasn’t buying smashed avocados for nineteen bucks and four coffees at four dollars each.” Gurner’s statement, which was fairly bland and unmemorable in itself, was promptly transformed into the headline “Millionaire to Millennials: Stop Buying Avocado Toast If You Want to Buy a Home.” From there, it became the target of widespread derision, with commentators pointing out that if owning a house seems increasingly out of reach for many young people, it has more to do with rising real estate prices, low wages, and student loans than with their irresponsible financial habits. And the fact that such a forgettable sentiment became the focal point for so much rage—mostly from people who probably didn’t see the original interview—implies that it merely catalyzed a feeling that had been building for some time. Millennials, it’s fair to say, have been getting it from both sides. When they try to be frugal by using paper towels as napkins, they’re accused of destroying the napkin industry, but they’re also scolded over spending too much at brunch. They’re informed that their predicament is their own fault, unless they’re also being idealized as “joyfully engaged in a project of creative destruction,” as Laura Marsh noted last year in The New Republic. “There’s nothing like being told precarity is actually your cool lifestyle choice,” Marsh wrote, unless it’s being told, as the middle class likes to maintain to the poor, that financial stability is only a matter of hard work and a few small sacrifices.

It also reflects an overdue rejection of what used to be called the latte factor, as popularized by the financial writer David Bach in such books as Smart Women Finish Rich. As Helaine Olen writes in Slate:

Bach calculated that eschewing a five-dollar daily bill at Starbucks—because who, after all, really needs anything at Starbucks?—for a double nonfat latte and biscotti with chocolate could net a prospective saver $150 a month, or $2,000 a year. If she then took that money and put it all in stocks that Bach, ever an optimist, assumed would grow at an average annual rate of eleven percent a year, “chances are that by the time she reached sixty-five, she would have more than $2 million sitting in her account.”
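Olen’s summary compresses a lot of arithmetic, so here is a minimal sketch, in Python, of the future-value calculation behind the latte factor. The quote specifies only the $150 a month and the eleven percent return; the forty-year horizon and the five percent comparison rate are illustrative assumptions of mine, not figures from Bach or Olen.

# A minimal sketch of the compound-interest arithmetic behind the "latte factor."
# The ages and the comparison rate below are illustrative assumptions, not
# Bach's actual worksheet: Olen's quote gives only the $150-a-month figure
# and the eleven percent return.

def future_value(monthly_contribution, annual_rate, years):
    """Future value of a fixed monthly contribution, compounded monthly."""
    r = annual_rate / 12          # monthly rate
    n = years * 12                # number of monthly deposits
    return monthly_contribution * ((1 + r) ** n - 1) / r

years_saving = 65 - 25            # assume the saver starts at 25 and stops at 65

# Bach's optimistic assumptions: $150 a month growing at 11 percent a year.
optimistic = future_value(150, 0.11, years_saving)

# A more conservative return, for comparison.
conservative = future_value(150, 0.05, years_saving)

print(f"At 11%: ${optimistic:,.0f}")      # roughly $1.3 million
print(f"At 5%:  ${conservative:,.0f}")    # roughly $230,000

Run as written, even the optimistic case comes out closer to $1.3 million than $2 million, which is consistent with Olen’s point that Bach rounds his numbers up.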

There are a lot of flaws in this argument. Bach rounds up his numbers, assumes an unrealistic rate of return, and ignores taxes and inflation. Most problematic of all is his core assumption that tiny acts of indulgence are what prevent the average investor from accumulating wealth. In fact, big, unpredictable risk factors and fixed expenses play a much larger role, as Olen points out:

Buying common luxury items wasn’t the issue for most Americans. The problem was the fixed costs, the things that are difficult to cut back on. Housing, health care, and education cost the average family seventy-five percent of their discretionary income in the 2000s. The comparable figure in 1973: fifty percent. Indeed, studies demonstrate that the quickest way to land in bankruptcy court was not by buying the latest Apple computer but through medical expenses, job loss, foreclosure, and divorce.

It turns out that incremental acts of daily discipline are powerless in the face of systemic factors that have a way of erasing all your efforts—and this applies to more than just personal finance. Back when I was trying to write my first novel, I was struck by the idea that if I managed to write just one hundred words every day, I’d have a manuscript in less than three years. I was so taken by this notion that I wrote it down on an index card and stuck it to my bathroom mirror. That was over a decade ago, and while I can’t quite remember how long I stuck with that regimen, it couldn’t have been more than a few weeks. Novels, I discovered, aren’t written a hundred words at a time, at least not in a fashion that can be banked in the literary equivalent of a penny jar. They’re the product of hard work combined with skills that can only be developed after a period of sustained engagement. There’s a lot of trial and error involved, and you can only arrive at a workable system through the kind of experience that comes from addressing issues of craft with maximal attention. Luck and timing also play a role, particularly when it comes to navigating the countless pitfalls that lie between a finished draft and its publication. In finance, we’re inclined to look at a historical return series and attribute it after the fact to genius, rather than to variables that are out of our hands. Similarly, every successful novel creates its own origin story. We naturally underestimate the impact of factors that can’t be credited to individual initiative and discipline. As a motivational tool, there’s a place for this kind of myth. But if novels were written using the literary equivalent of the latte factor, we’d have more novels, just as we’d have more millionaires.

Which isn’t to say that routine doesn’t play a crucial role. My favorite piece of writing advice ever is what David Mamet writes in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

A lot of writing comes down to figuring out what to do on any given morning—but it doesn’t mean doing the same thing each day. Knowing what achievable steps are appropriate at every stage is as important here as it is anywhere else. You can acquire this knowledge as systematically or haphazardly as you like, but you can also do everything right and still fail in the end. (If we define failure as spending years on a novel that will never be published, it’s practically a requirement of the writer’s education.) Books on writing and personal finance continue to take up entire shelves at bookstores, and they can sound very much alike. In “The Writer’s Process,” a recent and unusually funny humor piece in The New Yorker, Hallie Cantor expertly skewers their tone—“I give myself permission to write a clumsy first draft and vigorously edit it later”—and concludes: “Anyway, I guess that’s my process. It’s all about repetition, really—doing the same thing every single day.” We’ve all heard this advice. I’ve been guilty of it myself. But when you don’t take the big picture into account, it’s just a load of smashed avocado.

The Berenstain Barrier


If you’ve spent any time online in the last few years, there’s a decent chance that you’ve come across some version of what I like to call the Berenstain Bears enigma. It’s based on the fact that a sizable number of readers who recall this book series from childhood remember the name of its titular family as “Berenstein,” when in reality, as a glance at any of the covers will reveal, it’s “Berenstain.” As far as mass instances of misremembering are concerned, this isn’t particularly surprising, and certainly less bewildering than the Mandela effect, or the similar confusion surrounding a nonexistent movie named Shazam. But enough people have been perplexed by it to inspire speculation that these false memories may be the result of an errant time traveler, à la Asimov’s The End of Eternity, or an event in which some of us crossed over from an alternate universe in which the “Berenstein” spelling was correct. (If the theory had emerged a few decades earlier, Robert Anton Wilson might have devoted a page or two to it in Cosmic Trigger.) Even if we explain it as an understandable, if widespread, mistake, it stands as a reminder of how an assumption absorbed in childhood remains far more powerful than one learned later on. If we discover that we’ve been mispronouncing, say, “Steve Buscemi” for all this time, we aren’t likely to take it as evidence that we’ve ended up in another dimension, but the further back you go, the more ingrained such impressions become. It’s hard to unlearn something that we’ve believed since we were children—which indicates how difficult it can be to discard the more insidious beliefs that some of us are taught from the cradle.

But if the Berenstain Bears enigma has proven to be unusually persistent, I suspect that it’s because many of us really are remembering different versions of this franchise, even if we believe that we aren’t. (You could almost take it as a version of Hilary Putnam’s Twin Earth thought experiment, which asks if the word “water” means the same thing to us and to the inhabitants of an otherwise identical planet covered with a similar but different liquid.) As I’ve recently discovered while reading the books aloud to my daughter, the characters originally created by Stan and Jan Berenstain have gone through at least six distinct incarnations, and your understanding of what this series “is” largely depends on when you initially encountered it. The earliest books, like The Bike Lesson or The Bears’ Vacation, were funny rhymed stories in the Beginner Book style in which Papa Bear injures himself in various ways while trying to teach Small Bear a lesson. They were followed by moody, impressionistic works like Bears in the Night and The Spooky Old Tree, in which the younger bears venture out alone into the dark and return safely home after a succession of evocative set pieces. Then came big educational books like The Bears’ Almanac and The Bears’ Nature Guide, my own favorites growing up, which dispensed scientific facts in an inviting, oversized format. There was a brief detour through stories like The Berenstain Bears and the Missing Dinosaur Bone, which returned to the Beginner Book format but lacked the casually violent gags of the earlier installments. Next came perhaps the most famous period, with dozens of books like Trouble With Money and Too Much TV, all written, for the first time, in prose, and ending with a tidy, if secular, moral. Finally, and jarringly, there was an abrupt swerve into Christianity, with titles like God Loves You and The Berenstain Bears Go to Sunday School.

To some extent, you can chalk this up to the noise—and sometimes the degeneration—that afflicts any series that lasts for half a century. Incremental changes can lead to radical shifts in style and tone, and they only become obvious over time. (Peanuts is the classic example, but you can even see it in the likes of Dennis the Menace and The Family Circus, both of which were startlingly funny and beautifully drawn in their early years.) Fashions in publishing can drive an author’s choices, which accounts for the ups and downs of many a long career. And the bears only found Jesus after Mike Berenstain took over the franchise following the deaths of his parents. Yet many critics don’t bother making these distinctions, and the ones who hate the Berenstain Bears books seem to associate them entirely with the Trouble With Money period. In 2005, for instance, Paul Farhi of the Washington Post wrote:

The larger questions about the popularity of the Berenstain Bears are more troubling: Is this what we really want from children’s books in the first place, a world filled with scares and neuroses and problems to be toughed out and solved? And if it is, aren’t the Berenstain Bears simply teaching to the test, providing a lesson to be spit back, rather than one lived and understood and embraced? Where is the warmth, the spirit of discovery and imagination in Bear Country? Stan Berenstain taught a million lessons to children, but subtlety and plain old joy weren’t among them.

Similarly, after Jan Berenstain died, Hanna Rosin of Slate said: “As any right-thinking mother will agree, good riddance. Among my set of mothers the series is known mostly as the one that makes us dread the bedtime routine the most.”

Which only tells me that neither Farhi nor Rosin ever saw The Spooky Old Tree, which is a minor masterpiece—quirky, atmospheric, gorgeously rendered, and utterly without any lesson. It’s a book that I look forward to reading with my daughter. And while it may seem strange to dwell so much on these bears, it gets at a larger point about the pitfalls in judging any body of work by looking at a random sampling. I think that Peanuts is one of the great artistic achievements of the twentieth century, but it would be hard to convince anyone who was only familiar with its last two decades. You can see the same thing happening with The Simpsons, a series with six perfect seasons that threaten to be overwhelmed by the mediocre decades that are crowding the rest out of syndication. And the transformations of the Berenstain Bears are nothing compared to those of Robert A. Heinlein, whose career somehow encompassed Beyond This Horizon, Have Spacesuit—Will Travel, Starship Troopers, Stranger in a Strange Land, and I Will Fear No Evil. Yet there are also risks in drawing conclusions from the entirety of an artist’s output. In his biography of Anthony Burgess, Roger Lewis notes that he has read through all of Burgess’s work, and he asks parenthetically: “And how many have done that—except me?” He’s got a point. Trying to internalize everything, especially over a short period of time, can provide as false a picture as any subset of the whole, and it can result in a pattern that not even the author or the most devoted fan would recognize. Whether or not we’re from different universes, my idea of Bear Country isn’t the same as yours. That’s true of any artist’s work, and it hints at the problem at the root of all criticism: What do we talk about when we talk about the Berenstain Bears?

The two kinds of commentaries


The Principal and the Pauper

There are two sorts of commentary tracks. The first kind is recorded shortly after a movie or television season is finished, or even while it’s still being edited or mixed, and before it comes out in theaters. Because their memories of the production are still vivid, the participants tend to be a little giddy, even punch drunk, and their feelings about the movie are raw: “The wound is still open,” as Jonathan Franzen put it to Slate. They don’t have any distance, and they remember everything, which means that they can easily get sidetracked into irrelevant detail. They don’t yet know what is and isn’t important. Most of all, they don’t know how the film did with viewers or critics, so their commentary becomes a kind of time capsule, sometimes laden with irony. The second kind of commentary is recorded long after the fact, either for a special edition, for the release of an older movie in a new format, or for a television series that is catching up with its early episodes. These tend to be less predictable in quality: while commentaries on recent work all start to sound more or less the same, the ones that reach deeper into the past are either disappointingly superficial or hugely insightful, without much room in between. Memories inevitably fade with time, but this can also allow the artist to be more honest about the result, and the knowledge of how the work was ultimately received adds another layer of interest. (For instance, one of my favorite commentaries from The Simpsons is for “The Principal and the Pauper,” with writer Ken Keeler and others ranting against the fans who declared it—preemptively, it seems safe to say—the worst episode ever.)

Perhaps most interesting of all are the audio commentaries that begin as the first kind, but end up as the second. You can hear it on the bonus features for The Lord of the Rings, in which, if memory serves, Peter Jackson and his cowriters start by talking about a movie that they finished years ago, continue by discussing a movie that they haven’t finished editing yet, and end by recording their comments for The Return of the King after it won the Oscar for Best Picture. (This leads to moments like the one for The Two Towers in which Jackson lays out his reasoning for pushing the confrontation with Saruman to the next movie—which wound up being cut for the theatrical release.) You also see it, on a more modest level, in the author’s commentaries I’ve just finished writing for my three novels. I began the commentary on The Icon Thief way back on April 30, 2012, or less than two months after the book itself came out. At the time, City of Exiles was still half a year away from being released, and I was just beginning the first draft of the novel that I still thought would be called The Scythian. I had a bit of distance from The Icon Thief, since I’d written a whole book and started another in the meantime, but I was still close enough that I remembered pretty much everything from the writing process. In my earliest posts, you can sense me trying to strike the right balance between providing specific anecdotes about the novel itself and offering more general thoughts on storytelling, while using the book mostly as a source of examples. And I eventually reached a compromise that I hoped would allow those who had actually read the book to learn something about how it was put together, while still being useful to those who hadn’t.

Peter Jackson

As a result, the commentaries began to stray further from the books themselves, usually returning to the novel under discussion only in the final paragraph. I did this partly to keep the posts accessible to nonreaders, but also because my own relationship with the material had changed. Yesterday, when I posted the last entry in my commentary on Eternal Empire, almost four years had passed since I finished the first draft of that novel. Four years is a long time, and it’s even longer in writing terms. If every new project puts a wall between you and the previous one, a series of barricades stands between these novels and me: I’ve since worked on a couple of book-length manuscripts that never got off the ground, a bunch of short stories, a lot of occasional writing, and my ongoing nonfiction project. With each new endeavor, the memory of the earlier ones grows dimmer, and when I go back to look at Eternal Empire now, not only do I barely remember writing it, but I’m often surprised by my own plot. This estrangement from a work that consumed a year of my life is a little sad, but it’s also unavoidable: you can’t keep all this information in your head and still stay sane. Amnesia is a coping strategy. We’re all programmed to forget many of our experiences—as well as our past selves—to free up capacity for the present. A novel is different, because it exists in a form outside the brain. Any book is a piece of its writer, and it can be as disorienting to revisit it as it is to read an old diary. As François Mauriac put it: “It is as painful as reading old letters…We touch it like a thing: a handful of ashes, of dust.” I’m not quite at that point with Eternal Empire, but I’ll sometimes read a whole series of chapters and think to myself, where did that come from?

Under the circumstances, I should count myself lucky that I’m still reasonably happy with how these novels turned out, since I have no choice but to be objective about it. There are things that I’d love to change, of course: sections that run too long, others that seem underdeveloped, conceits that seem too precious or farfetched or convenient. At times, I can see myself taking the easy way out, going with a shortcut or ignoring a possible implication because I lacked the time or energy to do it justice. (I don’t necessarily regret this: half of any writing project involves conserving your resources for when it really matters.) But I’m also surprised by good ideas or connections that seem to have come from outside of me, as if, to use Isaac Asimov’s phrase, I were writing over my own head. Occasionally, I’ll have trouble following my own logic, and the result is less a commentary than a forensic reconstruction of what I must have been thinking at the time. But if I find it hard to remember my reasoning today, it’s easier now than it will be next year, or after another decade. As I suspected at the time, the commentary exists more for me than for anybody else. It’s where I wrote down my feelings about a series of novels that once dominated my life, and which now seem like a distant memory. While I didn’t devote nearly as many hours to these commentaries as I did to the books themselves, they were written over a comparable stretch of time. And now that I’ve gotten to the point of writing a commentary on my commentary—well, it’s pretty clear that it’s time to stop.

Love and research


Jonathan Franzen

I’ve expressed mixed feelings about Jonathan Franzen before, but I don’t think there’s any doubt about his talent, or about his ability to infuriate readers in just the right way. His notorious essay on climate change in The New Yorker still irritates me, but it prompted me to think deeply on the subject, if only to articulate why I thought he was wrong. But Franzen isn’t a deliberate provocateur, like Norman Mailer was: instead, he comes across as a guy with deeply felt, often conflicted opinions, and he expresses them as earnestly as he can, even if he knows he’ll get in trouble for it. Recently, for instance, he said the following to Isaac Chotiner of Slate, in response to a question about whether he could ever write a book about race:

I have thought about it, but—this is an embarrassing confession—I don’t have very many black friends. I have never been in love with a black woman. I feel like if I had, I might dare…Didn’t marry into a black family. I write about characters, and I have to love the character to write about the character. If you have not had direct firsthand experience of loving a category of person—a person of a different race, a profoundly religious person, things that are real stark differences between people—I think it is very hard to dare, or necessarily even want, to write fully from the inside of a person.

It’s quite a statement, and it comes right at the beginning of the interview, before either Franzen or Chotiner has had a chance to properly settle in. Not surprisingly, it has already inspired a fair amount of snark online. But Franzen is being very candid here in ways that most novelists wouldn’t dare, and he deserves credit for it, even if he puts it in a way that is likely to make us uncomfortable. The question of authors writing about other races is particularly fraught, and the practical test that Franzen proposes is a better entry point than most. We shouldn’t discourage writers from imagining themselves into the lives of characters of different backgrounds, but we can insist on setting a high bar. (I’m talking mostly about literary fiction, by the way, which works hard to enter the consciousness of a protagonist or a society, and not necessarily about the ordinary diversity that I like to see in popular fiction, in which writers can—and often should—make the races of the characters an unobtrusive element in the story.) We could say, for instance, that a novel about race should be conceived from the inside out, rather than the outside in, and that it demands a certain intensity of experience and understanding to justify itself. Given the number of minority authors who are amply qualified to write about these issues firsthand, an outsider needs to earn the right to engage with the subject, and this requires something beyond well-intentioned concern. As Franzen rightly says in the same interview: “I feel it’s really dangerous, if you are a liberal white American, to presume that your good intentions are enough to embark on a work of imagination about black America.”

The Corrections

And Franzen’s position becomes easier to understand when framed within his larger concerns about research itself. As he once told The Guardian: “When information becomes free and universally accessible, voluminous research for a novel is devalued along with it.” Yet like just about everything Franzen says, this seemingly straightforward rule is charged with a kind of reflexive uneasiness, because he’s among the most obsessive of researchers. His novels are full of lovingly rendered set pieces that were obviously researched with enormous diligence, and sometimes they call attention to themselves, as Norman Mailer unkindly but accurately noted of The Corrections:

Everything of novelistic use to him that came up on the Internet seems to have bypassed the higher reaches of his imagination—it is as if he offers us more human experience than he has literally mastered, and this is obvious when we come upon his set pieces on gourmet restaurants or giant cruise ships or modern Lithuania in disarray. Such sections read like first-rate magazine pieces, but no better—they stick to the surface.

For a writer like Franzen, whose novels are ambitious attempts to fit everything he can within two covers, research is part of the game. But it’s also no surprise that the novelist who has tried the hardest to bring research back into mainstream literary fiction should also be the most agonizingly aware of its limitations.

These limitations are particularly stark when it comes to race, which, more than any other theme, demands to be lived and felt before it can be written. And if Franzen shies away from it with particular force, it’s because the set of skills that he has employed so memorably elsewhere is rendered all but useless here. It’s wise of him to acknowledge this, and he sets forth a useful test for gauging a writer’s ability to engage the subject. He says:

In the case of Purity, I had all this material on Germany. I had spent two and a half years there. I knew the literature fairly well, and I could never write about it because I didn’t have any German friends. The portal to being able to write about it was suddenly having these friends I really loved. And then I wasn’t the hostile outsider; I was the loving insider.

Research, he implies, takes you only so far, and love—defined as the love of you, the novelist, for another human being—carries you the rest of the way. Love becomes a kind of research, since it provides you with something like the painful vividness of empathy and feeling required to will yourself into the lives of others. Without talent and hard work, love isn’t enough, and it may not be enough even with talent in abundance. But it’s necessary, if not sufficient. And while it doesn’t tell us much about who ought to be writing about race, it tells us plenty about who shouldn’t.

Written by nevalalee

August 2, 2016 at 8:37 am

The slow fade


Pet Shop Boys

Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on September 16, 2014.

A while back, William Weir wrote an excellent piece in Slate about the decline of the fade-out in pop music, once ubiquitous, now nearly impossible to find. Of the top ten songs of 1985, every single one ended with a fade; in the three years before the article was written, there was only one, “Blurred Lines,” which in itself is a conscious homage to—or an outright plagiarism of—a much earlier model. Weir points to various possible causes for the fade’s disappearance, from the impatience of radio and iTunes listeners to advances in technology that allow producers to easily splice in a cold ending, and he laments the loss of the technique, which at its best produces an impression that a song never ends, but imperceptibly embeds itself into the fabric of the world beyond. (He also notes that a fade-out, more prosaically, can be used to conceal a joke or hidden message. One of my favorites, which he doesn’t mention, occurs in “Always On My Mind” by the Pet Shop Boys, which undermines itself with a nearly inaudible aside at the very end: “Maybe I didn’t love you…”)

The slow fade is a special case of what I’ve elsewhere called the Layla effect, in which a song creates an impression of transcendence or an extension into the infinite by the juxtaposition of two unrelated parts—although one of the few songs on that list that doesn’t end with a fade, interestingly, is “Layla” itself. As Weir points out, a proper fade involves more than just turning down the volume knob: it’s a miniature movement in its own right, complete with its own beginning, middle, and end, and it produces a corresponding shift in the listener’s mental state. He cites a fascinating study by the Hanover University of Music in Germany, which measured how long students tapped along to the rhythm of the same song in two different versions. When the song was given a cold ending, subjects stopped tapping an average of 1.4 seconds before the song was over, but with a fade-out, they continued to tap 1.04 seconds after the song ended, as if the song had somehow managed to extend itself beyond its own physical limits. As the Pet Shop Boys say elsewhere on Introspective, the music plays forever.

Joe Mantell and Jack Nicholson in Chinatown

In some ways, then, a fade-out is the musical equivalent of the denouement in fiction, and it’s easy to draw parallels to different narrative strategies. A cold ending is the equivalent of the kind of abrupt close we see in many of Shakespeare’s tragedies, which rarely go on for long after the demise of the central character. (This may be due in part to the logistics of theatrical production: a scene change so close to the end would only sow confusion, and in the meantime, the leading actor is doing his best to lie motionless on the stage.) The false fade, in which a song like “Helter Skelter” pretends to wind down before abruptly ramping up again, has its counterpart in the false denouement, which we see in so many thrillers, perhaps most memorably in Thomas Harris’s Red Dragon. And the endless slow fade, which needs a long song like “Hey Jude” or “Dry the Rain” to sustain it, is reminiscent of the extended denouements in epic novels from War and Peace to The Lord of the Rings. The events of the epic wrench both the protagonist and reader out of everyday life, and after a thousand crowded pages, it takes time to settle us back into Bag End.

The fade, in short, is a narrative tool like any other, complete with its own rules and tricks of the trade. Weir quotes the sound engineer Jeff Rothschild, who says that in order for the fade to sound natural to a listener’s ear, the volume must “go down a little quicker at first, and then it’s a longer fade”—which is a strategy often employed in fiction, in which an abrupt conclusion to the central conflict is followed by a more gradual withdrawal. There are times, of course, when a sudden ending is what you want: Robert Towne himself admits that the original dying close of Chinatown isn’t as effective as the “simple severing of the knot” that Roman Polanski imposed. But it’s a mistake to neglect a tool both so simple and so insinuating. (A fade-in, which allows the song to edge gradually into our circle of consciousness, can create an equally haunting impression, as in Beethoven’s Ninth Symphony and one of my favorite deep cuts by the Beatles, George Harrison’s “I Want to Tell You.”) These days, we have a way of seeing songs as discrete items on a playlist, but they often work best if they’re allowed to spill over a bit to either side. An ending draws a line in the world, but sometimes it’s nice if it’s a little blurred.

Written by nevalalee

April 27, 2016 at 9:00 am

The title search


Crash

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What pop-culture title duplication do you find most annoying?”

At the thrift store up the street from my house, there’s a shelf that an enterprising employee has stocked with a selection of used books designed to make you look twice: The Time Traveler’s Wife, The Paris Wife, The Tiger’s Wife, A Reliable Wife, American Wife, The Zookeeper’s Wife, and probably some others I’ve forgotten. It’s funny to see them lined up in a row, but there’s a reason this formula is so popular: with an average of just three words, it gives us the germ of a story—at least in terms of its central relationship—and adds a touch of color to distinguish itself from its peers. As with the premises of most bestsellers, the result is both familiar and a little strange, telling us that we’re going to be told the story of a marriage, which is always somewhat interesting, but with a distinctive twist or backdrop. It’s perhaps for analogous reasons that the three most notable mainstream thrillers of the past decade are titled The Girl With the Dragon Tattoo, Gone Girl, and The Girl on the Train, the last of which is still awaiting its inevitable adaptation by David Fincher. Girl, in the context of suspense, has connotations of its own: when used to refer to an adult woman, it conveys the idea that she looks harmless, but she isn’t as innocent as she seems. (And I’m far from immune from this kind of thing: if there’s one noun that challenges Girl when it comes to its omnipresence in modern thriller titles, it’s Thief.)

When you look at the huge range of words available in any language, it can be strange to reflect that the number of marketable titles amounts to such a limited pool. Whenever I’m about to write a new story, I check Amazon and the Internet Science Fiction Database to make sure that the title hasn’t been used before, and in many cases, I’m out of luck. (Occasionally, as with “Cryptids,” I’ll grab a title that I’m surprised has yet to be claimed, and it always feels like snagging a prime domain name or Twitter handle.) Titles stick to the same handful of variations for a lot of reasons, sometimes to deliberately evoke or copy a predecessor, sometimes—as with erotica—to enable keyword searches, and sometimes because we’ve agreed as a culture that certain words are more evocative than others. As David Haglund of Slate once observed, everyone seems to think that tacking American onto the start of a title lends it a certain gravitas. For many movies, titles are designed to avoid certain implications, while inadvertently creating new clichés of their own. Studios love subtitles that allow them to avoid the numerals conventionally attached to a sequel or prequel, which is why we see so many franchise installments with a colon followed by Origins, Revolution, Resurrection, or Genesis, or sequels that offer us the Rise of, the Age of, the Dawn of, or the Revenge of something or other. I loved both Rise and Dawn of the Planet of the Apes, but I can’t help feeling that it’s a problem that their titles could easily have been interchanged.

Twilight

Then, of course, there are the cases in which a title is simply reappropriated by another work: Twilight, The Host, Running Scared, Bad Company, Fair Game. This sort of thing can lead to its share of confusion, some of it amusing. Recently, while watching the Roger Ebert documentary Life Itself, my wife was puzzled by a sequence revolving around the movie Crash, only to belatedly realize that it was referring to the one by David Cronenberg, not Paul Haggis. From a legal perspective, there’s no particular reason why a title shouldn’t be reused: titles can’t be copyrighted, although you could be sued for unfair competition if enough money were on the line. Hence the inadvisability of publishing your own fantasy series called Game of Thrones or The Wheel of Time, although Wheel of Thrones is still ripe for the taking. (In Hollywood, disputes over titles are sometimes used as a bullying tactic, as in the otherwise inexplicable case of The Butler, which inspired Warner Bros. to file for arbitration because it had released a silent film of the same name in 1916. There’s also the odd but persistent rumor that Bruce Willis agreed to star in Live Free or Die Hard on condition that he be given the rights to the title Tears of the Sun—which raises the question of whether it was worth the trouble. Such cases, which are based on tenuous legal reasoning at best, always remind me of David Mamet’s line from State and Main, which knows as much as any film about how movie people think: “I don’t need a cause. I just need a lawyer.”)

Which just reminds us that titles are like any other part of the creative process, except more so. We feel that our choices have a talismanic quality, even if their true impact on success or failure is minimal, and it’s only magnified by the fact that we only have a handful of words with which to make an impression. No title alone can guarantee a blockbuster, but we persist in thinking that it can, which may be why we return consistently to the same words or phrases. But it’s worth keeping an observation by Jorge Luis Borges in mind:

Except for the always astonishing Book of the Thousand Nights and One Night (which the English, equally beautifully, called The Arabian Nights) I believe that it is safe to say that the most celebrated works of world literature have the worst titles. For example, it is difficult to conceive of a more opaque and visionless title than The Ingenious Knight Don Quixote of La Mancha, although one must grant that The Sorrows of Young Werther and Crime and Punishment are almost as dreadful.

And that’s as true today as it ever was—it’s hard to think of a less interesting title than Breaking Bad. Still, the search continues, and sometimes, it pays off. Years ago, a writer and his editor were struggling to come up with a title for a debut novel, considering and discarding such contenders as The Stillness in the Water, The Silence of the Deep, and Leviathan Rising. Ultimately, they came up with over a hundred possibilities, none of them any good, and it was only twenty minutes before they were set to go to the printer that the editor finally gave up and said: “Okay, we’ll call the thing Jaws.”

Written by nevalalee

July 17, 2015 at 8:53 am
