Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


The president is collaborating


Last week, Bill Clinton and James Patterson released their collaborative novel The President is Missing, which has already sold something like a quarter of a million copies. Its publication was heralded by a lavish two-page spread in The New Yorker, with effusive blurbs from just about everyone whom a former president and the world’s bestselling author might be expected to get on the phone. (Lee Child: “The political thriller of the decade.” Ron Chernow: “A fabulously entertaining thriller.”) If you want proof that the magazine’s advertising department is fully insulated from its editorial side, however, you can just point to the fact that the task of reviewing the book itself was given to Anthony Lane, who doesn’t tend to look favorably on much of anything. Lane’s style—he has evidently never met a smug pun or young starlet he didn’t like—can occasionally turn me off from his movie reviews, but I’ve always admired his literary takedowns. I don’t think a month goes by that I don’t remember his writeup of the New York Times bestseller list of May 15, 1994, which allowed him to tackle the likes of The Bridges of Madison County, The Celestine Prophecy, and especially The Day After Tomorrow by Allan Folsom, from which he quoted a sentence that permanently changed my view of such novels: “Two hundred European cities have bus links with Frankfurt.” But he seems to have grudgingly liked The President is Missing. If nothing else, he furnishes a backhanded compliment that has already been posted, hilariously out of context, on Amazon: “If you want to make the most of your late-capitalist leisure-time, hit the couch, crack a Bud, punch the book open, focus your squint, and enjoy.”

The words “hit the couch, crack a Bud, punch the book open, [and] focus your squint” are all callbacks to samples of Patterson’s prose that Lane quotes in the review, but the phrase “late-capitalist leisure-time” might require some additional explanation. It’s a reference to the paper “Structure over Style: Collaborative Authorship and the Revival of Literary Capitalism,” which appeared last year in Digital Humanities Review, and I’m grateful to Lane for bringing it to my attention. The authors, Simon Fuller and James O’Sullivan, focus on the factory model of novelists who employ ghostwriters to boost their productivity, and their star exhibit is Patterson, to whom they devote the same kind of computational scrutiny that has previously uncovered traces of collaboration in Shakespeare. Not surprisingly, it turns out that Patterson doesn’t write most of the books that he ostensibly coauthors. (He may not even have done much of the writing on First to Die, which credits him as the sole writer.) But the paper is less interesting for its quantitative analysis than for its qualitative evaluation of what Patterson tells us about how we consume and enjoy fiction. For instance:

The form of [Patterson’s] novels also appears to be molded by contemporary experience. In particular, his work is perhaps best described as “commuter fiction.” Nicholas Paumgarten describes how the average time for a commute has significantly increased. As a result, reading has increasingly become one of those pursuits that can pass the time of a commute. For example, a truck driver describes how “he had never read any of Patterson’s books but that he had listened to every single one of them on the road.” A number of online reader reviews also describe Patterson’s writing in terms of their commutes…With large print, and chapters of two or three pages, Patterson’s works are constructed to fit between the stops on a metro line.

Of course, you could say much the same of many thrillers, particularly the kind known as the airport novel, which wasn’t just a book that you read on planes—at its peak, it was one in which many scenes took place in airports, which were still associated with glamor and escape. What sets Patterson apart from his peers is his ability to maintain a viable brand while publishing a dozen books every year. His productivity is inseparable from his use of coauthors, but he wasn’t the first. Fuller and O’Sullivan cite the case of Alexandre Dumas, who allegedly boasted of having written four hundred novels and thirty-five plays that had created jobs for over eight thousand people. And they dig up a remarkable quote from The German Ideology by Karl Marx and Friedrich Engels, who “favorably compare French popular fiction to the German, paying particular attention to the latter’s appropriation of the division of labor”:

In proclaiming the uniqueness of work in science and art, [Max] Stirner adopts a position far inferior to that of the bourgeoisie. At the present time it has already been found necessary to organize this “unique” activity. Horace Vernet would not have had time to paint even a tenth of his pictures if he regarded them as works which “only this Unique person is capable of producing.” In Paris, the great demand for vaudevilles and novels brought about the organization of work for their production, organization which at any rate yields something better than its “unique” competitors in Germany.

These days, you could easily imagine Marx and Engels making a similar case about film, by arguing that the products of collaboration in Hollywood have often been more interesting, or at least more entertaining, than movies made by artists working outside the system. And they might be right.

The analogy to movies and television seems especially appropriate in the case of Patterson, who has often drawn such comparisons himself, as he once did to The Guardian: “There is a lot to be said for collaboration, and it should be seen as just another way to do things, as it is in other forms of writing, such as for television, where it is standard practice.” Fuller and O’Sullivan compare Patterson’s brand to that of Alfred Hitchcock, whose name was attached to everything from Dell anthologies to The Three Investigators to Alfred Hitchcock’s Mystery Magazine. It’s a good parallel, but an even better one might be hiding in plain sight. In her recent profile of the television producer Ryan Murphy, Emily Nussbaum evokes an ability to repackage the ideas of others that puts even Patterson to shame:

Murphy is also a collector, with an eye for the timeliest idea, the best story to option. Many of his shows originate as a spec script or as some other source material. (Murphy owned the rights to the memoir Orange Is the New Black before Jenji Kohan did, if you want to imagine an alternative history of television.) Glee grew out of a script by Ian Brennan; Feud began as a screenplay by Jaffe Cohen and Michael Zam. These scripts then get their DNA radically altered and replicated in Murphy’s lab, retooled with his themes and his knack for idiosyncratic casting.

Murphy’s approach of retooling existing material in his own image might be even smarter than Patterson’s method of writing outlines for others to expand, and he’s going to need it. Two months ago, he signed an unprecedented $300 million contract with Netflix to produce content of all kinds: television shows, movies, documentaries. And another former president was watching. While Bill Clinton was working with Patterson, Barack Obama was finalizing a Netflix deal of his own—and if he needs a collaborator, he doesn’t have far to look.

Astounding Stories #21: Black Man’s Burden


Note: With less than half a year to go until the publication of Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction, I’m returning, after a long hiatus, to the series in which I highlight works of science fiction that deserve to be rediscovered, reappraised, or simply enjoyed by a wider audience. You can read the earlier installments here.

“This never gets old,” T’Challa says in Black Panther, just before we see the nation of Wakanda in its full glory for the first time. It’s perhaps the most moving moment in this often overwhelmingly emotional film, and it speaks to how much of its power hinges on the idea of Wakanda itself. Most fictional countries in the movies—a disproportionate number of which seem to be located in Africa, South America, or the Middle East—are narrative evasions, but not here. As Ishaan Tharoor wrote recently in the Washington Post:

Wakanda, like many places in Africa, is home to a great wealth of natural resources. But unlike most places in Africa, it was able to avoid European colonization. Shielded by the powers of vibranium, the element mined beneath its surface that enabled the country to develop the world’s most advanced technology, Wakanda resisted invaders while its rulers constructed a beautiful space-age kingdom.

Or as the writer Evan Narcisse observed elsewhere to the Post: “Wakanda represents this unbroken chain of achievement of black excellence that never got interrupted by colonialism.” It’s imaginary, yes, but that’s part of the point. In his review, Anthony Lane of The New Yorker delivered a gentle rebuke: “I wonder what weight of political responsibility can, or should, be laid upon anything that is accompanied by buttered popcorn. Vibranium is no more real than the philosopher’s stone…Are 3-D spectacles any more reliable than rose-tinted ones, when we seek to imagine an ideal society?” But the gap between dreams and reality is precisely how the best science fiction—and Black Panther, along with so much else, is a kickass science fiction movie—compels us to see the world with new eyes.

The fiction published by the editor John W. Campbell rarely tackled issues of race directly, and the closest that it ever came was probably a series that began with Black Man’s Burden, the first installment of which ran in the December 1961 issue of Analog. It revolves around a coalition of African-American academics working undercover to effect social and political change in North Africa, with the ultimate goal of uniting the region in the scientific and cultural values of the West. The protagonist is a sociologist named Homer Crawford, who explains:

The distrust of the European and the white man as a whole was prevalent, especially here in Africa. However, and particularly in Africa, the citizens of the new countries were almost unbelievably uneducated, untrained, incapable of engineering their own destiny…We of the Reunited Nations teams are here because we are Africans racially but not nationally, we have no affiliations with clan, tribe, or African nation. We are free to work for Africa’s progress without prejudice. Our job is to remove obstacles wherever we find them. To break up log jams. To eliminate prejudices against the steps that must be taken if Africa is to run down the path of progress, rather than to crawl.

All of this is explained to the reader at great length. There’s some effective action, but much of the story consists of the characters talking, and if these young black intellectuals all end up sounding a lot like John W. Campbell, that shouldn’t be surprising—the author, Mack Reynolds, later said that the story and its sequels “were written at a suggestion of John Campbell’s and whole chunks of them were based on his ideas.” Many sections are taken verbatim from the editor’s letters and editorials, ranging from his musings on judo, mob psychology, and the virtues of the quarterstaff to blanket statements that border on the unforgivable: “You know, with possibly a few exceptions, you can’t enslave a man if he doesn’t want to be a slave…The majority of Jefferson’s slaves wanted to be slaves.”

We’re obviously a long way from Wakanda here—but although Black Man’s Burden might seem easy to hate, oddly enough, it isn’t. Mack Reynolds, who had lived in North Africa, was a talented writer, and the serial as a whole is intelligent, restrained, consistently interesting, and mindful of the problems with its own premise. To encourage the locals to reject tribalism in favor of modern science, medicine, and education, for instance, the team attributes many of its ideas to a fictional savior figure, El Hassan, on the theory that such societies “need a hero,” and by the end, Homer Crawford has reluctantly assumed the role himself. (There are shades not just of T.E. Lawrence but of Paul Atreides, whose story would appear in the magazine just two years later.) But he has few illusions about the nature of his work. As one of his colleagues puts it in the sequel:

Monarchies are of the past, and El Hassan is the voice of the future, something new. We won’t admit he’s just a latter-day tyrant, an opportunist seizing power because it’s there crying to be seized. Actually, El Hassan is in the tradition of Genghis Khan, Tamerlane, or, more recently, Napoleon. But he’s a modern version, and we’re not going to hang the old labels on him.

Crawford mordantly responds: “As a young sociologist, I never expected to wind up a literal tyrant.” And Reynolds doesn’t pretend to offer easy solutions. The sequel, Border, Breed, Nor Birth, closes with a bleak denial of happy endings, while the concluding story, “Black Sheep Astray,” ends with Crawford, overthrown after a long rule as El Hassan, returning to start a new revolution among the younger generation, at the likely cost of his life. The leads are drawn with considerable care—even if Reynolds has a bad habit of saying that they look “surprisingly like” Joe Louis or Lena Horne—and their mere presence in Analog is striking enough that one prominent scholar has used it to question Samuel R. Delany’s claim that Campbell rejected one of his stories because “his readership would be able to relate to a black main character.”

Yet this overlooks the fact that an ambitious, messy, uncategorizable novel like Delany’s Nova is worlds apart from a serial that was commissioned and written to Campbell’s specifications. And its conceptual and literary limitations turn out to be closely related. Black Man’s Burden is constructed with diligence and real craft, but this doesn’t make its basic premise any more tenable. It interrogates many of its assumptions, but it doesn’t really question the notion of a covert operation to shape another country’s politics through propaganda, guerrilla action, and the assimilation of undercover agents into the local population. This isn’t science fiction. It’s what intelligence agencies on both sides were doing throughout the Cold War. (If anything, the whisper campaign for El Hassan seems primitive by contemporary standards. These days, the plan would include data analysis, viral messaging in support of favored policies or candidates, and the systematic weaponization of social media on the part of foreign nationals. What would be wrong with that?) By the story’s own logic, the project has to be run by black activists because the locals are suspicious of white outsiders, but there’s no suggestion that their underlying goals are any different—and if the same story would be unthinkable with a white protagonist, it implies that it has problems here that can’t be addressed with a change of race. It’s also characteristically evasive when it comes to how psychohistory actually works. Reading it again, I found myself thinking of what William Easterly writes in The White Man’s Burden:

A Planner thinks he already knows the answers; he thinks of poverty as a technical engineering problem that his answers will solve. A Searcher admits he doesn’t know the answers in advance…A Planner believes outsiders know enough to impose solutions. A Searcher believes only insiders have enough knowledge to find solutions, and that most solutions must be homegrown.

Planners still exist in foreign aid—but they can also edit magazines. Campbell was one of them. Black Man’s Burden was his idea of how to deal with race in Analog, even as he failed to make any effort to look for black writers who knew about the subject firsthand. And it worked about as well here as it did anywhere else.

The art of the bad review



Note: I’m taking a few days off for the holidays, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on September 7, 2016.

Every few years, whenever my spirits need a boost, I go back and read the famous smackdown that Martin Amis delivered to the novel Hannibal by Thomas Harris, just for the simple pleasure of it. It’s one of the great savage reviews of all time, and it checks off most of the boxes that this sort of shellacking requires. Amis begins by listing the hyperbolic claims made by other reviewers—“A momentous achievement,” “A plausible candidate for the Pulitzer Prize”—and then skewering them systematically. But he also goes after the novel, significantly, from a position of respect, calling himself “a Harris fan from way back.” Writing of the earlier books in the series, he says that Harris has achieved what every popular novelist hopes to accomplish: “He has created a parallel world, a terrible antiterra, airless and arcane but internally coherent.” When Amis quotes approvingly from the previous installments, it can only make Hannibal look worse by comparison, although Harris doesn’t do himself any favors. As Amis writes:

[Lecter] has no need of “need”: Given the choice, he—and Harris—prefer to say “require”…Out buying weapons—or, rather, out “purchasing” weapons—he tells the knife salesman, “I only require one.” Why, I haven’t felt such a frisson of sheer class since I last heard room service say “How may I assist you?” And when Lecter is guilty of forgetfulness he says “Bother”—not “Shit” or “Fuck” like the rest of us. It’s all in the details.

Amis’s review falls squarely in the main line of epic takedowns that began with Mark Twain’s “Fenimore Cooper’s Literary Offenses.” This is a piece that was probably ruined for a lot of readers by being assigned in high school, but it deserves a fresh look: it’s one of the funniest and most valuable essays about writing that we have, and I revisit it on a regular basis. Like Amis, Twain begins by quoting some of the puffier encomiums offered by other critics: “[Cooper’s] five tales reveal an extraordinary fullness of invention…The craft of the woodsman, the tricks of the trapper, all the delicate art of the forest were familiar to Cooper from his youth up.” (Twain proposes the following rule in response: “Crass stupidities shall not be played upon the reader as ‘the craft of the woodsman, the delicate art of the forest’ by either the author or the people in the tale.”) Both Twain and Amis are eager to go after their subjects with a broadsword, but they’re also alert to the nuances of language. For Amis, it’s the subtle shading of pretension that creeps in when Harris writes “purchases” instead of “buys”; for Twain, it’s the distinction between “verbal” and “oral,” “precision” and “facility,” “phenomena” and “marvels,” “necessary” and “predetermined.” His eighteen rules of writing, deduced in negative fashion from Cooper’s novels, are still among the best ever assembled. He notes that one of the main requirements of storytelling is “that the personages in a tale shall be alive, except in the case of corpses, and that always the reader shall be able to tell the corpses from the others.” Which, when you think about it, is even more relevant in Harris’s case—although that’s a subject for another post.


I’ve learned a lot from these two essays, as I have with other bad reviews that have stuck in my head over the years. In general, a literary critic should err on the side of generosity, especially when it comes to his or her contemporaries, and a negative review of a first novel that nobody is likely to read is an expense of spirit in a waste of shame. But occasionally, a bad review can be just as valuable and memorable as any other form of criticism. I may not agree with James Wood’s feelings about John le Carré, but I’ll never forget how he sums up a passage from Smiley’s People as “a clever coffin of dead conventions.” Once a year or so, I’ll find myself remembering John Updike’s review of Tom Wolfe’s A Man in Full, which notes the author’s obsession with muscular male bodies—“the latissimi dorsi,” “the trapezius muscles”—and catalogs his onomatopoetics, which are even harder to take seriously when you have to type them all out:

“Brannnnng! Brannnnng! Brannnnng!,” “Woooo-eeeeeee! Hegh-heggghhhhhh,” “Ahhhhhhhhhhh ahhhhhhhhhhhh ahhhhhhhhhhh,” “Su-puerflyyyyyyyyyyyyyyyy!,” “eye eye eye eye eye eye eye eye eye,” “Scrack scrack scrack scraccckkk scraccccck,” “glug glug glug glugglugglug,” “Awriiighhhhhhhht!”

And half of my notions as a writer seem to have been shaped by a single essay by Norman Mailer, “Some Children of the Goddess,” in which he takes careful aim at most of his rivals from the early sixties. William Styron’s Set This House on Fire is “the magnum opus of a fat spoiled rich boy who could write like an angel about landscape and like an adolescent about people”; J.D. Salinger’s four novellas about the Glass family “seem to have been written for high-school girls”; and Updike himself writes “the sort of prose which would be admired in a writing course overseen by a fussy old nance.”

So what makes a certain kind of negative review linger in the memory for longer than the book it describes? It often involves one major writer taking aim at another, which is already more interesting than the sniping of a critic who knows the craft only from the outside. In most cases, it picks on a target worthy of the writer’s efforts. And there’s usually an undercurrent of wounded love: the best negative reviews, like the one David Foster Wallace delivered on Updike’s Toward the End of Time, or Renata Adler’s demolition of Pauline Kael, reflect a real disillusionment with a former idol. (Notice, too, how so many of the same names keep recurring, as if Mailer and Updike and Wolfe formed a closed circle that runs forever, in a perpetual motion machine of mixed feelings.) Even when there’s no love lost between the critic and his quarry, as with Twain and Cooper, there’s a sense of anger at the betrayal of storytelling by someone who should know better. To return to poor Thomas Harris, I’ll never forget the New Yorker review by Anthony Lane that juxtaposed a hard, clean excerpt from The Silence of the Lambs:

“Lieutenant, it looks like he’s got two six-shot .38s. We heard three rounds fired and the dump pouches on the gunbelts are still full, so he may just have nine left. Advise SWAT it’s +Ps jacketed hollowpoints. This guy favors the face.”

With this one from Hannibal Rising:

“I see you and the cricket sings in concert with my heart.”
“My heart hops at the sight of you, who taught my heart to sing.”

Lane reasonably responds: “What the hell is going on here?” And that’s what all these reviews have in common—an attempt by one smart, principled writer to figure out what the hell is going on with another.

The large rug


A few days ago, I was browsing through The Journals of André Gide, 1914-1927 in search of a quotation when my eye was caught by the following passage:

What a wonderful subject for a novel: X. indulges in a tremendous effort of ingenuity, scheming, and duplicity to succeed in an undertaking that he knows to be reprehensible. He is urged on by his temperament, which has its exigences, then by the rule of conduct he has built in order to satisfy them. It takes an extreme and hourly application; he expends more resolve, energy, and patience in this than would be needed to succeed in the best. And when eventually the event is prepared to such a point that he has only to let it take its course, the letdown he experiences allows him to reflect; he then realizes that he has ceased to desire greatly that felicity on which he had counted too much. But it is too late now to back out; he is caught in the mechanism he has built and set in motion and, willy-nilly, he must now follow its impetus to its conclusion.

Reading this over, I naturally thought of Donald Trump, who seems less happy to be in the White House than any other president in recent memory. Before I reveal how the story ends, however, I need to talk about Gide himself, a man of letters who was awarded the Nobel Prize later in life in honor of a career of extraordinary range and productivity. The plot that he outlines here sounds at first like a crime novel, but he may well have had a political context in mind—he wrote this journal entry on May 9, 1918, adding a few days later of the war: “The victory will be due to an invention, to something surprising or other; and not so much to the army as to the scientist and the engineer.”

But there’s also an uncomfortable truth about Gide that we need to confront. In 1999, Anthony Lane of The New Yorker wrote an appreciation of Gide’s work, saying that his “sincerity” was “alarmingly apposite to our own era, when a few insincere words to the press corps are almost enough to unseat a president.” This reads now as merely quaint. But a few pages later, Lane writes: “Gide was true to his inconstancy; he would never relinquish his sweet tooth for young Arabs, or for teenagers of every race.” In the book André and Oscar, Jonathan Fryer, a sympathetic biographer, describes a trip to North Africa that Gide took in his early twenties:

André’s illness did not prevent his going out to sit with [the painter] Paul Laurens, as his friend painted local scenes, or persuaded local children to pose for him. The children fascinated André. Groups of boys would gather outside the hotel where the two friends were staying, out of curiosity or a wish to earn a few coins through some trivial service. André’s attention had been particularly caught by one brown-skinned lad called Ali, who one day suggested that he should carry André’s overcoat and invalid’s rug to the dunes, where André could enjoy some of the weak autumn sun…As soon as they got into the crater, the boy threw his coat and rug to the ground, then flung himself down, stretched out on his back, his arms spread out, all the while laughing. André sat down primly at a distance, well aware of what was on offer, but not quite ready to accept. Ali’s face clouded; his smile disappeared. “Goodbye then,” he said, rising to his feet. But André seized the hand that the boy held out and pulled him to the ground.

I’ll skip over Fryer’s description of what happened next on that “invalid’s rug,” but I’m compelled to note that he concludes of what he calls “this restorative treatment”: “André had indeed found himself.”

What are we supposed to think about this? Many of Gide’s admirers have done their best not to think about it at all. Lane, writing two decades ago, mentions it only in passing. (His article, incidentally, is titled “The Man in the Mirror,” a pop culture reference that I sincerely hope wasn’t intentional.) Fryer does what he can in the line of extenuation, in terms that have an uncomfortably familiar ring: “Most of André’s and Paul’s little visitors were on the wrong side of puberty, as moralists these days would view it. Not that André’s pedophilia seems to have taken on any physical dimension. Many of his future sexual partners would range between the ages of fourteen to seventeen, with the initiative coming from the adolescent himself.” This wouldn’t fly today, and even if we try to ignore Gide’s interest in very young children—Fryer compares him to Lewis Carroll—there’s no getting around those teenagers. In André Gide: A Life in the Present, the biographer Alan Sheridan shares the following story, which took place when Gide was in his thirties:

The train journey to Weimar was not without its “petite aventure.” No doubt as the result of his usual systematic inspection of the entire train, Gide found himself in a compartment with two German boys, brothers aged sixteen and fourteen. After falling asleep, Gide woke up to find the younger boy standing near him looking out of the window. Gide got up and stood beside him. Wandering fingers were met with encouragement—the elder brother was still asleep. Under a large rug, matters proceeded, further helped when the train entered a long tunnel.

This wasn’t an isolated incident. And Sheridan’s “matters proceeded,” like Fryer’s “restorative treatment,” feels like another large rug flung over our ability to honestly talk about it.

I’m not an expert on Gide, so I really can’t do anything more at this stage than flag this and move on. But it seems clear that we’re at the early stages of a reckoning that is only now beginning to turn to the figures of the past. Much of the pain of recent revelations comes from the realization that men we admired and saw as intellectual or artistic role models have repeatedly betrayed that trust, and the fact that the person in question is no longer alive shouldn’t exempt him from scrutiny. If anything, it’s only going to get harder from here, since we’re talking in many cases about literary giants whose behavior has been a matter of public record for decades. (Just last week, Orhan Pamuk, another Nobel laureate, mentioned Gide in the New York Times in an essay on the rise of nationalism in the West, but omitted any discussion of his personal life—and if you think that this isn’t relevant, try to imagine doing it now with a consideration of the ideas of, say, Israel Horovitz or Leon Wieseltier.) Here’s the conclusion of Gide’s “wonderful subject for a novel” that I quoted above:

The event that [X.] no longer dominates carries him along and it is almost passively that he witnesses his perdition. Unless he suddenly gets out of it by a sort of cowardice; for there are some who lack the courage to pursue their acts to their conclusion, without moreover being any more virtuous for this reason. On the contrary they come out diminished and with less self-esteem. This is why, everything considered, X. will persevere, but without any further desire, without joy and rather through fidelity. This is the reason why there is often so little happiness in crime—and what is called “repentance” is frequently only the exploitation of this.

This still seems to shed light on Trump and his enablers—but also on Harvey Weinstein and so many others. And it can’t just be swept under the rug.

Calder’s baggage


For most of the last week, I’ve been obsessively leafing through all of the multivolume biographies that I own, glancing over their endnotes, reading their acknowledgments, and marveling both at their sheer bulk and at the commitment of time that they require. You don’t need to be a psychologist to understand why. If all goes well, on Monday, I’ll be delivering a draft of Astounding to my editor. It’s a little anticlimactic—there’s plenty of rewriting to come, and I’m sending it out now mostly because that’s what it says in my contract. But it means, if nothing else, that I’m technically done, which I don’t take for granted. This project will have taken up three years of my life from initial conception to publication, which feels like a long time, although you don’t need to look far to find examples that dwarf it. (The champion here might be Muriel St. Clare Byrne, who spent fifty years on The Lisle Letters.) I would have happily worked for longer, and one of my readers rather deflatingly suggested, after reading a recent draft, that I ask my publisher for another year. But the more this kind of project drags out, the greater the chance that it won’t be finished at all, and on balance, I think it’s best for me to push ahead. The dust jacket of Robert A. Caro’s The Path to Power refers to it as “the first of the three volumes that will constitute The Years of Lyndon Johnson,” and we’re all still waiting patiently for number five to take us even as far as Vietnam. Much the same thing happened with John Richardson’s massive life of Picasso, which was originally supposed to be just one book, only to be touted later as an “exceedingly detailed yet readable three-volume life.” Richardson is currently at work on the fourth volume, which only follows Picasso up through World War II, with three decades still left to be covered. When recently asked if he thought he would ever get to a fifth, the author replied: “Listen, I’m ninety-one—I don’t think I have time for that.”

These days, such books are testing the limits of mortality, not just for authors and editors, but possibly for print media itself. When Caro published The Path to Power back in 1982, it would have been impossible to anticipate the changes in publishing that were looming on the horizon, and perhaps the arrival of another doorstopper about Lyndon Johnson every decade or so provides us with a sentimental connection to an earlier era of books. Yet the multivolume life seems more popular than ever, at least among major publishers. In the latest issue of The New Yorker, Adam Gopnik issues a mild protest against “the multivolume biography of the single-volume life”:

In the nineteenth century, the big sets were usually reserved for the big politicians. Disraeli got seven volumes and Gladstone three, but the lives of the poets or the artists or even the scientists tended to be enfolded within the limits of a single volume. John Forster’s life of Dickens did take its time, and tomes, but Elizabeth Gaskell kept Charlotte Brontë within one set of covers, and Darwin got his life and letters presented in one compact volume, by his son. The modern mania for the multivolume biography of figures who seem in most ways “minor” may have begun with Michael Holroyd’s two volumes devoted to Lytton Strachey, who was wonderful and influential but a miniaturist perhaps best treated as such. Strachey, at least, talked a lot and had a vivid sex life. But we are now headed toward a third volume of the life of Bing Crosby, and already have two volumes on Dai Vernon, the master card magician (a master, yes, but of card magic). This season, the life of Alexander Calder, toymaker to the modernist muses, arrives in the first volume of what promises to be two.

Gopnik seems bemused by the contrast between the size of Jed Perl’s Calder: The Conquest of Time: The Early Years: 1898-1940, which is seven hundred pages long, and the delicacy of the mobiles on which its subject’s reputation rests. And although he asks why we seem to be seeing more such efforts, which come off as oddly anachronistic at a time when publishing as a whole is struggling, he doesn’t really answer his own question. I can think of a few possible reasons. The most plausible explanation, I suspect, is that there’s an economic incentive to extending a life over multiple volumes, as long as the publisher is reasonably confident that an audience for it exists. If you’re the sort of person who would buy a huge biography of Alexander Calder at all, you’re probably going to buy two, and the relationship between the number of volumes and the rate of return—even after you account for time, production costs, and the loss of readers turned off by its size or lack of completion—might be narrowly positive. (You might think that these gains would be offset by the need to pay the author more money, but that probably isn’t the case. Looking at the acknowledgments for Richardson’s A Life of Picasso, it seems clear that his years of work were largely underwritten by outside sources, including nothing less than the John Richardson Fund for Picasso Research, set up by Sid and Mercedes Bass.) There’s a psychological side to this. As our online reading habits become divided into ever smaller particles of attention, perhaps we’re more drawn to these huge tomes as a sort of counterbalance, whether or not we have any intention of reading them. Publishing is as subject to the blockbuster mentality as any other art form, and it may well be that a book of fourteen hundred pages on Calder has a greater chance of reaching readers than one of three hundred pages would.

This kind of logic isn’t altogether unfamiliar in the art world, and Gopnik identifies a similar trend in Calder’s career, in which “the early sense of play gave way to dulled-down, chunk-of-metal-in-a-plaza heaviness.” Bigger can seem better for certain books as well, and biographers fill pages in the only way that they can. As Gopnik writes:

Calder’s is not a particularly dramatic life—he was neither much of a talker nor a prolific lover. In broad strokes, the career follows the customary arc of a modern artist, going from small, animated Parisian experiments, in the twenties, and ending with big, dull American commissions fifty years later—and though we are hungry to get him, we are not perhaps hungry to get him at quite this length. A dubious density of detailing—“In Paris, Calder had to wait an hour for his luggage, which he had checked through in London”—of the kind inevitable to such multivolume investigations may daunt even the reader who was eager at the start.

And that image of Calder waiting an hour for his luggage is one that every biographer should regard with dread. (It belongs on the same shelf as the line from Allan Folsom’s The Day After Tomorrow that Anthony Lane quoted to illustrate the accretion of procedural detail that deadens so many thrillers: “Two hundred European cities have bus links with Frankfurt.”) Not every big book suffers from this tendency—I don’t think that many readers wish that The Power Broker were shorter, even if its size discourages others from starting in the first place. And some lives do benefit from multiple books delivered over the course of many years. But they can also put readers in the position of waiting for more baggage—and when it comes at last, they’re the ones who get to decide whether or not it was worth it.

We lost it at the movies


Over a decade ago, the New Yorker film critic David Denby published a memoir titled American Sucker. I read it when it first came out, and I honestly can’t remember much about it, but there’s one section that has stuck in my mind ever since. Denby is writing of his obsession with investing, which has caused him to lose much of what he once loved about life, and he concludes sadly:

Well, you can’t get back to that. Do your job, then. After much starting and stopping, and considerable shifting of clauses, all the while watching the Nasdaq run above 5,000 on the CNNfn website, I put together the following as the opening of a review.

It happens to be his piece on Steven Soderbergh’s Erin Brockovich, which begins like this:

In Erin Brockovich, Julia Roberts appears in scene after scene wearing halter tops with a bit of bra showing; there’s a good bit of leg showing, too, often while she’s holding an infant on one arm. This upbeat, inspirational melodrama, based on a true story and written by Susannah Grant and directed by Steven Soderbergh, has been brought to life by a movie star on a heavenly rampage. Roberts swings into rooms, ablaze with indignation, her breasts pushed up and bulging out of the skimpy tops, and she rants at the people gaping at her. She’s a mother and a moral heroine who dresses like trailer trash but then snaps at anyone who doesn’t take her seriously—a real babe in arms, who gets to protect the weak and tell off the powerful while never turning her back on what she is.

Denby stops to evaluate his work: “Nothing great, but not bad either. I was reasonably happy with it as a lead—it moves, it’s active, it conveys a little of my pleasure in the picture. I got up and walked around the outer perimeter of the twentieth floor, looking west, looking east.”

I’ve never forgotten this passage, in part because it represents one of the few instances in which a prominent film critic has pulled back the curtain on an obvious but rarely acknowledged fact—that criticism is a genre of writing in itself, and that the phrases with which a movie is praised, analyzed, or dismissed are subject to the same sort of tinkering, revision, and doubt that we associate with other forms of expression. Critics are only human, even if they sometimes try to pretend that they aren’t, as they present their opinions as the product of an unruffled sensibility. I found myself thinking of this again as I followed the recent furor over David Edelstein’s review of Wonder Woman in New York magazine, which starts as follows:

The only grace note in the generally clunky Wonder Woman is its star, the five-foot-ten-inch Israeli actress and model Gal Gadot, who is somehow the perfect blend of superbabe-in-the-woods innocence and mouthiness. She plays Diana, the daughter of the Amazon queen Hippolyta (Connie Nielsen) and a trained warrior. But she’s also a militant peacenik. Diana lives with Amazon women on a mystically shrouded island but she’s not Amazonian herself. She was, we’re told, sculpted by her mother from clay and brought to life by Zeus. (I’d like to have seen that.)

Edelstein was roundly attacked for what was perceived as the sexist tone of his review, which also includes such observations as “Israeli women are a breed unto themselves, which I say with both admiration and trepidation,” and “Fans might be disappointed that there’s no trace of the comic’s well-documented S&M kinkiness.” He responded with a private Facebook post, widely circulated, in which he wrote: “Right now I think the problem is that some people can’t read.” And he has since written a longer, more apologetic piece in which he tries to explain his choice of words.

I haven’t seen Wonder Woman, although I’m looking forward to it, so I won’t wade too far into the controversy itself. But when I look at these two reviews—which, significantly, are about films focusing on different sorts of heroines—I see some striking parallels. It isn’t just the echo of “a real babe in arms” with “superbabe-in-the-woods,” or how Brockovich “gets to protect the weak and tell off the powerful” while Diana is praised for her “mouthiness.” It’s something in the rhythm of their openings, which start at a full sprint with a consideration of a movie star’s appearance. As Denby says, “it moves, it’s active,” almost to a fault. Here are three additional examples, taken at random from the first paragraphs of reviews published in The New Yorker:

Gene Wilder stares at the world with nearsighted, pale-blue-eyed wonder; he was born with a comic’s flyblown wig and the look of a reddish creature from outer space. His features aren’t distinct; his personality lacks definition. His whole appearance is so fuzzy and weak he’s like mist on the lens.

There is a thick, raw sensuality that some adolescents have which seems almost preconscious. In Saturday Night Fever, John Travolta has this rawness to such a degree that he seems naturally exaggerated: an Expressionist painter’s view of a young role. As Tony, a nineteen-year-old Italian Catholic who works selling paint in a hardware store in Brooklyn’s Bay Ridge, he wears his heavy black hair brushed up in a blower-dried pompadour. His large, wide mouth stretches across his narrow face, and his eyes—small slits, close together—are, unexpectedly, glintingly blue and panicky.

As Jake La Motta, the former middleweight boxing champ, in Raging Bull, Robert De Niro wears scar tissue and a big, bent nose that deform his face. It’s a miracle that he didn’t grow them—he grew everything else. He developed a thick-muscled neck and a fighter’s body, and for the scenes of the broken, drunken La Motta he put on so much weight that he seems to have sunk in the fat with hardly a trace of himself left.

All of these reviews were written, of course, by Pauline Kael, who remains the movie critic who has inspired the greatest degree of imitation among her followers. And when you go back and read Denby and Edelstein’s openings, they feel like Kael impersonations, which is the mode on which a critic tends to fall back when he or she wants to start a review so that “it moves, it’s active.” Beginning with a description of the star, delivered in her trademark hyperaware, slightly hyperbolic style, was one of Kael’s stock devices, as if she were observing an animal seen in the wild and frantically jotting down her impressions before they faded. It’s a technical trick, but it’s a good one, and it isn’t surprising that Kael’s followers like to employ it, consciously or otherwise. It’s when a male critic uses it to describe the appearance of a woman that we run into trouble. (The real offender here isn’t Denby or Edelstein, but Anthony Lane, Kael’s successor at The New Yorker, whose reviews have the curious habit of panning a movie for a page and a half, and then pausing a third of the way from the end to rhapsodize about the appearance of a starlet in a supporting role, which is presented as its only saving grace. He often seems to be leering at her a little, which is possibly an inadvertent consequence of his literary debt to Kael. When Lane says of Scarlett Johansson, “She seemed to be made from champagne,” he’s echoing the Kael who wrote of Madeline Kahn: “When you look at her, you see a water bed at just the right temperature.”) Kael was a sensualist, and to the critics who came after her, who are overwhelmingly male, she bequeathed a toolbox that is both powerful and susceptible to misuse when utilized reflexively or unthinkingly. 
I don’t think that Edelstein is necessarily sexist, but he was certainly careless, and in his routine ventriloquism of Kael, which to a professional critic comes as easily as breathing, he temporarily forgot who he was and what movie he was reviewing. Kael was the Wonder Woman of film critics. But when we try to channel her voice, and we can hardly help it, it’s worth remembering—as another superhero famously learned—that with great power comes great responsibility.

A visit to the chainmaker


In the landmark study The Symbolist Movement in Literature by the critic Arthur Symons, there’s a short chapter titled “A Note on Zola’s Method.” Even if you’ve never gotten around to reading Émile Zola—and I confess that I haven’t—it’s an essay that every writer should take to heart. After describing the research that Zola devoted to his novel L’Assommoir, Symons launches a brutal attack on the value of this kind of work:

[Zola] observes with immense persistence, but his observation, after all, is only that of the man in the street; it is simply carried into detail, deliberately…And so much of it all is purely unnecessary, has no interest in itself and no connection with the story: the precise details of Lorilleux’s chainmaking, bristling with technical terms…Goujet’s forge, and the machinery in the shed next door; and just how you cut out zinc with a large pair of scissors.

We’ve all read stories in which the writer feels obliged to include every last bit of research, and Symons’s judgment of this impulse is deservedly harsh:

To find out in a slang dictionary that a filthy idea can be expressed by an ingeniously filthy phrase…is not a great feat, or, on purely artistic grounds, altogether desirable. To go to a chainmaker and learn the trade name of the various kinds of chain which he manufactures, and of the instruments with which he manufactures them, is not an elaborate process, or one which can be said to pay you for the little trouble which it no doubt takes. And it is not well to be too certain after all that Zola is always perfectly accurate in his use of all this manifold knowledge.

And the most punishing comparison is yet to come: “My main contention is that Zola’s general use of words is, to be quite frank, somewhat ineffectual. He tries to do what Flaubert did, without Flaubert’s tools, and without the craftsman’s hand at the back of the tools. His fingers are too thick; they leave a blurred line. If you want merely weight, a certain kind of force, you get it; but no more.” It’s the difference, Symons observes, between the tedious accumulation of detail, in hopes that its sheer weight will somehow make the scene real, and the one perfect image that will ignite a reader’s imagination:

[Zola] cannot leave well alone; he cannot omit; he will not take the most obvious fact for granted…He tells us particularly that a room is composed of four walls, that a table stands on its four legs. And he does not appear to see the difference between doing that and doing as Flaubert does, namely, selecting precisely the detail out of all others which renders or consorts with the scene in hand, and giving that detail with an ingenious exactness.

By way of illustration, Symons quotes the moment in Madame Bovary in which Charles turns away at the exact moment that his first wife dies, which, he notes, “indicates to us, at the very opening of the book, just the character of the man about whom we are to read so much.” And he finishes with a devastating remark that deserves to be ranked alongside Mark Twain’s classic demolition of James Fenimore Cooper: “Zola would have taken at least two pages to say that, and, after all, he would not have said it.”

Flaubert, of course, is usually seen as the one shining example of a writer whose love of research enhanced his artistry, rather than diminishing it. In his takedown of a very different book, Allan Folsom’s thriller The Day After Tomorrow, the critic Anthony Lane cites one typical sentence—“Two hundred European cities have bus links with Frankfurt”—and adds:

When Flaubert studied ancient Carthage for Salammbô, or the particulars of medieval falconry for “The Legend of St. Julien Hospitalier,” he was furnishing and feathering a world that had already taken shape within his mind; when Allan Folsom looks at bus timetables, his book just gets a little longer.

Even Flaubert’s apparent mistakes, on closer examination, turn out to be controlled by an almost inhuman attentiveness. In his novel Flaubert’s Parrot, Julian Barnes quotes a line from the literary critic Enid Starkie: “Flaubert does not build up his characters, as did Balzac, by objective, external description; in fact, so careless is he of their outward appearance that on one occasion he gives Emma brown eyes; on another deep black eyes; and on another blue eyes.” When the narrator, who shouldn’t be confused with Barnes himself, goes back to the text, he finds that Flaubert, in fact, describes Emma’s eyes with meticulous precision. In their first appearance, he writes: “In so far as she was beautiful, this beauty lay in her eyes: although they were brown, they would appear black because of her lashes.” A little later on: “They were black when she was in shadow and dark blue in full daylight.” And just after her seduction, as Emma looks in the mirror: “Her eyes had never been so large, so black, nor contained such depth.” Barnes’s narrator concludes: “It would be interesting to compare the time spent by Flaubert making sure that his heroine had the rare and difficult eyes of a tragic adulteress with the time spent by Dr. Starkie in carelessly selling him short.”

This level of diligent observation is a universe apart from the mechanical gathering of detail, and there’s no question that writers should aim for one, not the other. But to some extent, we all pay visits to the chainmaker—that is, we conduct research aimed at furnishing our stories with material that we can’t get from personal experience. Sometimes we even get this information from books. (Tolstoy seems to have derived all of the information about the Freemasons in War and Peace from his reading, which scandalizes some critics, as if they’ve caught him in an embarrassing breach of etiquette.) If an author’s personality is strong enough, it can transmute this secondhand material into something more. John Updike turned this into a calling card, moving methodically through a series of adulterous white male protagonists who were distinguished mostly by their different jobs. In U and I, Nicholson Baker tries to call this a flaw: “He gives each of his male characters a profession, and then he has him think in metaphors drawn from that profession. That’s not right.” But after approvingly quoting one of the metaphors that emerge from the process, Baker changes his mind:

Without Updike’s determination to get some measure of control over his constant instinct to fling outward with a simile by filtering his correspondences through the characters’ offstage fictional professions, he would probably not have come up with this nice little thing, dropped as it is into the middle of a paragraph.

I like that phrase “measure of control,” which gets at the real point of research. It isn’t to pad out the story, but to channel it along lines that wouldn’t have occurred to the author otherwise. Research can turn into a set of chains in itself. But after all the work is done, the writer should be able to say, like Dylan Thomas in “Fern Hill”: “I sang in my chains like the sea.”

The art of the bad review



Yesterday, while writing about the pitfalls of quotation in book reviews, I mentioned the famous smackdown that Martin Amis delivered to the novel Hannibal by Thomas Harris. When I went back to look up the lines I wanted to quote, I found myself reading the whole thing over again, just for the simple pleasure of it. It’s one of the great critical slams of all time, and it checks off most of the boxes that this kind of shellacking requires. Amis begins by listing a few hyperbolic claims made by other reviewers—“A momentous achievement,” “A plausible candidate for the Pulitzer Prize”—and then skewers them systematically. He comes at the novel, significantly, from a position of real respect: Amis calls himself “a Harris fan from way back.” Writing of the earlier books in the series, he says that Harris has achieved what every popular novelist hopes to accomplish: “He has created a parallel world, a terrible antiterra, airless and arcane but internally coherent.” When Amis quotes approvingly from these previous installments, it can only make Hannibal look worse by comparison, although Harris doesn’t do himself any favors:

[Lecter] has no need of “need”: Given the choice, he—and Harris—prefer to say “require”…Out buying weapons—or, rather, out “purchasing” weapons—he tells the knife salesman, “I only require one.” Why, I haven’t felt such a frisson of sheer class since I last heard room service say “How may I assist you?” And when Lecter is guilty of forgetfulness he says “Bother”—not “Shit” or “Fuck” like the rest of us. It’s all in the details.

Reading the review again, I realized that it falls squarely in the main line of epic takedowns that begins with Mark Twain’s “Fenimore Cooper’s Literary Offenses.” This is a piece that was probably ruined for a lot of readers by being assigned to them in high school, but it deserves a fresh look: it really is one of the funniest and most valuable essays about writing we have, and I revisit it every couple of years. Like Amis, Twain begins by quoting some of his target’s puffier critical encomiums: “The five tales reveal an extraordinary fullness of invention…The craft of the woodsman, the tricks of the trapper, all the delicate art of the forest were familiar to Cooper from his youth up.” (In response, Twain proposes the following rule: “That crass stupidities shall not be played upon the reader as ‘the craft of the woodsman, the delicate art of the forest’ by either the author or the people in the tale.”) Both Twain and Amis are eager to go after their subjects with a broadsword, but they’re also alert to the nuances of language. For Amis, it’s the subtle shading of pretension that creeps in when Harris writes “purchases” instead of “buys”; for Twain, it’s the distinction between “verbal” and “oral,” “precision” and “facility,” “phenomena” and “marvels,” “necessary” and “predetermined.” His eighteen rules of writing, deduced in negative fashion from Cooper’s novels, are still among the best ever assembled. He notes that one of the main requirements of storytelling is “that the personages in a tale shall be alive, except in the case of corpses, and that always the reader shall be able to tell the corpses from the others.” Which, when you think about it, is even more relevant in Harris’s case—although that’s a subject for another post.


I’ve learned a lot from these two essays, and it made me reflect on the bad reviews that have stuck in my head over the years. In general, a literary critic should err on the side of generosity, especially when it comes to his or her contemporaries, and a negative review of a first novel that nobody is likely to read is an expense of spirit in a waste of shame. But occasionally, a bad review can be just as valuable and memorable as any other form of criticism. I may not agree with James Wood’s feelings about John le Carré, but I’ll never forget how he sums up a passage from Smiley’s People as “a clever coffin of dead conventions.” Once a year or so, I’ll find myself remembering John Updike’s review of Tom Wolfe’s A Man in Full, which notes the author’s obsession with muscular male bodies—“the latissimi dorsi,” “the trapezius muscles”—and catalogs his onomatopoetics, which are even harder to take seriously when you have to type them all out:

“Brannnnng! Brannnnng! Brannnnng!,” “Woooo-eeeeeee! Hegh-heggghhhhhh,” “Ahhhhhhhhhhh ahhhhhhhhhhhh ahhhhhhhhhhh,” “Su-puerflyyyyyyyyyyyyyyyy!,” “eye eye eye eye eye eye eye eye eye,” “Scrack scrack scrack scraccckkk scraccccck,” “glug glug glug glugglugglug,” “Awriiighhhhhhhht!”

And half of my notions as a writer seem to have been shaped by a single essay by Norman Mailer, “Some Children of the Goddess,” in which he takes careful aim at most of his rivals from the early sixties. William Styron’s Set This House on Fire is “the magnum opus of a fat spoiled rich boy who could write like an angel about landscape and like an adolescent about people”; J.D. Salinger’s four novellas about the Glass family “seem to have been written for high-school girls”; and Updike himself writes “the sort of prose which would be admired in a writing course overseen by a fussy old nance.”

So what makes a certain kind of negative review linger in the memory long after the book in question has been forgotten? It often involves one major writer taking aim at another, which is already more interesting than the sniping of a critic who knows the craft only from the outside. In most cases, it picks on a potential competitor, which is a target worthy of the writer’s efforts. And there’s usually an undercurrent of wounded love: the best negative reviews, like the one David Foster Wallace wrote on Updike’s Toward the End of Time, reflect a real disillusionment with a former idol. (Notice, too, how so many of the same names keep recurring, as if Mailer and Updike and Wolfe formed a closed circle that runs forever, like a perpetual motion machine of mixed feelings.) Even when there’s no love lost between the critic and his quarry, as with Twain and Cooper, there’s a sense of anger at the betrayal of storytelling by someone who should know better. To return to poor Thomas Harris, I’ll never forget the New Yorker review by Anthony Lane that juxtaposed a hard, clean excerpt from The Silence of the Lambs:

“Lieutenant, it looks like he’s got two six-shot .38s. We heard three rounds fired and the dump pouches on the gunbelts are still full, so he may just have nine left. Advise SWAT it’s +Ps jacketed hollowpoints. This guy favors the face.”

With this one from Hannibal Rising:

“I see you and the cricket sings in concert with my heart.”
“My heart hops at the sight of you, who taught my heart to sing.”

Lane reasonably responds: “What the hell is going on here?” And that’s what all these reviews have in common—an attempt by one smart, principled writer to figure out what the hell is going on with another.

You are here



Remember when you were watching Star Wars: The Force Awakens and Adam Driver took off his mask, and you thought you were looking at some kind of advanced alien? You don’t? That’s strange, because it says you did, right here in Anthony Lane’s review in The New Yorker:

So well is Driver cast against type here that evil may turn out to be his type, and so extraordinary are his features, long and quiveringly gaunt, that even when he removes his headpiece you still believe that you’re gazing at some form of advanced alien.

I’m picking on Lane a little here, because the use of the second person is so common in movie reviews and other types of criticism—including this blog—that we hardly notice it, any more than we notice the “we” in this very sentence. Film criticism, like any form of writing, evolves its own language, and using that insinuating “you,” as if your impressions had melded seamlessly with the critic’s, is one of its favorite conventions. (For instance, in Manohla Dargis’s New York Times review of the same film, she says: “It also has appealingly imperfect men and women whose blunders and victories, decency and goofiness remind you that a pop mythology like Star Wars needs more than old gods to sustain it.”) But who is this “you,” exactly? And why has it started to irk me so much?

The second person has been used by critics for a long time, but in its current form, it almost certainly goes back to Pauline Kael, who employed it in the service of images or insights that could have occurred to no other brain on the planet, as when she wrote of Madeline Kahn in Young Frankenstein: “When you look at her, you see a water bed at just the right temperature.” This tic of Kael’s has been noted and derided for almost four decades, going back to Renata Adler’s memorable takedown in the early eighties, in which she called it “the intrusive ‘you’” and noted shrewdly: “But ‘you’ is most often Ms. Kael’s ‘I,’ or a member or prospective member of her ‘we.’” Adam Gopnik later said: “It wasn’t her making all those judgments. It was the Pop Audience there beside her.” And “the second-person address” clearly bugged Louis Menand, too, although his dislike of it was somewhat undermined by the fact that he internalized it so completely:

James Agee, in his brief service as movie critic of The Nation, reviewed many nondescript and now long-forgotten pictures; but as soon as you finish reading one of his pieces, you want to read it again, just to see how he did it…You know what you think about Bonnie and Clyde by now, though, and so [Kael’s] insights have lost their freshness. On the other hand, she is a large part of the reason you think as you do.


Kael’s style was so influential—I hear echoes of it in almost everything I write—that it’s no surprise that her intrusive “you” has been unconsciously absorbed by the generations of film critics that followed. If it bothers you as it does me, you can quietly replace it throughout with “I” without losing much in the way of meaning. But that’s part of the problem. The “you” of film criticism conceals a neurotic distrust of the first person that prevents critics from honoring their opinions as their own. Kael said that she used “you” because she didn’t like “one,” which is fair enough, but there’s also nothing wrong with “I,” which she wasn’t shy about using elsewhere. To a large extent, Kael was forging her own language, and I’m willing to forgive that “you,” along with so much else, because of the oceanic force of the sensibilities to which it was attached. But separating the second person from Kael’s unique voice and turning it into a crutch to be indiscriminately employed by critics everywhere yields a more troubling result. It becomes a tactic that distances the writer slightly from his or her own judgments, creating an impression of objectivity and paradoxical intimacy that has no business in a serious review. Frame these observations in “I,” and the critic would feel more of an obligation to own them and make sense of them; stick them in a convenient “you,” and they’re just one more insight to be tossed off, as if the critic happened to observe it unfolding in your brain and can record it here without comment.

Obviously, there’s nothing wrong with wanting to avoid the first person in certain kinds of writing. It rarely has a place in serious reportage, for instance, despite the efforts of countless aspiring gonzo journalists who try to do what Norman Mailer, Hunter S. Thompson, and only a handful of others have ever done well. (It can even plague otherwise gifted writers: I was looking forward to Ben Lerner’s recent New Yorker piece about art conservation, but I couldn’t get past his insistent use of the first person.) But that “I” absolutely belongs in criticism, which is fundamentally a record of a specific viewer, listener, or reader’s impressions of his or her encounter with a piece of art. All great critics, whether they use that “you” or not, are aware of this, and it can be painful to read a review by an inexperienced writer that labors hard to seem “objective.” But if our best critics so often fall into the “you” trap, it’s a sign that even they aren’t entirely comfortable with giving us all of themselves, and I’ve started to see it as a tiny betrayal—meaningful or not—of what ought to be the critic’s intensely personal engagement with the work. And if it’s only a tic or a trick, then we sacrifice nothing by losing it. Replace that “you” with “I” throughout, making whatever other adjustments seem necessary, and the result is heightened and clarified, with a much better sense of who was really sitting there in the dark, feeling emotions that no other human being would ever feel in quite the same way.

“And this has something to do with Operation Pepel?”

leave a comment »

"A few of the files talk about a poison program..."

Note: This post is the forty-first installment in my author's commentary for City of Exiles, covering Chapter 40. You can read the earlier installments here.

As I’ve written here elsewhere, research in fiction is less about factual accuracy than a way of dreaming. Fiction, like a dream, isn’t assembled out of nothing: it’s an assimilation and combination of elements that we’ve gathered in our everyday lives, in stories we hear from friends, in our reading and consumption of other works of art, and through the conscious investigation of whatever world we’ve decided to explore. This last component is perhaps the most crucial, and probably the least appreciated. Writers vary in the degree of novelistic attention they can bring to their surroundings at any one time, but most of us learn to dial it down: it’s both exhausting and a little unfair to life itself to constantly be mining for material. When we commence work on a project, though, our level of engagement rises correspondingly, to the point where we start seeing clues or messages everywhere we look. Research is really just a way of taking that urge for gleaning or bricolage and making it slightly more systematic, exposing ourselves to as many potential units of narrative as we can at a time when we’re especially tuned to such possibilities.

The primordial function of research—of "furnishing and feathering a world," in Anthony Lane's memorable phrase—is especially striking when it comes to details that would never be noticed by the average reader. Few of us would care whether or not the fence at No. 7 Eccles Street could really be climbed by an ordinary man, but for James Joyce, it was important enough for him to write his aunt to confirm it. If we're thinking only in terms of the effect on readers, this kind of meticulous accuracy can start to seem a little insane, but from the author's point of view, it makes perfect sense. For most of the time we spend living with a novel, the only reader whose opinion matters is our own, and a lot of research consists of the author convincing himself that the story he's describing could really have taken place. In order to lose ourselves in the fictional dream, the smallest elements have to seem persuasive to us, and even if a reader couldn't be expected to know that we've fudged or invented a detail that we couldn't verify elsewhere, we know it, and it subtly affects how deeply we can commit ourselves to the story we're telling. A reader may never notice a minor dishonesty, but the writer will always remember it.

"And this has something to do with Operation Pepel?"

In my own fiction, I’ve tried to be as accurate as I can even in the smallest things. I keep a calendar of the major events in the story, and I do my best to square it with such matters as railway schedules, museum hours, and the times for sunrise and sunset. (As Robert Louis Stevenson wrote: “And how troublesome the moon is!”) I walk the locations of each scene whenever possible, counting off the steps and figuring out how long it would take a character to get from one point to another, and when I can’t go there in person, I spend a long time on Google Street View. It may seem like a lot of trouble, but it actually saves me work in the long run: being able to select useful details from a mass of existing material supplements the creative work that has to be done, and I’m always happier to take something intact from the real world than to have to invent it from scratch. And I take a kind of perverse pleasure in the knowledge that a reader wouldn’t consciously notice any of it. At best, these details serve as a kind of substratum for the visible events of the story, and tiny things add up to a narrative that is convincing in its broadest strokes. There’s no guarantee that such an approach will work, of course, but it’s hard to make anything work without it.

In City of Exiles, for instance, I briefly mention something called Operation Pepel, which is described as a special operation by Russian intelligence that occurred in Turkey in the sixties. Operation Pepel did, in fact, exist, even if we don't know much about who was involved or what it was: I encountered it thanks to a passing reference, amounting to less than a sentence, in the monumental The Sword and the Shield by Christopher Andrew and Vasili Mitrokhin. (It caught my eye, incidentally, only because I'd already established that part of the story would center on an historical event involving Turkey, which is just another illustration of how parts of the research process can end up informing one another across far-flung spaces.) Later, I tie Operation Pepel—purely speculatively—to elements of the Soviet poison program, and the details I provide on such historical events as Project Bonfire are as accurate as I can make them. None of this will mean anything even to most specialists in the history of Russia, and I could easily have made up something that would have served just as well. But since I invent so much elsewhere, and so irresponsibly, it felt better to retain as many of the known facts as I could. It may not matter to the reader, but it mattered a lot to me…

Written by nevalalee

July 24, 2014 at 9:44 am

“Karvonen headed for the platform…”

leave a comment »

"Karvonen headed for the platform..."

Note: This post is the twenty-seventh installment in my author's commentary for City of Exiles, covering Chapter 26. You can read the earlier installments here.

These days, we think of an "airport novel" as a thick little paperback sold at Hudson News, designed to give travelers in business class a few hours of diversion, a category in which my own books have occasionally been classified. In the past, though, it meant exactly what it said: a novel in which much of the action took place in airports. They emerged in the Mad Men era, when air travel was accessible for the first time to large swaths of the population, and even if you couldn't afford a ticket on Pan Am, you could buy a book in which the glamour of modern transportation was evident on every page. If I were doing academic research on what it was like to travel in the sixties and seventies, I'd turn first to the likes of Arthur Hailey and Robert Ludlum, and much the same holds true of thrillers today. Suspense novels engage in such loving descriptions of the railway terminals, airline lounges, and private planes that the characters use to get from one point to another that they double as a stealth advertisement for stylish travel. Hence the Falcon 2000EX corporate jet with its dual Pratt & Whitney engines that pops up randomly in The Da Vinci Code, or the line in Allan Folsom's The Day After Tomorrow that Anthony Lane thought was the most boring sentence imaginable: "Two hundred European cities have bus links with Frankfurt."

Why do thrillers love this sort of thing? In part, it’s just a particular example of the suspense novel’s usual fascination with hardware, which, as I’ve argued elsewhere, is both designed to appeal to readers who like a side of facts with their violence and to enhance the verisimilitude of an otherwise implausible story. But there’s also something especially attractive about transportation itself. Thrillers, especially those that center on the chase, are often about moving a character from point A to point B—ideally with his adversaries in hot pursuit—and the means by which he gets to his destination inevitably takes up a large part of the narrative. Here, as in so much else, the template was set by Frederick Forsyth in The Day of the Jackal, in which the antihero of the title spends much of his time ingeniously circumventing various forms of transit security. In thrillers, as I’ve said elsewhere, movement across geography often stands as a surrogate or metaphor for narrative motion, and the protagonist’s progress in physical space mirrors the act of turning the pages. Such stories are a sequence of arrivals and departures, and it’s no accident that so many of them, including The Icon Thief, began with a key character arriving at passport control.

"His passport had not been scanned..."

When I was in London doing research for City of Exiles, I bought a ticket to Brussels, boarded the train, spent maybe three hours in Belgium, then came back in time to spend the night at my hotel room near King's Cross. I wasn't even particularly interested in Brussels per se: once I arrived, I spent a rainy afternoon doing little more than wandering around until it was time to head back again, although I did make a pilgrimage to the Royal Museums to see The Death of Marat, which had played an important role in the epilogue of the previous novel. What I really cared about was the terminal and the train itself. I knew that much of Part II would consist of Karvonen's journey to Helsinki, and while I wasn't able to take the entire trip myself, I wanted to at least be able to describe its beginning and end. Before leaving for London, I had mapped out his itinerary as best I could, using travel guides and online railway schedules, and I knew more or less where he'd be and when, although I wasn't entirely sure what would happen there. That was one of the peculiar things about this trip: it took place before I'd even outlined most of the novel, so I had to single out specific locations, neighborhoods, and landmarks in hopes that I'd find a place for them later.

The total cost of the trip was about three hundred dollars, all for the sake of a page or two of detail, which counts as one of my priciest expenses per word of material. (Still, the champion here is probably what I dropped on Philippe Duboy’s ridiculous book Lequeu, which I bought for $125 in hopes of finding a few tidbits that I could use in The Icon Thief, only to end up not using a word of it.) But it was money well spent. My discoveries included such minutiae as the look of the Eurostar terminal at St. Pancras, the security and immigration procedures, and the seating arrangements on the train itself. Some of this was important to the plot—I wanted to see how hard it would be for Karvonen to get certain items past security, and whether or not his passport would be scanned on his departure—but for the most part, it served as a kind of background murmur of authenticity against which more interesting events would take place. None of this should be visible to the reader, but its absence would be noticed, at least subconsciously. If nothing else, it seemed necessary that I see it for myself, if only so I could forget about it when the time came to write the scene. In the overall scheme of the story, the train itself is much less important than where Karvonen is going. But it’s good that we travel with him at least part of the way…

Facts with a side of violence

with 2 comments


Over the last few weeks, I’ve been rereading The Dogs of War by Frederick Forsyth, my favorite suspense novelist. I’ve mentioned before that Forsyth is basically as good as it gets, and that he’s the writer I turn to the most these days in terms of pure enjoyment: he operates within a very narrow range of material and tone, but on those terms, he always delivers. Reading The Dogs of War again was a fascinating experience, because although it takes place in the world of mercenaries and other guns for hire, it contains surprisingly little action—maybe thirty pages’ worth over the course of four hundred dense pages. The rest of the novel is taken up by an obsessively detailed account of how, precisely, a privately funded war might be financed and equipped, from obtaining weapons to hiring a ship to acquiring the necessary amount of shirts and underwear. And although the amount of information is sometimes overwhelming, it’s always a superlatively readable book, if only because Forsyth is a master of organization and clarity.

Of course, it also works because it’s fun to learn about these things. The Dogs of War is perhaps the ultimate example of the kind of fiction that Anthony Lane, speaking of Allan Folsom’s The Day After Tomorrow, has dismissed as “not so much a novel as a six-hundred-page fact sheet with occasional breaks for violence.” Yet the pleasure we take in absorbing a few facts while reading a diverting thriller is perfectly understandable. Recently, I saw a posting on a social news site from a commenter who said that he didn’t read much, but was looking for novels that would teach him some things while telling an interesting story. I pointed him toward Michael Crichton, who is one of those novelists, like Forsyth, whose work has inspired countless imitators, but who remains the best of his breed. This kind of fiction is easy to dismiss, but conveying factual information to a reader is like any other aspect of writing: when done right, it can be a source of considerable satisfaction. In my own novels, I’ve indulged in such tidbits as how to build a handheld laser, how to open a Soviet weapons cache, and what exactly happened at the Dyatlov Pass.


That said, like all good things, the desire to satisfy a reader’s craving for information can also be taken too far. I’ve spoken elsewhere about the fiction of Irving Wallace, who crams his books with travelogues, dubious factoids, and masses of undigested research—along with a few clinical sex scenes—until whatever narrative interest the story once held is lost. And my feelings about Dan Brown are a matter of record. Here, as in most things, the key is balance: information can be a delight, but only in the context of a story that the reader finds engaging for the usual reasons. Its effectiveness can also vary within the work of a single author. Forsyth is great, but the weight of information in some of his later novels can be a little deadening; conversely, I’m not a fan of Tom Clancy, and gave up on The Cardinal of the Kremlin after struggling through a few hundred pages, but I found Without Remorse to be a really fine revenge story, hardware and all. The misuse of factual information by popular novelists has given it a bad reputation, but really, like any writing tool, it just needs to be properly deployed.

And it’s especially fascinating to see how this obsession with information—in a somewhat ambivalent form—has migrated into literary fiction. It’s hard to read Thomas Pynchon, for instance, without getting a kick from his mastery of everything from Tarot cards to aeronautical engineering, and James Wood points out that we see much the same urge in Jonathan Franzen:

The contemporary novel has such a desire to be clever about so many elements of life that it sometimes resembles a man who takes so many classes that he has no time to read: auditing abolishes composure. Of course, there are readers who will enjoy the fact that Franzen fills us in on campus politics, Lithuanian gangsters, biotech patents, the chemistry of depression, and so on…

Yet Franzen, like Pynchon, uses voluminous research to underline his point about how unknowable the world really is: if an author with the capacity to write limericks about the vane servomotor feels despair at the violent, impersonal systems of which we’re all a part, the rest of us don’t stand a chance. Popular novelists, by contrast, use information for the opposite reason, to flatter us that perhaps we, too, would make good mercenaries, if only we knew how to forge an end user certificate for a shipment of gun parts in Spain. In both cases, the underlying research gives the narrative a credibility it wouldn’t otherwise have. And the ability to use it correctly, according to one’s intentions, is one that every writer could stand to develop.

“He checked the assembled device, then switched it on…”

leave a comment »

(Note: This post is the seventeenth installment in my author’s commentary for The Icon Thief, covering Chapter 16. You can read the earlier installments here.)

Suspense novelists love information. The tradition of loading a thriller with arcane detail, especially involving exotic weaponry and the nuts and bolts of various cloak-and-dagger activities, goes back a long time, but probably reached its high point with The Day of the Jackal, the most memorable sections of which recount the acquisition, testing, and use of a deadly assassin’s rifle, as well as serving as a comprehensive manual of passport fraud. No one has ever done it better than Forsyth does here—including Forsyth himself—but we all keep trying. As I’ve mentioned before, this peculiar urge to combine the content of an action movie with the tone of a PowerPoint presentation can lead to unintentionally humorous accretions of detail, as in the famous line from Allan Folsom’s The Day After Tomorrow that critic Anthony Lane has called one of the most boring sentences ever written: “Two hundred European cities have bus links with Frankfurt.” And at its worst, as in many of Tom Clancy’s novels, the level of minutiae can render the underlying story unreadable.

So why do we do it? The obvious explanation is that showing your work in certain ways is designed to appeal to the traditional readers of big suspense novels—hence the emphasis on firearms, spycraft, and modes of transportation. (One could assemble a trainspotter's guide to Europe entirely from the descriptions of continental railway stations in countless suspense novels, including mine.) At the same time, the fact that we see different kinds of arcana in books aimed at other audiences—think of the forensic expertise in the novels of Patricia Cornwell—makes me think that the impulse amounts to more than a mere fascination with hardware. Information, in the thriller, functions as a kind of synecdoche for the overall plot: by describing functionally minor elements of the story with apparent expertise, the author subliminally persuades us that major aspects of the novel are equally accurate. The Day of the Jackal may be wildly implausible in its larger details, but we wouldn't know it, because Forsyth describes that rifle so well.

This is all preface to explaining why I spend the better part of two pages in Chapter 16 of The Icon Thief describing how Ilya builds a handheld laser, MacGyver style, out of a flashlight and the diode from an optical drive. The details are accurate enough, as this sort of thing goes—you can watch someone build a similar device here—but on a structural level, the scene isn’t really necessary. I could have shown Ilya with the laser without any explanation, or, even better, dispensed with its construction in a sentence or two. Instead, I spend a fair amount of time on it, not so much to provide instructions on how to build a laser of your own, but because this kind of scene can be pleasurable for its own sake, and it adds to the verisimilitude of what I wanted to come off as a fairly realistic thriller, however outlandish it might be in other respects. (In fact, this is a good time to admit that I came up with the image of Ilya building the laser first, with only a general sense of how it would fit into the rest of the plot—and it went on, as we’ll see, to play an important role in the story at several crucial points.)

Perhaps most important of all, the scene tells us something about Ilya himself. This is the first time we've really seen him alone, and like his brief flashback later in the chapter to an exchange in Yekaterinburg, it reveals elements of his character that will pay off down the line: he's smart, methodical, and capable of doing a lot with limited resources. It's no accident that he builds his laser himself, with ordinary components: I don't have much interest in spyware of the James Bond variety, but I'm very interested in seeing characters solve problems when they have almost nothing to work with, which is what Ilya does, on a number of levels, throughout the entire story. And I don't think this impression would be conveyed nearly as well without the attention to detail we see in this scene. Information, in a thriller, can be a surprisingly useful tool for building characters, especially in a genre that tends to gravitate, for obvious reasons, toward individuals with a certain level of competence. The Jackal is his rifle, and Ilya, at least for the moment, is identified with the little gadget he builds here. And it's going to come in handy soon…

What makes a great critic?

with 10 comments

Although my life has since taken me in a rather different direction, for a long time, I was convinced that I wanted to be a film critic. My first paying job as a writer was cranking out movie reviews, at fifty dollars a pop, for a now-defunct college website, a gig that happily coincided with the best year for movies in my lifetime. Later, I spent the summer of 2001 writing capsule reviews for the San Francisco Bay Guardian, during a somewhat less distinguished era for film—my most memorable experience was interviewing Kevin Smith about Jay and Silent Bob Strike Back. After college, I tried to get work as a film critic in New York, only to quickly realize that reviewing movies for a print publication is one of the cushier jobs around, meaning that most critics don’t leave the position until they retire or die, and when they do, there’s usually someone in the office—often the television reporter—already waiting in the wings.

In the years since, the proliferation of pop cultural sites on the Internet has led to a mixed renaissance for critics of all kinds: there are more professional reviewers than ever before, but their influence has been correspondingly diluted. Critics have always been distrusted by artists, of course, but these days, they get it from both sides: for every working critic, there are a thousand commenters convinced that they can do a better job, and the rest of us are often swayed less by the opinions of individual writers than the consensus on Rotten Tomatoes, which is a shame. At its best, a critic’s body of work is a substantial accomplishment in its own right, and personalities as dissimilar as those of Pauline Kael, Roger Ebert, and David Thomson—speaking only of film, which is the area I know best—have created lasting legacies in print and online. And while the critical profession is still in a period of transition, the elements of great criticism haven’t changed since the days of James Agee, or even Samuel Johnson.

So what makes a good critic? Knowledge of the field, yes; enthusiasm for art, most definitely. (A critic without underlying affection for his chosen medium, or who sees it only as an excuse for snark, isn’t good for much of anything.) Above all else, it requires a curious mixture of the objective and the subjective. A critic needs to be objective enough to evaluate a work of art on its own terms—to review the work that the creator wanted to make, not the one that the critic wishes had been made instead—while also acknowledging that all good reviews are essentially autobiographical. Ebert has noted that his own criticism is written in the first person, and the most enduring critics are those who write, not as an authority delivering opinions from up on high, but as someone speaking to an intelligent friend. As a result, the collected works of critics like Ebert and Kael are the closest things we have these days to books that seem like living men or women, like Montaigne’s essays or The Anatomy of Melancholy. “Cut these words,” as Emerson said of Montaigne, “and they would bleed.”

Surveying the current crop of writers on the arts, my sense is that while we have many gifted critics, most of them fall short in one way or another. A critic like Anthony Lane, for all his intelligence, tends to treat the subject under consideration as an excuse for an arch bon mot (as with Star Trek: First Contact: "If you thought the Borg were bad, just wait till you meet the McEnroe.") And while his wit can be devastating when aimed at the right target—The Da Vinci Code, for instance, or the occupants of the New York Times bestseller list—it often betrays both too much self-regard and a lack of respect for the work itself. On the literary side, James Wood has a similar problem: he's a skilled parodist and mimic, but surely not every review obliges him to show off with one of his self-consciously clever pastiches. (If I were Chang-rae Lee, I'd still be mad about this.) The writers of the A.V. Club are more my style: in their pop cultural coverage, especially of television, they've struck a nice balance between enthusiasm, autobiography, and reader engagement. But I'm always looking for more. Which critics do you like?

Written by nevalalee

January 4, 2012 at 10:48 am

Andrew Stanton and the world beyond Pixar

leave a comment »

Art is messy, art is chaos—so you need a system.

Andrew Stanton, to the New Yorker

For the second time in less than six months, the New Yorker takes on the curious case of Pixar, and this time around, the results are much more satisfying. In May, the magazine offered up a profile of John Lasseter that was close to a total failure, since critic Anthony Lane's customary air of disdain left him unprepared to draw any useful conclusions about a studio that, at least up to that point, had gotten just about everything blessedly right. This week's piece by Tad Friend is far superior, focusing on the relatively unsung talents of Andrew Stanton, director of Finding Nemo and Wall-E. And while the publication of a fawning New Yorker profile of a hot creative talent rarely bodes well for his or her next project—as witness the recent articles on Tony Gilroy, Steve Carell, Anna Faris, or even Lasseter himself, whose profile only briefly anticipated the release of the underwhelming Cars 2—I'm still excited by Stanton's next project, the Edgar Rice Burroughs epic John Carter, which will serve as a crucial test as to whether Pixar's magic can extend to the world beyond animation.

Stanton’s case is particularly interesting because of the role he plays at the studio: to hear the article tell it, he’s Pixar’s resident storyteller. “Among all the top talent here,” says Jim Morris, the head of Pixar’s daily operations, “Andrew is the one who has a genius for story structure.” And what makes this all the more remarkable is the fact that Stanton seems to have essentially willed this talent into existence. Stanton was trained as an animator, and began, like most of his colleagues, by focusing on the visual side. As the script for Toy Story was being developed, however, he decided that his future would lie in narrative, and quietly began to train himself in the writer’s craft, reading classic screenplays—including, for some reason, the truly awful script for Ryan’s Daughter—and such texts as Lajos Egri’s The Art of Dramatic Writing. In the end, he was generally acknowledged as the senior writer at Pixar, which, given the caliber of talent involved, must be a heady position indeed.

And while the article is littered with Stanton's aphorisms on storytelling—"Inevitable but not predictable," "Conflict + contradiction," "Do the opposite"—his main virtue as a writer seems to lie in the most universal rule of all: "Be wrong fast." More than anything else, Stanton's success so far has been predicated on an admirable willingness to throw things out and start again. He spent years, for instance, working on a second act for Wall-E that was finally junked completely, and while I'm not sure he ever quite cracked the plot for that movie—which I don't think lives up to the promise of its first twenty minutes—there's no question that his ruthlessness with structure did wonders for Finding Nemo, which was radically rethought and reconceived several times over the course of production. Pixar, like the rest of us, is making things up as it goes along, but is set apart by its refusal to let well enough alone. As Stanton concludes:

We’re in this weird, hermetically sealed freakazoid place where everybody’s trying their best to do their best—and the films still suck for three out of the four years it takes to make them.

The real question, of course, is whether this approach to storytelling, with its necessary false starts and extensive rendering time, can survive the transition to live action, in which the use of real actors and sets makes retakes—and thus revision—drastically more expensive. So far, it sounds like John Carter is doing fine, at least judging from the trailer and early audience response, which has reportedly been encouraging. And more rides on this movie's success or failure than the fate of one particular franchise. Pixar's story has been extraordinary, but its most lasting legacy may turn out to be the migration of its talent beyond the safety zone of animation—assuming, of course, that their kung fu can survive. With Brad Bird's Mission: Impossible—Ghost Protocol and John Carter in the wings, we're about to discover if the directors who changed animation at Pixar can do the same in live action. The New Yorker article is fine, but it buries the lede: Stanton and Bird are the first of many. And if their next movies are half as entertaining as the ones they've made so far, we're looking at an earthquake in the world of pop culture.

Learning to curse

with 2 comments

You taught me language; and my profit on’t
Is, I know how to curse.

—Caliban, in The Tempest

We interrupt our run of quotes from famous bearded writers to consider the unexpectedly thorny problem of cursing in fiction. Years ago, this wasn’t an issue: you simply didn’t use profanity at all. These days, though, with the demise of the Hays Code, the Comics Code, and the Comstock Act, writers can say whatever they want, which is an unambiguous good. It’s absolutely necessary for authors to have as broad and expressive a vocabulary as possible, which includes occasional recourse to some of the most powerful words in any given language. We’ve thankfully moved past the stage when Norman Mailer was forced to use the word “fug” in The Naked and the Dead—which prompted Tallulah Bankhead’s quip, upon meeting Mailer for the first time: “So you’re the young man who can’t spell.”

Yet with great power comes great responsibility, as one famous product of the Comics Code would have us believe, and the fact remains that most writers in any medium have no idea how to curse. Sometimes, of course, you're naturally constrained: if you're writing for television, say, or for Analog, which strives to be suitable for bright teenagers, you tailor your vocabulary accordingly. Without those constraints, though, writers often fall into one of two extremes. A lot of screenwriters throw profanity around like punctuation, trying to equal the effect of Nicholas Pileggi and Martin Scorsese, who turned obscenity into record-setting poetry in Goodfellas and Casino, but the result is more often unspeakable dialogue. And a surprising number of suspense novelists run in the opposite direction, with books filled with detectives and soldiers who sound like Gwyneth Paltrow covering Cee-Lo Green. As Anthony Lane notes in his review of Clive Cussler's Inca Gold:

The plot is some farrago about buried treasure in the Andes, and the characters, though intended to be as tough as old boots, are not quite tough enough to curse properly. “Those fornicating baboons” is about as close as they get. The fruitful comparison here is with Judith Krantz, who I thought would be partial to soft-core euphemisms like “manhood” and “moistness” but never hesitates to call a fuck a fuck.

For the most part, my own writing is comparatively straitlaced, although you’d never guess this from looking at my custom dictionary in Word. Because Word doesn’t include profanity in its standard word list, and creates a new entry whenever I add an unfamiliar term to spellcheck, what you see in my custom dictionary, along with the usual proper names and technical terms, is an almost nonstop litany of filth, created over the course of ten years and 700,000 words of fiction. The result looks like something out of The Aristocrats, and it gives you a very skewed impression of my body of work, which is generally pretty clean—aside, of course, from an almost comically pervasive degree of violence, which goes with the territory, but is another issue entirely.

And my reticence about swearing has less to do with any real scruples than with doubts about my ability to use so fine an instrument. Our greatest artists of profanity, like Mamet, have raised the bar for everyone, and I don’t think I’ll ever be able to top a scene like Alec Baldwin’s electric seven minutes in Glengarry Glen Ross, or any of R. Lee Ermey’s scenes in Full Metal Jacket. (Although it’s worth noting that my favorite Mamet movie is rated G.) The upshot is that profanity is a subset of dialogue, and dialogue, profane or otherwise, is one of the hardest things for any writer to master. It needs to track on the page and read well out loud, while also being true to character, and bringing out the big guns of extreme profanity before you’ve mastered the basics will only draw attention to other shortcomings in style. As with nearly all else in writing, then, write naturally and unobtrusively, with an eye to character and clarity, and the rest will take care of itself. I swear.

Written by nevalalee

October 3, 2011 at 10:12 am

“Two hundred European cities have bus links with Frankfurt”

with 7 comments

Let’s say you’re reading a novel, perhaps a thriller, and while you wouldn’t say it’s a great book, you’re reasonably engaged by the plot and characters. The story is clipping along nicely, the author’s prose is clean and unobtrusive, and suddenly you’re brought up short by something like this:

He was sitting all alone in the enormous cabin of a Falcon 2000EX corporate jet as it bounced its way through turbulence. In the background, the dual Pratt & Whitney engines hummed evenly.

Hold on. What do those Pratt & Whitney engines have to do with anything? Is this a novel or an aircraft catalog? Well, it’s neither, at least not at the moment: rather, it’s an instance of a novelist being reluctant to part with a laboriously acquired piece of research. Suspense novelists are especially guilty of this sort of thing—the above example is from Dan Brown’s The Lost Symbol, admittedly not the most original target in the world—but it’s something that every writer needs to beware: the temptation to overload one’s fiction with factual detail, especially detail that was the result of a long and painful research process.

This tendency is easy to understand in historical and science fiction, in which so much energy has gone into researching a story set in another time and place, but it’s less obvious why it should also be so common in thrillers, which in other respects have become ever more streamlined. Anthony Lane, in an amusing article on the top ten books on the New York Times bestseller list of May 15, 1994, quotes a sentence from Allan Folsom’s thriller The Day After Tomorrow (the one about the Frankfurt bus lines), which he claims is the most boring clause in any of the books he’s read for his essay. He then says:

The odd thing about pedantry, however, is that it can’t be trusted. Many of the writers on this list are under the impression that if they do the factual spadework, the fiction will dig itself in and hunker down, solid and secure. The effect, unfortunately, is quite the opposite. It suggests that the writers are hanging on for grim life to what they know for fear of unleashing what they don’t know; they are frightened, in other words, of their own imagination…When Flaubert studied ancient Carthage for Salammbô, or the particulars of medieval falconry for “The Legend of St. Julien Hospitalier,” he was furnishing and feathering a world that had already taken shape within his mind; when Allan Folsom looks at bus timetables, his book just gets a little longer.

True enough. Lane is mistaken, though, when he blames this tendency, elsewhere in his article, on the work of James Michener, which consists of “gathering more research than any book could possibly need, then refusing to jettison a particle of it for the sake of dramatic form.” Michener is probably to blame for such excesses in historical fiction, but as far as thrillers are concerned, there’s another, more relevant culprit: Frederick Forsyth. Much of the pleasure of The Day of the Jackal (which Lane elsewhere claims to read once a year) comes from Forsyth’s expertise, real or cunningly feigned, in such matters as identity theft and the construction of an assassin’s rifle, which makes the less plausible elements of his novel all the more convincing. He’s so good at this, in fact, that legions of inferior writers have been seduced by his example. (Even Forsyth himself, in his later novels, isn’t entirely immune.)

Here, then, is the novelist’s dilemma: an appropriate amount of research will lure readers into the fictional dream, but too much will yank them out. So what’s a writer to do? The answer here, as in most other places, is that good habits of writing in general will trim away the worst of these particular excesses. For instance, Stephen King’s invaluable advice to cut all your drafts by ten percent applies twice as much to expository or factual passages. We haven’t discussed point of view yet, but by restricting each scene to the point of view of a particular character, you’re less likely to introduce extraneous information. And the endless labor of rereading, editing, and revision, once time has given you sufficient detachment from your own work, will gradually alert you to places where the research has begun to interfere with the underlying story.

There’s another place where excessive research can also be dangerous, and that’s in the writing process itself. Nearly every novel requires some degree of background material, but how much is too much? It’s always hard to say when research turns into procrastination, but here’s my own rule of thumb: two or three months of research is probably enough for the beginning of any project. Later on, you can always take a break to do more, and should certainly go back and check your facts once the novel is done, but any more than three months at the start, and you risk losing the momentum that encouraged you to write the novel in the first place. And once that momentum is gone, not even a Pratt & Whitney engine will get it back.
