Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

The electric dream

There’s no doubt who got me off originally and that was A.E. van Vogt…The basic thing is, how frightened are you of chaos? And how happy are you with order? Van Vogt influenced me so much because he made me appreciate a mysterious chaotic quality in the universe that is not to be feared.

—Philip K. Dick, in an interview with Vertex

I recently finished reading I Am Alive and You Are Dead, the French author Emmanuel Carrère’s novelistic biography of Philip K. Dick. In an article last year about Carrère’s work, James Wood of The New Yorker called it “fantastically engaging,” noting: “There are no references and very few named sources, yet the material appears to rely on the established record, and is clearly built from the same archival labor that a conventional biographer would perform.” It’s very readable, and it’s one of the few such biographies—along with James Tiptree, Jr. by Julie Phillips and a certain upcoming book—aimed at an intelligent audience outside the fan community. Dick’s life also feels relevant now in ways that we might not have anticipated two decades ago, when the book was first published in France. He’s never been as central to me as he has been for many other readers, mostly because of the accidents of my reading life, and I’ve only read a handful of his novels and stories. I’m frankly more drawn to his acquaintance and occasional correspondent Robert Anton Wilson, who ventured into some of the same dark places and returned with his sanity more or less intact. (One notable difference between the two is that Wilson was a more prolific experimenter with psychedelic drugs, which Dick, apart from one experience with LSD, appears to have avoided.) But no other writer, with one notable exception that I’ll mention below, has done a better job of forcing us to confront the possibility that our understanding of the world might be fatally flawed. And it’s quite possible that he serves as a better guide to the future than any of the more rational writers who populated the pages of Astounding.

What deserves to be remembered about Dick, though, is that he loved the science fiction of the golden age, and he’s part of an unbroken chain of influence that goes back to the earliest days of the pulps. In I Am Alive and You Are Dead, Carrère writes of Dick as a young boy: “He collected illustrated magazines with titles like Astounding and Amazing and Unknown, and these periodicals, in the guise of serious scientific discussion, introduced him to lost continents, haunted pyramids, ships that vanished mysteriously in the Sargasso Sea.” (Carrère, weirdly, puts a superfluous exclamation point at the end of the titles of all these magazines, which I’ve silently removed in these quotations.) Dick continued to collect pulps throughout his life, keeping the most valuable issues in a fireproof safe at his house in San Rafael, California, which was later blown open in a mysterious burglary. Throughout his career, Dick refers casually to classic stories with an easy familiarity that suggests a deep knowledge of the genre, as in a line from his Exegesis, in which he mentions “that C.L. Moore novelette in Astounding about the two alternative futures hinging on which of two girls the guy marries in the present.” But the most revealing connection lies in plain sight. In a section on Dick’s early efforts in science fiction, Carrère writes:

Stories about little green men and flying saucers…were what he was paid to write, and the most they offered in terms of literary recognition was comparison to someone like A.E. van Vogt, a writer with whom Phil had once been photographed at a science fiction convention. The photo appeared in a fanzine above the caption “The Old and the New.”

Carrère persistently dismisses van Vogt as a writer of “space opera,” which might be technically true, though hardly the whole story. Yet he was also the most convincing precursor that Dick ever had. The World of Null-A may be stylistically cruder than Dick at his best, but it also appeared in Astounding in 1945, and it remains so hallucinatory, weird, and undefinable that I still have trouble believing that it was read by twelve-year-olds. (As Dick once said of it in an interview: “All the parts of that book do not add up; all the ingredients did not make a coherency. Now some people are put off by that. They think it’s sloppy and wrong, but the thing that fascinated me so much was that this resembled reality more than anybody else’s writing inside or outside science fiction.”) Once you see the almost apostolic line of succession from van Vogt to Alfred Bester to Dick, the latter seems less like an anomaly within the genre than like an inextricable part of its fabric. Although he only sold one short story, “Impostor,” to John W. Campbell, Dick continued to submit to him for years, before concluding that it wasn’t the best use of his time. As Eric Leif Davin recounts in Partners in Wonder: “[Dick] said he’d rather write several first-draft stories for one cent a word than spend time revising a single story for Campbell, despite the higher pay.” And Dick recalled in his collection The Minority Report:

Horace Gold at Galaxy liked my writing whereas John W. Campbell, Jr. at Astounding considered my writing not only worthless but as he put it, “Nuts.” By and large I liked reading Galaxy because it had the broadest range of ideas, venturing into the soft sciences such as sociology and psychology, at a time when Campbell (as he once wrote me!) considered psionics a necessary premise for science fiction. Also, Campbell said, the psionic character in the story had to be in charge of what was going on.

As a result, the two men never worked closely together, although Dick had surprising affinities with the editor who believed wholeheartedly in psionics, precognition, and genetic memory, and whose magazine never ceased to play a central role in his inner life. In his biography, Carrère provides an embellished version of a recurring dream that Dick had at the age of twelve, “in which he found himself in a bookstore trying to locate an issue of Astounding that would complete his collection.” As Dick describes it in his autobiographical novel VALIS:

In the dream he again was a child, searching dusty used-book stores for rare old science fiction magazines, in particular Astoundings. In the dream he had looked through countless tattered issues, stacks upon stacks, for the priceless serial entitled “The Empire Never Ended.” If he could find it and read it he would know everything; that had been the burden of the dream.

Years later, the phrase “the empire never ended” became central to Dick’s late conviction that we were all living, without our knowledge, in the Rome of the Acts of the Apostles. But the detail that sticks with me the most is that the magazines in the dream were “in particular Astoundings.” The fan Peter Graham famously said that the real golden age of science fiction was twelve, and Dick reached that age at the end of 1940, at the peak of Campbell’s editorship. The timing was perfect for Astounding to rewire his brain forever. When Dick first had his recurring dream, he would have just finished reading a “priceless serial” that had appeared in the previous four issues of the magazine, and I’d like to think that he spent the rest of his life searching for its inconceivable conclusion. It was van Vogt’s Slan.

Quote of the Day

One London lunchtime many years ago, the late poet and editor Ian Hamilton was sitting at his usual table in a Soho pub called the Pillars of Hercules…A pale, haggard poet entered, and Hamilton offered him a chair and a glass of something. “Oh no, I just can’t keep drinking,” said the weakened poet. “I must give it up. It’s doing terrible things to me. It’s not even giving me any pleasure any longer.” But Hamilton, narrowing his eyes, responded to this feebleness in a tone of weary stoicism and said in a quiet, hard voice: “Well, none of us likes it.”

—James Wood, The Irresponsible Self

Written by nevalalee

May 18, 2018 at 7:30 am

An awkward utilitarianism

Two decades ago, the critic James Wood published a scathing review in The New Republic of James Atlas’s biography of Saul Bellow. Wood acknowledged that the book was “very diligent,” but he found that it suffered from at least two fatal flaws. The first was that it was insufficiently reverent toward the novelist whom Wood considered “the greatest writer of American prose of the twentieth century,” a shortcoming that he framed in amusingly petty terms: “[Atlas] writes of Bellow as if he were writing a life of Joyce Carol Oates or Richard Ford, some middler who oddly managed to bag the Nobel Prize.” And a page or so later: “Atlas proceeds as if he were writing the life of Stanley Elkin, not the unfolding of a will-to-greatness.” His second objection was that Atlas had paid undue attention to the unpleasant details of Bellow’s personal life. After quoting from a speech that Bellow once gave at his birthplace—“We are people capable of freedom, and some of us are even willing to take chances for the sake of freedom”—Wood made an extraordinary argument:

A biographer should write the history of this passage to freedom, should see that a superior soul with superior gifts has to be accounted for. It is an elitist assumption, no doubt; but without such an assumption the biography of a great writer leaks away its rationale. Bellow’s “sins”—how he treated his wives, and how self-regarding he was—were committed in the process of creating an imperishable body of work. It is not so much that they should be “forgiven,” whatever this means, than that they must be judged in the light of the work of which we are the beneficiaries. An awkward but undeniable utilitarianism must be in play: the number of people hurt by Bellow is probably no more than can be counted on two hands, yet he has delighted and consoled and altered the lives of thousands of readers.

It’s fair to say that the final sentence—which could be applied equally well to, say, James Levine or Roman Polanski—probably wouldn’t fly today. But it’s worth looking at some of the “sins” that caused Wood to recoil so strongly. He doesn’t cite any specific passage from Atlas’s biography, but he must have been thinking of moments like this, which concerns Bellow and his second wife Sondra Tschacbasov:

On Labor Day, Bellow came to pick up [his son Adam], but Sondra wouldn’t let him go. Bellow alleged that she tore his clothes and “bruised” him. “He beat me up,” Sondra countered, claiming she was “bedridden for a week. Did I give him a slap? I did. But he retaliated violently—more than once.”

This doesn’t make for pleasant reading, regardless of your feelings toward Bellow himself. Just two years ago, however, the scholar Zachary Leader published the first bulky volume of The Life of Saul Bellow, a massive undertaking that was widely seen as a respectful corrective to Atlas’s work. (The second half, which covers the last four decades of Bellow’s life, is due later this year.) In the course of his research, Leader was allowed to read an unpublished memoir by Tschacbasov, in which she gives a graphically detailed version of the same incident: “He was spoiling for it, I could see his tense lip and twitch that always telegraphed a simmering rage…I slapped him and he grabbed me by the ponytail and swung me around punching me with his other hand. I was bruised for a week and took out a restraining order.” And in a letter that Tschacbasov wrote to her lawyer shortly afterward, she describes her injuries as “severe bone bruises behind one ear, cuts on my left temple and left eyelid, and a bad bruise on my left breast. My scalp is a mess of lumps and bruises.”

As Principal Skinner once said to Superintendent Chalmers: “Oh. That’s much worse.” And remember, this is from the biography that was supposed to rehabilitate Bellow’s reputation. (It also includes an account of an incident of which Tschacbasov wrote to Bellow: “As you know, you dragged me from the car by my hair across the lawn, kicked me and whipped me with your cap.”) Leader spends much of his discussion of this episode parsing whether Tschacbasov’s slap—which she didn’t mention to her lawyer—could be “mistaken for an attack,” and he concludes: “Both parties were shading the truth.” He also apologetically explains that he’s only bringing up these accusations at all “because they are part of the life Bellow lived as he wrote Herzog.” In the finished novel, which is clearly based on the end of Bellow’s marriage, Herzog merely fantasizes about beating up his wife Madeleine, who is leaving him for another man:

Herzog…pictured what might have happened if instead of listening so intensely and thoughtfully he had hit Madeleine in the face. What if he had knocked her down, clutched her hair, dragged her screaming and fighting around the room, flogged her until her buttocks bled. What if he had! He should have torn her clothes, ripped off her necklace, brought his fists down on her head.

“In early versions of the novel, Herzog uses physical force on Madeleine,” Leader writes, referring us in a short footnote to another study of the most autobiographical of American novelists—and then he just moves on. As far as I can tell, none of the reviews of Leader’s biography, and there were a lot, dealt with this material at any length. Of course, that was two years ago, and if we haven’t gotten around to Bellow yet, as we have to André Gide, it’s only because it hasn’t occurred to us. He can get in line. Which is a form of utilitarianism in itself.

And I’d like to think that James Wood might have second thoughts now about his “awkward but undeniable utilitarianism,” or at least about its undeniability. Learning to deny it is largely what the events of the last six months have been about, and it matters what our most prominent literary critic thinks about our greatest novelist, even—or especially—if their relationship was even closer than they let on. In The Shadow in the Garden, James Atlas’s book on the art of biography, he refers to Wood as one of Bellow’s three “nonconsanguineous” sons, and he notes of the critic’s negative review of a memoir by the novelist’s actual son Greg Bellow:

At least Wood was upfront about his partisanship: he mentioned that he had co-taught a course with Bellow at Boston University. And if you looked back at a tribute in The New Republic Wood had written eight years earlier, just after Bellow’s death, it emerged that they had been close friends: their daughters had played together; Wood and Bellow had played piano (Wood) and recorder (Bellow) duets. And they grew still closer toward the end: “In the final year of Bellow’s life, as he became very frail, I would read some of his own prose to him.”

It’s hard for anyone to acknowledge the worst about a man whom he loved—but it’s equally true that if our current moment can’t force James Wood to rethink Saul Bellow, then it might not be worth as much as we hope. It can’t just be an excuse to find more reasons to hate Brett Ratner. We have to look closely at the men who might be our fathers. It’s worth noting that along with Wood, Atlas lists two other men as Bellow’s three surrogate sons. One was Martin Amis. The other was Leon Wieseltier, Wood’s editor at The New Republic, who was accused last year of decades of sexual harassment, and who also wrote admiringly after Bellow’s death: “I always had the feeling about Saul that he was inwardly at war, that he breakfasted with his demons.”

The art of the bad review

Note: I’m taking a few days off for the holidays, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on September 7, 2016.

Every few years, whenever my spirits need a boost, I go back and read the famous smackdown that Martin Amis delivered to the novel Hannibal by Thomas Harris, just for the simple pleasure of it. It’s one of the great savage reviews of all time, and it checks off most of the boxes that this sort of shellacking requires. Amis begins by listing the hyperbolic claims made by other reviewers—“A momentous achievement,” “A plausible candidate for the Pulitzer Prize”—and then skewering them systematically. But he also goes after the novel, significantly, from a position of respect, calling himself “a Harris fan from way back.” Writing of the earlier books in the series, he says that Harris has achieved what every popular novelist hopes to accomplish: “He has created a parallel world, a terrible antiterra, airless and arcane but internally coherent.” When Amis quotes approvingly from the previous installments, it can only make Hannibal look worse by comparison, although Harris doesn’t do himself any favors. As Amis writes:

[Lecter] has no need of “need”: Given the choice, he—and Harris—prefer to say “require”…Out buying weapons—or, rather, out “purchasing” weapons—he tells the knife salesman, “I only require one.” Why, I haven’t felt such a frisson of sheer class since I last heard room service say “How may I assist you?” And when Lecter is guilty of forgetfulness he says “Bother”—not “Shit” or “Fuck” like the rest of us. It’s all in the details.

Amis’s review falls squarely in the main line of epic takedowns that began with Mark Twain’s “Fenimore Cooper’s Literary Offenses.” This is a piece that was probably ruined for a lot of readers by being assigned in high school, but it deserves a fresh look: it’s one of the funniest and most valuable essays about writing that we have, and I revisit it on a regular basis. Like Amis, Twain begins by quoting some of the puffier encomiums offered by other critics: “[Cooper’s] five tales reveal an extraordinary fullness of invention…The craft of the woodsman, the tricks of the trapper, all the delicate art of the forest were familiar to Cooper from his youth up.” (Twain proposes the following rule in response: “Crass stupidities shall not be played upon the reader as ‘the craft of the woodsman, the delicate art of the forest’ by either the author or the people in the tale.”) Both Twain and Amis are eager to go after their subjects with a broadsword, but they’re also alert to the nuances of language. For Amis, it’s the subtle shading of pretension that creeps in when Harris writes “purchases” instead of “buys”; for Twain, it’s the distinction between “verbal” and “oral,” “precision” and “facility,” “phenomena” and “marvels,” “necessary” and “predetermined.” His eighteen rules of writing, deduced in negative fashion from Cooper’s novels, are still among the best ever assembled. He notes that one of the main requirements of storytelling is “that the personages in a tale shall be alive, except in the case of corpses, and that always the reader shall be able to tell the corpses from the others.” Which, when you think about it, is even more relevant in Harris’s case—although that’s a subject for another post.

I’ve learned a lot from these two essays, as I have from other bad reviews that have stuck in my head over the years. In general, a literary critic should err on the side of generosity, especially when it comes to his or her contemporaries, and a negative review of a first novel that nobody is likely to read is an expense of spirit in a waste of shame. But occasionally, a bad review can be just as valuable and memorable as any other form of criticism. I may not agree with James Wood’s feelings about John le Carré, but I’ll never forget how he sums up a passage from Smiley’s People as “a clever coffin of dead conventions.” Once a year or so, I’ll find myself remembering John Updike’s review of Tom Wolfe’s A Man in Full, which notes the author’s obsession with muscular male bodies—“the latissimi dorsi,” “the trapezius muscles”—and catalogs his onomatopoetics, which are even harder to take seriously when you have to type them all out:

“Brannnnng! Brannnnng! Brannnnng!,” “Woooo-eeeeeee! Hegh-heggghhhhhh,” “Ahhhhhhhhhhh ahhhhhhhhhhhh ahhhhhhhhhhh,” “Su-puerflyyyyyyyyyyyyyyyy!,” “eye eye eye eye eye eye eye eye eye,” “Scrack scrack scrack scraccckkk scraccccck,” “glug glug glug glugglugglug,” “Awriiighhhhhhhht!”

And half of my notions as a writer seem to have been shaped by a single essay by Norman Mailer, “Some Children of the Goddess,” in which he takes careful aim at most of his rivals from the early sixties. William Styron’s Set This House on Fire is “the magnum opus of a fat spoiled rich boy who could write like an angel about landscape and like an adolescent about people”; J.D. Salinger’s four novellas about the Glass family “seem to have been written for high-school girls”; and Updike himself writes “the sort of prose which would be admired in a writing course overseen by a fussy old nance.”

So what makes a certain kind of negative review linger in the memory for longer than the book it describes? It often involves one major writer taking aim at another, which is already more interesting than the sniping of a critic who knows the craft only from the outside. In most cases, it picks on a target worthy of the writer’s efforts. And there’s usually an undercurrent of wounded love: the best negative reviews, like the one David Foster Wallace delivered on Updike’s Toward the End of Time, or Renata Adler’s demolition of Pauline Kael, reflect a real disillusionment with a former idol. (Notice, too, how so many of the same names keep recurring, as if Mailer and Updike and Wolfe formed a closed circle that runs forever, in a perpetual motion machine of mixed feelings.) Even when there’s no love lost between the critic and his quarry, as with Twain and Cooper, there’s a sense of anger at the betrayal of storytelling by someone who should know better. To return to poor Thomas Harris, I’ll never forget the New Yorker review by Anthony Lane that juxtaposed a hard, clean excerpt from The Silence of the Lambs:

“Lieutenant, it looks like he’s got two six-shot .38s. We heard three rounds fired and the dump pouches on the gunbelts are still full, so he may just have nine left. Advise SWAT it’s +Ps jacketed hollowpoints. This guy favors the face.”

With this one from Hannibal Rising:

“I see you and the cricket sings in concert with my heart.”
“My heart hops at the sight of you, who taught my heart to sing.”

Lane reasonably responds: “What the hell is going on here?” And that’s what all these reviews have in common—an attempt by one smart, principled writer to figure out what the hell is going on with another.

The act of noticing

Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on September 24, 2014.

Capturing The Goldfinch

with 10 comments

Donna Tartt

Last week, I finally finished Donna Tartt’s The Goldfinch, something like six months after I first picked it up. This protracted reading period wasn’t entirely the book’s fault: I’ve been so preoccupied by work and family, and plain exhausted at night, that I’ve rarely had a chance to sit down and read more than a few pages at a time. And there’s no question that a page or two of The Goldfinch goes down as smooth and easy as a vanilla milkshake. After a hundred more, though, you find yourself in much the same place as you started, and as painless as it is, you start to wonder if it’s all really worth it. Its narrator, Theo Decker, may be the most passive protagonist I’ve ever encountered in a mainstream novel, and for grindingly long stretches, the novel traps you in the same kind of stasis. Over the course of more than seven hundred pages, Theo undertakes maybe three meaningful actions, and he spends the rest of the book in a riot of noticing, unspooling dense paragraphs of details and quirks and brand names. And it’s all true to his character. After surviving a bombing in New York that claimed his mother’s life, Theo spends the next decade in a state of paranoid numbness, a condition that would result in exactly the book we have here.

That doesn’t sound like a potential bestseller, but The Goldfinch has been a true phenomenon, moving over a million copies in hardcover on its way to a Pulitzer Prize. Part of its success has to do with how it keeps the pages turning, even through huge chunks of nonaction, and this is all to Tartt’s credit—to a point. Yet there’s no avoiding a sense that twenty or even fifty pages at a time could be lifted out of the book’s middle sections without anyone noticing. If it were a deliberate attempt to replicate Theo’s shell-shocked brain, it would be a considerable literary achievement, but I have a sneaking suspicion that the causal arrow ran in the opposite direction. If Theo comes off as passive, it’s because the book around him fails to find a convincing shape for itself, not the other way around. Tartt is a writer of huge merits: when she’s on fire, as during the lengthy section in Las Vegas, she can deliver set pieces that rank with the best that contemporary fiction has to offer. And her book doesn’t lack for eventfulness. But the incidents don’t build so much as accumulate, like Tartt’s fat descriptive paragraphs, and I have a feeling that a lot of readers emerge in agreement with what Samuel Johnson said about Milton: “Paradise Lost is one of the books which the reader admires and lays down, and forgets to take up again. None ever wished it longer than it is.”

The Goldfinch by Donna Tartt

Which is the real reason it took me six months to read, when I might have polished off a more focused—or shorter—version of the same story over a long weekend. But I don’t mean to echo those critics, like James Wood of The New Yorker or Francine Prose of The New York Review of Books, who see the success of The Goldfinch as a symptom of a wider decline in literary standards. They seem to regret that Tartt didn’t write a different novel entirely, but as today’s quote from Christian Friedrich Hebbel reminds us, that’s a pernicious form of criticism. A novel, like a poem, deserves to be judged on the author’s intentions. (Wood is accurate, though, when he points out that Tartt’s American characters “move through a world of cozy Britishisms, like ‘they tucked into their food,’ ‘you look knackered,’ ‘crikey,’ ‘skive off,’ and ‘gobsmacked.'” It reminds me of what Lost in Space actor Jonathan Harris reportedly said when asked if he was British: “Oh no, my dear, just affected.”) But I’m not sure Tartt succeeds at the kind of novel she evidently wanted to write. I take a lot of interest in the intersection between literary and mainstream fiction: it’s where I see myself, even if my published novels skew more to the genre side. And I’d love to see Tartt pull it off, as she did, more or less, with The Secret History. But as eventful as The Goldfinch is, Tartt never convinces me that she knows how to construct a plot that would justify the investment of time it demands. And that’s a shame.

There’s a great deal of craft, obviously, involved in writing a huge, mostly readable novel through the eyes of a character who abdicates all responsibility for his fate, and who plays a minimal part in his own story’s resolution. Tartt refined the manuscript for eleven years, and she apparently wrote and discarded entire sections that required months of work. This may be part of the reason why The Goldfinch sometimes reads like a novel with its focus in all the wrong places: not just on Theo, who is the least compelling character in sight, but on the parts of his life it chooses to dramatize. (There’s a gutsy jump in time, effective in itself, that unfortunately skips over the single most interesting thing Theo ever does: he decides to become a con artist, which must have required considerable skill and ingenuity, but everything he attempts in that line is kept offstage, and instead, we’re treated to one chapter after another of Theo as a useless sad sack.) Tartt’s effort and accomplishment show on every page, but I can’t shake a nagging sense that this is the kind of book that Stephen King, one of the novel’s fans, could have cranked out in a year or so with less fuss. The result looks a lot like the kind of novel that many readers dream of finding, a great read of real literary heft, and it poses convincingly as one from sentence to sentence. But we can do better, and so can Tartt. A Pulitzer and a million copies sold aren’t likely to convince her of this—but I hope she takes another crack at it, and sooner than ten years from now.

The middle ground

leave a comment »

The Mad Men episode "In Care Of"

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What series are you waiting to dive into until you can do it all at once?”

Yesterday, while leafing through a recent issue of The New Yorker, I came across the following lines in a book review by James Wood:

[Amit Chaudhuri] has struggled, as an Indian novelist writing in English, with the long shadow of Salman Rushdie’s Booker-winning novel Midnight’s Children…and with the notion, established in part by the success of that book, that fictional writing about Indian life should be noisy, magical, hybrid, multivocally “exotic”—as busy as India itself…He points out that in the Bengali tradition “the short story and novella have predominated at least as much as the novel,” and that there are plenty of Indian writers who have “hoped to suggest India by ellipsis rather than by all-inclusiveness.”

Wood, who is no fan of the “noisy, magical, hybrid” form that so many modern novels have assumed, draws an apt parallel to “the ceaseless quest for the mimetically overfed Great American Novel.” But an emphasis on short, elliptical fiction has been the rule, rather than the exception, in our writing programs for years. And a stark division between big and small seems to be true of most national literatures: think of Russia, for instance, in which Eugene Onegin stands as the only real rival, as a secular scripture, to the loose, baggy monsters of Tolstoy and Dostoyevsky.

Yet most works of art, inevitably, end up somewhere in the middle. If we don’t tend to write essays or dissertations about boringly midsized novels, which pursue their plot and characters for the standard three hundred pages or so, it’s for much the same reason that we don’t hear much about political moderates: we may be in the majority, but it isn’t news. Our attention is naturally drawn to the extreme, which may be more interesting to contemplate, but which also holds the risk that we’ll miss the real story by focusing on the edges. When we think about film editing, for instance, we tend to focus on one of two trends: the increasingly rapid rate of cutting, on the one hand, and the fetishization of the long take, on the other. In fact, the average shot length has been declining at a more or less linear rate ever since the dawn of the sound era, and over the last quarter of a century, it’s gone from about five seconds to four—a change that is essentially imperceptible. The way a movie is put together has remained surprisingly stable for more than a generation, and whatever changes of pace we do find are actually less extreme than we might expect from the corresponding technical advances. Digital techniques have made it easier than ever to construct a film out of very long or very short shots, but most movies still fall squarely in the center of the bell curve. And in terms of overall length, they’ve gotten slightly longer, but not by much.

Emilia Clarke on Game of Thrones

That’s true of other media as well. Whenever I read think pieces about the future of journalism, I get the impression that we’ve been given a choice between the listicle and the longread: either we quickly skim a gallery of the top ten celebrity pets, or we devote an entire evening to scrolling through a lapbreaker like “Snow Fall.” Really, though, most good articles continue to fall in the middle ground; it’s just hard to quantify what makes the best ones stand out, and it’s impossible to reduce it to something as simple as length or format. Similarly, when it comes to what we used to call television, the two big stories of the last few years have been the dueling models of Vine and Netflix: it seems that either we can’t sit still for more than six seconds at a time, or we’re eager to binge on shows for hours and hours. There are obvious generational factors at play here—I’ve spent maybe six seconds total on Vine—but the division is less drastic than it might appear. In fact, I suspect that most of us still consume content in the way we always have, in chunks of half an hour to an hour. Mad Men was meant to be seen like this; so, in its own way, was Community, which bucked recent trends by releasing an episode per week. But it isn’t all that interesting to talk about how to make a great show that looks more or less like the ones that have come before, so we don’t hear much about it.

Which isn’t to say that the way we consume and think about media hasn’t changed. A few years ago, the idea of waiting to watch a television show until its entire run was complete might have seemed ridiculous; now, it’s an option that many of us seriously consider. (The only series I’ve ever been tempted to wait out like this was Lost, and it backfired: once I got around to starting it, the consensus was so strong that it went nowhere that I couldn’t bring myself to get past the second season.) But as I’ve said before, it can be a mistake for a television show—or any work of art—to proceed solely with that long game in mind, without the pressure of engaging with an audience from week to week. We’re already starting to see some of the consequences in Game of Thrones, which thinks entirely in terms of seasons, but often forgets to make individual scenes worth watching on a level beyond, “Oh, let’s see what this guy is doing.” But a show that focuses entirely on the level of the scene or moment can sputter out after a few seasons, or less: Unbreakable Kimmy Schmidt had trouble sustaining interest in its own premise for even thirteen episodes. The answer, as boring as it may be, lies in the middle, or in the narratives that think hard about telling stories in the forms that have existed before, and will continue to exist. The extremes may attract us. But it’s in the boring middle ground that the future of an art form is made.

“History often had plans of its own…”

leave a comment »

"According to legend..."

Note: This post is the sixteenth installment in my author’s commentary for Eternal Empire, covering Chapter 17. You can read the previous installments here.

“A genre is hardening,” the literary critic James Wood wrote fifteen years ago, in his enormously influential New Republic essay “Hysterical Realism.” It’s the set of conventions, he observed, that we see in so many big, ambitious novels published in the last few decades: they’re crammed with plot and information, and they often take a greater interest in how social and political systems work than in the inner lives of their own characters. Dickens provides the original model, with Pynchon setting the standard, followed by the likes of Rushdie, Wallace, and DeLillo. Wood quotes Zadie Smith, who says that she’s concerned with “ideas and themes that I can tie together—problem-solving from other places and worlds,” and who goes on to state:

[It’s not the writer’s job] to tell us how somebody feels about something, it’s to tell us how the world works…These are guys who know a great deal about the world. They understand macro-microeconomics, the way the Internet works, math, philosophy, but…they’re still people who know something about the street, about family, love, sex, whatever. That is an incredibly fruitful combination. If you can get the balance right. And I don’t think any of us have quite yet, but hopefully one of us will.

Wood, as the title of his essay implies, isn’t a fan. He notes, accurately, that this kind of “realism” can serve as an evasion of reality itself: it allows writers to retreat, fashionably, from the unglamorous consideration of the genuine emotions of real men and women. And even if you’re determined to work within that genre, the challenge, as Smith says, is balance. An ambitious literary novel these days is expected to move between two or more registers: the everyday interactions of its characters and the larger social context—meticulously researched and imagined—in which the human story takes place. Shifting between these levels is a hard technical problem, and we can feel the strain even in good novels. In Smith’s White Teeth, Wood sees “an instructive squabble…between these two literary modes,” and a book like The Corrections gains much of its interest from the tension between these kinds of storytelling. Jonathan Franzen, who is as smart a writer as they come, has as much trouble as anyone with managing those transitions: all too often, we end up with passages that read, as Norman Mailer puts it, like “first-rate magazine pieces, but no better.” But in a really fine example of the form, like Joseph O’Neill’s Netherland, the social concerns emerge so organically from the story that it’s hard to tell where one leaves off and the other begins.

"History often had plans of its own..."

What’s funny, of course, is that genre novelists have been dealing with these issues for a long time, and literary fiction is only now taking up the challenge. Science fiction or fantasy, for instance, is invariably set in an unfamiliar world, the rules of which need to be conveyed seamlessly within the action, and one of the first problems any thoughtful writer confronts is how to establish this background in an unobtrusive way. It also affects historical fiction, or even suspense, which often takes place in a realm far removed from the reader’s experience. And the bad examples—in which the story grinds to a halt as the author explains the workings of interstellar travel or the political situation in his warring kingdoms—aren’t so different from the moments in which hysterical realism abandons its characters for a treatise on geopolitical trade. The difference is that it’s our own world that these novels are describing, as if the authors were alien journalists encountering it for the first time. That kind of fictional reportage can be valuable: at its best, it forces us to see the world around us with new eyes, or discloses patterns that have lurked there unseen. But literary fiction, which was able to stick to a narrowly focused register for so long, is still figuring out what the best genre novelists have been doing for decades.

So what does this have to do with Eternal Empire? Like many suspense novels, it devotes ample space to filling in background—on the British prison system, the security services, and the world of oligarchs and gangsters—that few readers could be expected to know firsthand. It also follows a template, established by the first two books in the series, of engaging with history and religion, which creates another level of story in which it has to dip from time to time. I devoted a lot of effort, possibly too much, to integrating those digressions in ways that seemed natural, and it wasn’t always easy. In Chapter 17, for instance, I include a page of material about the Khazars, the enigmatic tribe of Central Asian horsemen that disappeared shortly after their unprecedented conversion to Judaism. The Khazars aren’t essential to the story; they serve primarily as a kind of sustained analogy for Ilya’s inward journey, to a degree that isn’t clear until the end. I realized early on that it would be asking too much of the reader to deliver all of this material at once, so I carved it up into three or four shorter sections, each of which represented a self-contained stage, and inserted them at points where Ilya’s own thoughts or situation provided a natural transition. (They also serve, more practically, to create a pause in the action where such a delay seemed useful.) The result sometimes resembles the “squabble” that Wood sees in more literary novels. But the problem of moving between two worlds is one that most writers, like Ilya, will have to confront sooner or later…

Written by nevalalee

April 23, 2015 at 9:56 am

The act of noticing

leave a comment »

Jonathan Franzen

Yesterday, while playing with my daughter at the park, I found myself oddly fascinated by the sight of a landscaping crew that was taking down a tree across the street. It’s the kind of scene you encounter on a regular basis in suburbia, but I wound up watching with unusual attention, mostly because I didn’t have much else to do. (I wasn’t alone, either. Any kind of construction work amounts to the greatest show on earth for toddlers, and there ended up being a line of tiny spectators peering through the fence.) Maybe because I’ve been in a novelistic state of mind recently, I focused on details that I’d never noticed before. There’s the way a severed tree limb dangles from the end of the crane almost exactly like a hanged man, as Eco describes it in Foucault’s Pendulum, with its heavy base tracing a second, smaller circle in the air. I noted how a chainsaw in action sprays a fan of fine particles behind it, like a peacock’s tail. And when the woodchipper shoots chips into the back of the truck, a cloud of light golden dust forms above the container, like the soul of the tree ascending.

As I watched, I had the inevitable thought: I should put this into a story. Unfortunately, my current novel project doesn’t include a landscaping scene, and the easiest way to incorporate it would be through some kind of elaborate metaphor, as we often see, at its finest, in Proust. (“As he listened to her words, he found himself reminded of a landscaping crew he had once seen…”) But it made me reflect both on the act of noticing and on the role it plays, or doesn’t, in my own fiction. Most of the time, when I’m writing a story, I’m following the dictates of a carefully constructed plot, and I’ll find myself confronted by a building or a city scene that has imposed itself by necessity on the action: my characters end up at a hospital or a police station, and I strain to find a way of evoking it in a few economical lines that haven’t been written a million times before. Occasionally, this strikes me as a backward way of working. It would be better, it seems, to build the story around locations and situations that I already know I can describe—or which caught my attention in the way that landscaping crew did—rather than scrambling to push out something original under pressure.

Joseph O'Neill

In fact, that’s the way a lot of novelists work, particularly on the literary end. One of the striking trends in contemporary fiction is how so much of it doubles as reportage, with miniature New Yorker pieces buried like bonbons within the larger story. This isn’t exactly new: writers from Nabokov to Updike have filled their novels with set pieces that serve, in James Wood’s memorable phrase, as “propaganda on behalf of good noticing.” What sets more recent novels apart is how undigested some of it seems. At times, you can feel the narrative pausing for a page or two as the writer—invariably a talented one, or else these sections wouldn’t survive the editorial process—serves up a chunk of journalistic observation. As Norman Mailer writes, unkindly, of Jonathan Franzen:

Everything of novelistic use to him that came up on the Internet seems to have bypassed the higher reaches of his imagination—it is as if he offers us more human experience than he has literally mastered, and this is obvious when we come upon his set pieces on gourmet restaurants or giant cruise ships or modern Lithuania in disarray. Such sections read like first-rate magazine pieces, but no better—they stick to the surface.

This isn’t entirely fair to Franzen, a superb noticer who creates vivid characters even as he auditions for our admiration. But I thought of this again after finishing Joseph O’Neill’s Netherland this week. It’s a novel I’d wanted to read for years, and I enjoyed it a hell of a lot, while remaining conscious of its constant shifts into what amounts to nonfiction: beautifully written and reported essays on New York, London, the Hague, India, cricket, and just about everything else. It’s a gorgeous book, but it ends up feeling more like a collection of lovingly burnished parts than a cohesive whole, and its acts of noticing occasionally interfere with its ability to invent real interactions for its characters. It was Edmund Wilson, I think, who warned writers against mining their journals for material, and you can see why: it encourages a sort of novelistic bricolage rather than an organic discovery of the action, and the best approach lies somewhere in the middle. And there’s more than one way of telling a story. As I was studying the landscaping crew at the park, my daughter was engaged in a narrative of her own: she ran into her friend Elyse, played on the seesaw, and then had to leave abruptly for a diaper change. Or, as Beatrix put it, when I asked about her day: “Park. Elyse. Say hi. Seesaw. Poop. Go home.” And I don’t think I can do better than that.

Is storytelling kid’s stuff?

with 2 comments

John Tenniel's illustration of the Red Queen

Over the last few months, I’ve been spending a lot of time with my daughter at the main branch of the Oak Park Public Library. When you’re a full-time dad, you’re constantly in search of places where your child can romp happily for half an hour without continuous supervision, and our library fills that need admirably: it’s full of physical books, toys, activities, and new faces and friends, so I can grab a chair in the corner and take a richly deserved minute or two for myself while Beatrix goes exploring within my line of sight. Sometimes, when it looks like she’ll be staying put for a while, I’ll get up to browse the books on the shelves, both with an eye to my daughter’s reading and to my own. I’ll often pick up a title I remember and find myself lost in it all over again, and it’s a pleasure to discover that old favorites as different as The Way Things Work, The Eleventh Hour, and D’Aulaires’ Norse Myths have lost none of their fascination. There’s a considerable overlap between what kids and adults find interesting, and the best children’s books, like the best movies, can hold anyone’s attention.

I recently found myself thinking about this more intently, after discovering a shelf at the library that I’d somehow overlooked before. It’s a section devoted to classic literature for kids, and all of the usual suspects are here, from Anne of Green Gables to Alice’s Adventures in Wonderland—the latter of which is still the best children’s book ever written, and possibly, as Alan Perlis observed, the best book ever written about anything. But there were also many titles that weren’t originally written for younger readers but have been retroactively absorbed into the young adult canon. There was a generous selection of Dickens, for example, not far from Richmond Lattimore’s translation of the Iliad and the collected stories of Edgar Allan Poe, and the same process has already gone to work on J.R.R. Tolkien. Novels of an earlier era that were written by grownups for other grownups start to look like children’s books: neither The Last of the Mohicans nor Huckleberry Finn nor To Kill a Mockingbird was conceived as a work for young readers, but now we’re as likely to see them here as Laura Ingalls Wilder.

David Mitchell

There are a lot of possible explanations for this phenomenon, none of which are especially mysterious. Most of these books were four-quadrant novels in the first place: Dickens, like J.K. Rowling, was devoured by everyone at the time who could read. Many feature younger protagonists, so we naturally tend to classify them, rightly or wrongly, as children’s books, which also applies to stories, like the Greek myths, that contain elements of what look today like fantasy. And a lot of them are on school curricula. But there’s also a sense in which the novel, like any art form, advances in such a way as to make its most innovative early examples feel a bit naive, or like more primal forms of storytelling that appeal to readers who are still working their way into the medium. Plato says that if the mythical sculptor Daedalus were to appear and start making statues again, we’d all laugh at him, and something similar seems to take place within literature. As the opening paragraph of James Wood’s recent review of the new David Mitchell novel makes clear, critics have a way of regarding storytelling as somewhat suspicious: “The embrace of sheer occurrence, unburdened by deeper meaning.” It feels, in short, like kid’s stuff.

But it isn’t, not really, and it’s easy to invert the argument I’ve given above: the books that last long enough to be assimilated into children’s literature are the ones that offer universal narrative pleasures that have allowed them to survive. Don Quixote can be found in the children’s section, at least in its abridged form, but it’s also, as Harold Bloom says, “the most advanced work of prose fiction we have.” A bright kid wants to read Homer or Poe because of the virtues that make them appealing to everyone—and it’s worth noting that most libraries keep two sets of each on hand, one in the children’s section, the other for adults. Every generation produces reams of stories written specifically for children, and nearly all of them have gone out of print, leaving only those books that pursued story without regard for any particular audience. The slow creep of classic literature into the children’s library is only a mirror image of the more rapid incursion, which we’ve seen in recent years, of young adult literature into the hands of grownups, and I don’t think there’s any doubt as to which is the most positive trend. But they’re both reflections of the same principle. Storytelling breaks through all the categories we impose, and the real measure of value comes when we see what children are reading, on their own, a hundred years from now.

The weather men

with 2 comments

Elmore Leonard

“Never open a book with the weather,” Elmore Leonard said, and he was absolutely right. Still, the fact that he felt compelled to put this admonition at the top of his ten rules of writing testifies to the fact that there’s something about weather—and, more generally, the description of the environment in which a story takes place—that novice authors find irresistible. The weather, as we all know, is a classic topic for small talk because it affects all of us equally, and we can all be expected to take at least a passing interest in what kind of day it looks to be. Much the same impulse applies to describing the weather in fiction: it comes easily to mind when we’re sketching the outlines of a scene, it allows us to ease into the day’s work without much effort, and it feels, based on our memories of the other stories we’ve read, like the sort of thing that belongs somewhere at the beginning. But while it’s fine to use the weather or the landscape as an entry point into the story when you’re working on a first draft, in the rewrite, nearly all of it can be cut, especially when it occurs in a story or chapter’s crucial opening lines.

A description of the weather is a bad choice for the opening of a story for the same reason it comes so easily: it’s fundamentally impersonal. Unless the story is explicitly about man versus nature—and even then, you’re usually better off starting with the man—most good narratives center on human problems, and particularly on the choices made by the protagonist to meet a series of objectives. There’s nothing in the weather that applies specifically to any one individual: the rain falls on the just and the unjust, so you’re wasting valuable space with lines that convey no information to the reader. There’s a place, obviously, for atmosphere and scenic description, but it generally fits best at a point where the conflict and personalities have already been established. Like a television show that returns from a commercial break on a tight closeup of the lead, reserving the wide shot until after the scene is in motion, a good scenic description sets the stage only once the players have been introduced. As an opening, it’s the literary equivalent of small talk; it may be superficially painless, and it gets you safely to the other side of the first paragraph, but it’s hard to expect any reader to really care.

Gone With the Wind

Of course, there are times when the weather can be an active player in the narrative, and not just when the characters are set against it like King Lear in the storm. If you’re a writer, like Updike or Nabokov, given to what James Wood calls “propaganda on behalf of good noticing,” the weather can be just another subject on which you can exercise your gifts for description, although you’d better be sure before you begin that the result will reward this test of the reader’s patience. More subtly, the description of a character’s surroundings can be used to evoke an inner state or mood. Sometimes this skirts dangerously close to the pathetic fallacy, or the urge to attribute human emotion to impersonal forces of nature, but when embedded within a conventional first-person or limited third-person viewpoint, it makes perfect sense. When we’re absorbed in what we’re doing, we may not notice the weather at all; when we’re worried, nervous, or depressed, we naturally pick out aspects of our surroundings that remind us of our own feelings. When every detail is channeled through one character’s point of view, the sky can be a mirror of the self—although, again, this assumes that we’ve already been given a particular pair of eyes through which to see.

Even in narratives that are written more objectively, there’s room for description that grounds characters in environments that are secretly expressions of personality. The fantasy author Steve Rasnic Tem calls this dream characterization:

A particular theory of gestalt dream interpretation suggests that every object in a dream is a piece of the dreamer. A chair, a table, a car, another human being—each would represent some aspect of the dreamer…We might say that all other objects in the story—the landscape, the other characters, the supernatural presence, even the individual events—represent some aspect of the protagonist…Each piece suggests or tells us something about our main character. Far more, I suspect, than a delineation of traits and opinions ever could.

And there’s no question that the environment of a scene can influence our impressions. There’s a famous story about David Selznick trying to decide what the weather should be in the final scene of Gone With the Wind, after Rhett delivers his last line to Scarlett. If Rhett had left on a pleasant evening, the audience might assume that he would return one day; or, if he walked off into the rain, that he would never come back. In the final version, he disappears into a dense fog, which neatly splits the difference. Even the weather, then, has its uses. But it needs to flow from character and situation, rather than being imposed from above, if the reader is going to give a damn.

Written by nevalalee

June 10, 2014 at 9:46 am

What do you care what other people think?

with 2 comments

Immanuel Kant

We’re often told that we shouldn’t care about what other people think, but of course, we’re mindful of this all the time, and sometimes it leads to better behavior, in ways both large and small. When I’m noodling around on the ukulele, I find that my performance gets more focused when I imagine myself playing for an imaginary audience. Whenever I make an investment decision, I ask myself whether John Bogle—or, more accurately, the obsessively frugal index investors on the Bogleheads forum—would approve. More generally, when I stand back to look at my life, I often think about how it would seem to someone observing from the outside. I’m not sure who this hypothetical observer would be; perhaps, to take a page from Matthew McConaughey’s Oscar speech, it’s myself ten years from now. It’s a small thing, but I’d like to believe that it makes me slightly more civilized in my everyday actions. The existentialists believed that we should act as if what we did set the example for the rest of mankind, which only paraphrases what Kant said two centuries earlier: “Live your life as though your every act were to become a universal law.”

Of course, that’s an impossibly high standard to maintain, so it’s usually enough to think in terms of one person, living or dead, real or imaginary, whose approval we’d like to earn. In writing, this takes the form of an ideal reader to whom all of our work is addressed, and I suspect that nearly every writer does this, whether consciously or not. In some ways, there’s no more fundamental decision in a writer’s life than the question of what reader you’re trying to impress. It shapes the projects you tackle and the style you employ, and it even influences some of your larger life decisions, like whether you want to end up in Iowa or New York. In practice, you’ll find yourself writing with an eye to real individuals with an ability to directly influence the outcome: trusted readers, prospective agents, busy editors. Over time, though, our ideal reader starts to resemble a composite of all these people, or a version of a particular person in our lives who may never see the draft we’re working on now. Ideally, this hypothetical reader should be benevolent but also a little scary, and the standards he or she sets for us should be at least somewhat higher than the ones we’d be willing to settle for ourselves.

Zadie Smith

Sometimes, our imaginary reader is another author whose work we admire, which can set insurmountable standards of its own: if we’re constantly wondering, as the critic James Wood says somewhere, what Flaubert would think of the sentence we’re writing, most of us wouldn’t get past the first paragraph. More commonly, this voice is the product of the author’s own life story. In my own fiction, on the largest scale, I’m trying to live up to the standard that I set for myself when I was a child, back when nothing seemed more magical than the prospect of telling stories for a living. On a more granular level, I find that I’m often writing with an eye to the first writer who ever gave me useful feedback on a story. (I won’t mention him by name, but you can read more about him here.) Back when I was starting out, he read several of my stories and covered the pages with merciless notes and corrections, and although the process was draining, I’m convinced that it allowed me to get published five years earlier than I otherwise would have. One of the stories he read, “Inversus,” was my first sale to Analog, and I don’t think it would have sold at all in its unedited form—which might well have discouraged me from pursuing that audience at all.

As a result, whenever I go over a draft, I’m frequently asking myself what he would think. It forces me to be harder on myself than I otherwise would be: I’ll sometimes cross out entire pages and cut others to the bone, knowing that he’d react to what was currently there with a marginal question mark or even just a simple “No.” Of course, I’m really listening to my own inner voice, which has quietly taken on the qualities of the editors and readers I’ve come to respect. It’s a voice that is rightfully skeptical of everything it sees—as both Samuel Butler and Zadie Smith have pointed out, it’s a good habit to look over your work as if it were being read by an enemy—and I don’t think it would work nearly as well if I didn’t think of it as something external to me. I turn it off as much as I can during the first draft, but crank it up during the rewrite, when there’s no danger of fear or anxiety preventing me from at least finishing a manuscript. And although I try not to read published work with that voice, since there’s no changing what is already in print, I still sometimes sense it shaking its head when I go back to revisit a story, asking me: “Is that really what you wanted to say?”

Written by nevalalee

March 4, 2014 at 8:42 am

“Make it recognizable!”

leave a comment »

David Mamet

I’ve mentioned before how David Mamet’s little book On Directing Film rocked my world at a time when I thought I’d already figured out storytelling to my own satisfaction. It provides the best set of tools for constructing a plot I’ve ever seen, and to the extent that I can call any book a writer’s secret weapon, this is it. But I don’t think I’ve ever talked about the moment when I realized how powerful Mamet’s advice really is. The first section of the book is largely given over to a transcript of one of the author’s seminars at Columbia, in which the class breaks down the beats of a simple short film: a student approaches a teacher to request a revised grade. The crucial prop in the scene, which is told entirely without dialogue, is the student’s notebook, its contents unknown—and, as Mamet points out repeatedly, unimportant. Then he asks:

Mamet: What answer do we give to the prop person who says “what’s the notebook look like?” What are you going to say?

The students respond with a number of suggestions: put a label on it, make it look like a book report, make it look “prepared.” Mamet shoots them down one by one, saying that they’re things that the audience can’t be expected to care about, when they aren’t intrinsically impossible:

Mamet: No, you can’t make the book look prepared. You can make it look neat. That might be nice, but that’s not the most important thing for your answer to the prop person…To make it prepared, to make it neat, to make it convincing, the audience ain’t going to notice. What are they going to notice?
Student: That it’s the same book they’ve seen already.
Mamet: So what’s your answer to the prop person?
Student: Make it recognizable.
Mamet: Exactly so! Good. You’ve got to be able to recognize it. That is the most important thing about this report. This is how you use the principle of throughline to answer questions about the set and to answer questions about the costumes.

A recognizable notebook

Now, this might seem like a small thing, but to me, this was an unforgettable moment: it was a powerful illustration of how close attention to the spine of the plot—the actions and images you use to convey the protagonist’s sequence of objectives—can result in immediate, practical answers to seemingly minor story problems, as long as you’re willing to rigorously apply the rules. “Make it recognizable,” in particular, is a rule whose true value I’ve only recently begun to understand. In writing a story, regardless of the medium, you only have a finite number of details that you can emphasize, so it doesn’t hurt to focus on ones that will help the reader recognize and remember important elements—a character, a prop, an idea—when they recur over the course of the narrative. Mamet notes that you can’t expect a viewer to read signs or labels designed to explain what isn’t clear in the action, and it took me a long time to see that this is equally true of the building blocks of fiction: if the reader needs to pause to remember who a character is or where a certain object has appeared before, you haven’t done your job as well as you could.

And like the instructions a director gives to the prop department, this rule translates into specific, concrete actions that a writer can take to keep the reader oriented. It’s why I try to give my characters names that can be readily distinguished from one another, to the point where I’ll often try to give each major player a name that begins with a different letter. This isn’t true to life, where, as James Wood points out, we’re likely to know three people named John and three more named Elizabeth, but it’s a useful courtesy to the reader. The same applies to other entities within the story: it can be difficult to keep track of the alliances in a novel like Dune, but Frank Herbert helps us tremendously by giving the families distinctive names like House Atreides and House Harkonnen. (Try to guess which house contains all the bad guys.) This is also why it’s useful to give minor characters some small characteristic to lock them in the reader’s mind: we may not remember that we’ve met Robert in Chapter 3 when he returns in Chapter 50, but we’ll recall his bristling eyebrows. Nearly every choice a writer makes should be geared toward making these moments of recognition as painless as possible, without the need for labels. As Mamet says: “The audience doesn’t want to read a sign; they want to watch a motion picture.” And to be told a story.

Written by nevalalee

June 19, 2013 at 9:02 am

Facts with a side of violence

with 2 comments

Frederick Forsyth

Over the last few weeks, I’ve been rereading The Dogs of War by Frederick Forsyth, my favorite suspense novelist. I’ve mentioned before that Forsyth is basically as good as it gets, and that he’s the writer I turn to the most these days in terms of pure enjoyment: he operates within a very narrow range of material and tone, but on those terms, he always delivers. Reading The Dogs of War again was a fascinating experience, because although it takes place in the world of mercenaries and other guns for hire, it contains surprisingly little action—maybe thirty pages’ worth over the course of four hundred dense pages. The rest of the novel is taken up by an obsessively detailed account of how, precisely, a privately funded war might be financed and equipped, from obtaining weapons to hiring a ship to acquiring the necessary quantities of shirts and underwear. And although the amount of information is sometimes overwhelming, it’s always a superlatively readable book, if only because Forsyth is a master of organization and clarity.

Of course, it also works because it’s fun to learn about these things. The Dogs of War is perhaps the ultimate example of the kind of fiction that Anthony Lane, speaking of Allan Folsom’s The Day After Tomorrow, has dismissed as “not so much a novel as a six-hundred-page fact sheet with occasional breaks for violence.” Yet the pleasure we take in absorbing a few facts while reading a diverting thriller is perfectly understandable. Recently, I saw a posting on a social news site from a commenter who said that he didn’t read much, but was looking for novels that would teach him some things while telling an interesting story. I pointed him toward Michael Crichton, who is one of those novelists, like Forsyth, whose work has inspired countless imitators, but who remains the best of his breed. This kind of fiction is easy to dismiss, but conveying factual information to a reader is like any other aspect of writing: when done right, it can be a source of considerable satisfaction. In my own novels, I’ve indulged in such tidbits as how to build a handheld laser, how to open a Soviet weapons cache, and what exactly happened at the Dyatlov Pass.

Michael Crichton

That said, like all good things, the desire to satisfy a reader’s craving for information can also be taken too far. I’ve spoken elsewhere about the fiction of Irving Wallace, who crams his books with travelogues, dubious factoids, and masses of undigested research—along with a few clinical sex scenes—until whatever narrative interest the story once held is lost. And my feelings about Dan Brown are a matter of record. Here, as in most things, the key is balance: information can be a delight, but only in the context of a story that the reader finds engaging for the usual reasons. Its effectiveness can also vary within the work of a single author. Forsyth is great, but the weight of information in some of his later novels can be a little deadening; conversely, I’m not a fan of Tom Clancy, and gave up on The Cardinal of the Kremlin after struggling through a few hundred pages, but I found Without Remorse to be a really fine revenge story, hardware and all. The misuse of factual information by popular novelists has given it a bad reputation, but really, like any writing tool, it just needs to be properly deployed.

And it’s especially fascinating to see how this obsession with information—in a somewhat ambivalent form—has migrated into literary fiction. It’s hard to read Thomas Pynchon, for instance, without getting a kick from his mastery of everything from Tarot cards to aeronautical engineering, and James Wood points out that we see much the same urge in Jonathan Franzen:

The contemporary novel has such a desire to be clever about so many elements of life that it sometimes resembles a man who takes too many classes that he has no time to read: auditing abolishes composure. Of course, there are readers who will enjoy the fact that Franzen fills us in on campus politics, Lithuanian gangsters, biotech patents, the chemistry of depression, and so on…

Yet Franzen, like Pynchon, uses voluminous research to underline his point about how unknowable the world really is: if an author with the capacity to write limericks about the vane servomotor feels despair at the violent, impersonal systems of which we’re all a part, the rest of us don’t stand a chance. Popular novelists, by contrast, use information for the opposite reason, to flatter us that perhaps we, too, would make good mercenaries, if only we knew how to forge an end user certificate for a shipment of gun parts in Spain. In both cases, the underlying research gives the narrative a credibility it wouldn’t otherwise have. And the ability to use it correctly, according to one’s intentions, is one that every writer could stand to develop.

A fella smarter than myself

with 2 comments

Gene Hackman in Heist

I tried to imagine a fella smarter than myself. Then I tried to think, “What would he do?”

—David Mamet, Heist

Writers, by definition, are always trying to punch above their weight. When you sit down to write a novel for the first time, you’re almost comically inexperienced: however many books you’ve read or short stories you’ve written, you still don’t know the first thing about structuring—or even finishing—a long complicated narrative. Yet we all do it anyway. This is partly thanks to the irrational optimism that I’ve said elsewhere is a crucial element of any writer’s psychological makeup, in which we’re inclined to believe that we’re smarter and more prepared than we actually are. There’s nothing wrong with this; it’s the only way any of us will ever grow as writers, as we slowly evolve into the level of competence we’ve imagined for ourselves. Still, in any project, there always comes a time when a writer, however experienced, realizes that he’s taken on more than he can handle. The story is there, unwritten, and it’s beautiful in his head, but lost in the translation to the printed page. One day, he hopes, he’ll be good enough to realize it, but that doesn’t help him now. What he really needs is a way to temporarily become a better writer than he already is.

This may sound like witchcraft, but in reality, it’s something that writers do all the time. When we start out, we have no choice but to imitate the artists we admire, because when we set out to write that first page, we lack the experience of life and craft that only years of work can bring. Eventually, we move past imitation to find a voice and style of our own, but there are still times when we find ourselves compelled to channel the spirit of our betters. We do this when we start each day by reading a few pages from the work of a writer we like, or when we approach a tough moment in the plot by asking ourselves what Updike or Thomas Harris in his prime would do. Some of us go even further. In this week’s issue of The New Yorker, James Wood talks about a friend who became so obsessed by the work of the Norwegian writer Per Petterson that he copied out one of his novels word for word. This isn’t about stylistic plagiarism or slavish imitation, but a kind of sympathetic magic, a hope that we can conjure up the spirit of a more experienced writer just long enough to solve the problems in front of us.

Kevin Pollak in The Aristocrats

And the act of imitation itself can lead to surprising places. There’s a great deleted scene from the notorious documentary The Aristocrats in which Kevin Pollak delivers the titular joke in the style of Albert Brooks. After milking it for two delicious minutes, he takes a sip of coffee and says:

That’s the trippy thing about doing Brooks, though—I’m faster and funnier than I am as myself. It’s very, very sad. It’s a possession. I hate to do it because, literally, I’m listening to myself and thinking, “Why am I never this funny?”

I’m not a huge Kevin Pollak fan, but I love this clip, because it gets at something important and mysterious about the way artistic imitation works. Pollak is a skilled mimic who does a good, if not great, impression of Albert Brooks on all the superficial levels—his vocal tics, his tone, the way he holds his face and body. Somewhere along the line, though, these surface impressions work a deeper transformation, and he finds himself temporarily thinking like Brooks. This is why typing out the work of a writer we admire can be so helpful: there’s no better way of opening a window, even just for a crucial moment or two, into someone else’s brain.

The best kind of imitation, as Pollak says, is a possession, in which we will ourselves, almost unconsciously, into becoming better artists than we really are. Imitation can become dangerous, however, when we focus on the superficial without also channeling more fundamental habits of mind. This morning, while watching the new teaser trailer for Star Trek: Into Darkness, which clearly takes many of its cues from the recent films of Christopher Nolan, I was amused by the thought that while Nolan has done more than any contemporary director to push the envelope of visual and narrative complexity in mainstream movies, the big takeaway for other filmmakers—or at least those who assemble the trailers—has apparently been a “BWONG” sound effect. But big influences can arise from small beginnings. The qualities that most deserve imitation in the artists we admire have little to do with the obvious trademarks of their style, and if we imitate those aspects alone, we’re just being derivative. But sometimes it’s those little things that allow us to temporarily acquire the mindset of smarter artists than ourselves, until, finally, we’ve made it our own.

Written by nevalalee

December 6, 2012 at 10:12 am

“Passing through the main door of the house…”

leave a comment »

(Note: This post is the twentieth installment in my author’s commentary for The Icon Thief, covering Chapter 19. You can read the earlier installments here.)

One of the curious things about being a novelist is that you’re expected to construct elegant plots, create plausible characters, raise complex ethical and philosophical questions…and you’re also supposed to describe the drapes. Obviously, there’s a wide range of permissible levels of description, but if you’re operating, as James Wood notes, within the rubric of modernist realism—of which most mainstream suspense fiction is a very specific subset—you need to put a fair amount of thought into conveying the look and feel of locations, backgrounds, and everyday objects. And this is more than just window-dressing. Even the most mundane kinds of description are vital for creating atmosphere and sustaining the fictional dream, and like the use of technology in suspense fiction, it serves as a kind of synecdoche for the credibility of the rest of the novel. If a novelist lavishes the right amount of care on describing the ordinary, we’re more likely to trust him when he shows us something a little more farfetched.

This is why our best novelists tend to be great describers, even if that isn’t why we read them in the first place. There’s no particular reason why Proust should simultaneously be a perceptive art critic, our greatest chronicler of sexual jealousy, and also capable of describing how milk looks just before it’s about to boil over, but these skills all come from the same place. Part of the fun of reading a novelist like John Updike comes from his consistent ingenuity of description, as in his famous, and early, description of the rain on a window:

Its panes were strewn with drops that as if by amoebic decision would abruptly merge and break and jerkily run downward, and the window screen, like a sampler half-stitched, or a crossword puzzle invisibly solved, was inlaid erratically with minute, translucent tesserae of rain.

Updike is obviously showing off here, and Wood has called this sort of thing, with reference to Nabokov, “propaganda on behalf of good noticing.” At its worst, Updike’s fiction is nothing but noticing; at its best, it makes his characters and their situations all the more real, even if we sometimes doubt that they’d notice all the things that Updike makes them see. And although the thriller is more constrained when it comes to description—it would be hard to insert the above sentence into any mainstream suspense novel without seeming self-indulgent—it’s still a crucial part of the writer’s art.

In Chapter 19 of The Icon Thief, I was faced with a peculiar descriptive problem. I had to show my characters at an opulent party at an estate in the Hamptons, as well as in the mansion itself, and I had no choice but to describe it in detail. This is in many ways the high point of the novel, in which the action of the second half of the book is largely determined, and it deserved to be treated at full length. The luxurious surroundings aren’t just lifestyle porn, either, but an expression of an important character, the oligarch Anzor Archvadze, and a hint at his underlying personality. The location provides an important grounding for my central characters, Maddy and Ilya, who find themselves in the role of interlopers in this impressive setting, albeit in very different ways. Most of all, as a writer, I saw it as a delicious challenge. When my agent mentioned the parties in The Great Gatsby as one possible model, I could hardly back down. As a result, although I tried to keep the novel’s prose under tight control in most other places, this is the section of the book in which I realized that it might be necessary to describe the drapes, as well as much else besides.

The best solution, clearly, was to arrange for an invitation to one of these parties myself, but this didn’t seem like a realistic option. (I have been to my share of extravagant corporate parties, including one at the Temple of Dendur in the Metropolitan Museum of Art in New York, so I was able to draw upon those memories.) Instead, I found myself doing the next best thing. I went out to Southampton and peeked over hedges. I spent a lot of time reading memoirs of the Hamptons social scene, as well as browsing through interior design profiles and local gossip and real estate magazines, a large stack of which is available at every Hamptons newsstand. I even read a few trashy novels, like Candace Bushnell’s Trading Up, in hopes of picking up a detail or two. In retrospect, I’m glad my writing schedule here was relatively relaxed compared to my timeline for the two following novels, which allowed me to spend more time on this process than I normally could. And the resulting scene seems to work fine. When Maddy enters Archvadze’s mansion, in search of an impressive art collection, and instead sees a Jack Vettriano painting hanging above the mantelpiece, it’s a gag that works only because of the time I’ve invested in setting up every other detail of that scene. And we’re going to be spending a lot of time in this mansion…

Written by nevalalee

October 4, 2012 at 9:27 am

Better late than never: Tinker, Tailor, Soldier, Spy

leave a comment »

It’s taken me a long time to get around to le Carré. As I noted in my review of the recent movie adaptation of Tinker, Tailor, Soldier, Spy, my interest in his great subject—the psychology and culture of spycraft—has always been limited at best, so his books can seem forbiddingly hermetic to a reader like me. A writer like Frederick Forsyth, whom I admire enormously, does a nice job of balancing esoteric detail with narrative thrills, while le Carré, although he’s an ingenious plotter, deliberately holds back from the release of action for its own sake. The difference, perhaps, is that Forsyth was a journalist, while le Carré worked in intelligence himself, which accounts for much of the contrast in their work—one is a great explainer and popularizer, so that his books read like a men’s adventure novel and intelligence briefing rolled into one, while the other is all implication. As a result, while I’ve devoured most of Forsyth’s novels, I’ve tried and failed to get into le Carré more than once, and it’s only recently that I decided to remedy this situation once and for all.

Because there’s an important point to be made about le Carré’s reticence, which is that it ultimately feels more convincing, and lives more intriguingly in the imagination, than the paragraph-level thrills of other books. In interviews, le Carré has noted that many of the terms of spycraft that fill his novels were his own invention, and weren’t actually used within MI6. This hardly matters, because a reader encountering this language for the first time—the lamplighters, the scalphunters, the janitors—has no doubt that this world is authentic. Forsyth, by contrast, stuffs his books with detail, nearly all of it compelling, but always with the sense that much of this information comes secondhand: we applaud the research, but don’t quite believe in the world. With le Carré, we feel as though we’re being ushered into a real place, sometimes tedious, often opaque, with major players glimpsed only in passing. And even if he’s inventing most of it, it’s still utterly persuasive.

This is the great strength of Tinker, Tailor, Soldier, Spy, which I finished reading this week. Le Carré is the strongest stylist in suspense fiction, and this book is a master class in the slow accumulation of detail and atmosphere. Sometimes we aren’t quite sure what is taking place, either because of the language of spycraft or the density of Britishisms—”a lonely queer in a trilby exercising his Sealyham”—but there’s never any break in the fictional dream. It’s a book that demands sustained engagement, that resolutely refuses to spell out its conclusions, and that always leaves us scrambling to catch up with the unassuming but formidable Smiley. In this respect, Tomas Alfredson’s movie is an inspired adaptation: it visualizes a few moments that the novel leaves offstage, but for the most part, it leaves us to swim for ourselves in le Carré’s ocean of names, dates, and faces. (I haven’t seen the classic Alec Guinness version, which I’m saving for when the details of the plot have faded.)

And yet the overall impact is somewhat unsatisfying. Tinker, Tailor is a brilliantly written and constructed novel, but it’s an intellectual experience, not a visceral one. By the end of the book, we’ve come to know Smiley and a handful of others, but the rest are left enigmatic by design, so that the book’s key moment—the revelation of the mole’s identity—feels almost like an afterthought, with no real sense of pain or of betrayal. (The film has many of the same issues, and as I’ve noted before, it gives the game away with some injudicious casting.) This isn’t a flaw, precisely: it’s totally consistent with the book’s tone, which distrusts outbursts of emotion and buries feeling as deep as possible. That air of reserve can be fascinating, but it also leads to what James Wood, for somewhat different reasons, calls le Carré’s “clever coffin”—a narrowness of tone that limits the range of feeling that the work can express, which is often true of even the best suspense fiction. Le Carré’s talent is so great that it inadvertently exposes the limitations of the entire genre, and it’s a problem that we’re all still trying to solve.

Written by nevalalee

September 7, 2012 at 9:21 am

Are authors really too nice?

leave a comment »

Like it or not, authors have to live with other authors. Some may prefer otherwise, and do their best to keep their distance, but most of us end up spending a fair amount of time—in person, in print, and online—interacting with our fellow writers. You can chalk it up to camaraderie, careerism, or the simple sense that there’s no one else with whom we can talk about the things that matter most to us, as well as the knowledge that, for better or worse, we’re going to be collaborating and competing with these people for a long time. As a result, most of us generally avoid criticizing one another’s work, at least in public. Which isn’t to say that writers aren’t neurotic, needy, petty people—most of us certainly are. But while we may secretly begrudge a friend’s success or agree that this year’s big book is a big bore, we generally keep these opinions to ourselves or share them only in private. As a result, only a handful of major novelists—Updike, Vidal, maybe a few others—have also been major critics. It isn’t for lack of intelligence; it’s more out of prudence or caution.

That’s why I don’t agree with Dwight Garner’s recent assertion that Twitter has somehow made writers less willing to criticize one another in public. Most writers have long since concluded, and rightly so, that it isn’t worth the headache. At best, we tend to reserve our critical arrows for those unlikely to be hurt by what we say, or even to read it at all, which is the real reason why the dead, the famous, and the canonized are such tempting targets. But when it comes to writers on our own level, there’s little to gain and much to lose by criticizing them in print. This isn’t omerta, or a gentlemen’s agreement, but a modus vivendi that avoids problems down the line. Even Norman Mailer, no stranger to conflict, came to the same conclusion. Fifty years ago, in his essay “Some Children of the Goddess,” he took potshots at contemporaries like Styron, Salinger, and Roth, and some never forgave him for it. From then on, he avoided criticizing his peers, or lobbed his missiles at more resilient targets like Tom Wolfe. And if Mailer, of all people, decided that being a critic was more trouble than it was worth, I can’t blame other writers for concluding the same thing.

And yet it’s also a genuine loss. Dave Eggers isn’t wrong when he advises us not to criticize a novel until we’ve written one, or a movie until we’ve made one. There’s no question that we’d avoid a lot of the nonsense written about movies and books—like the idea, for instance, that a director is the sole author of a film, despite all evidence to the contrary—if more criticism were written by people with experience in the creative field in question. As someone who has done a bit of freelancing myself, I can say that while critics can be driven by ambitions and impulses of their own, these are qualitatively different from the process that underlies the creation of any extended, original work of art. Ideally, then, a literary critic would know something about how a novel is put together, with all the compromises, accidents, and bear traps involved—and there’s no one more qualified to do this than working novelists themselves. But for all the reasons I’ve listed above, there are good reasons why most writers prefer to keep out of it, especially when it comes to the contemporaries about whom they know the most.

In short, the people best equipped to write intelligently about contemporary literature—the writers themselves—have more than enough reason to stand down, and it isn’t necessarily realistic or fair to expect otherwise. Consequently, our best literary critics have often been those with some experience of creative work who have since thrown in their lot on the critical side, which is how we end up with valuable voices like Edmund Wilson or James Wood, who have written novels of their own but found their true calling elsewhere. This isn’t a perfect solution, but it’s a pretty good one, and I’d much rather be reviewed by a critic who at least knew what writing a publishable novel was like. In the end, though, this will always be an issue for literary criticism, which differs from all other fields in that critics and their subjects use the same tools and draw on the same pool of talent. It makes objectivity, bravery, and expertise in a critic all the more precious. And if you want to know what a writer really thinks of his peers—well, just corner him at a party, and believe me, you’ll get an earful.

Written by nevalalee

August 24, 2012 at 9:50 am

Criticizing the critical critic

with 4 comments

Last week, Dwight Garner of the New York Times—arguably one of the two or three most famous literary critics now at work, along with his colleague Michiko Kakutani and The New Yorker‘s James Wood—wrote a long opinion piece titled “A Critic’s Case for Critics Who Are Actually Critical.” In it, he decries what he sees as the decline of serious criticism, as well as the hostility toward the role of critics themselves, who are seen, at least by authors, as negative, dismissive, and cruel. To illustrate this view, he quotes a decade-old interview with Dave Eggers, who says:

Do not dismiss a book until you have written one, and do not dismiss a movie until you have made one, and do not dismiss a person until you have met them. It is a fuckload of work to be open-minded and generous and understanding and forgiving and accepting, but Christ, that is what matters. What matters is saying yes.

(Incidentally, Eggers conducted this interview with my old college literary magazine, whose fiction board I joined a few months later. Garner doesn’t quote the interview’s last few lines, which, if I recall correctly, became something of a running joke around the Advocate building for years afterward: “And if anyone wants to hurt me for that, or dismiss me for saying that, for saying yes, I say Oh do it, do it you motherfuckers, finally, finally, finally.”)

Well, Garner finally, finally, finally goes after Eggers, a writer he says he admires, saying that he “deplores” the stance expressed above: “The sad truth about the book world,” Garner writes, “is that it doesn’t need more yes-saying novelists and certainly no more yes-saying critics. We are drowning in them.” What the world really needs, he argues, are uncompromising critics who are willing to honestly engage with works of art, both good and bad, and to be harsh when the situation requires it. He says that the best work of critics like Pauline Kael “is more valuable—and more stimulating—than all but the most first-rate novels.” He points out that any writer who consents for his or her novel to be published tacitly agrees to allow critics to review it however they like. And he bemoans the fact that social media has made it hard for critics to be as honest and hard as they should be. Twitter, he says, has degenerated into a mutual lovefest between authors, and doesn’t allow for anything like real criticism: “On it, negative words have the same effect as a bat flying into a bridal shower.”

The trouble with Garner’s argument, aside from its quixotic attempt to persuade authors to feel kindly toward critics, is that I don’t think it’s factually correct. Garner quotes Jonah Peretti’s observation that “Twitter is a simple service used by smart people,” which isn’t true at all—Twitter, for better or worse, is used by all kinds of people, and when we venture out of our own carefully cultivated circles, we’re treated to the sight of humanity in its purest form, including people who didn’t realize the Titanic was real. The same goes for the comments section of any news or opinion site, which is generally a swamp of negativity. The trouble with social media isn’t that it encourages people to be uncritically positive or negative: it’s that it encourages unconsidered discourse of all kinds. Twitter, by design, isn’t a place for reasoned commentary; at its best, it’s more like a vehicle for small talk. And we shouldn’t judge it by the same standards that we use for other forms of criticism, any more than we should judge guests at a cocktail party for not saying what they really feel about the people around them. That’s also why attempts at criticism on Twitter tend to look uglier than the author may have intended—it’s the nature of the form.

And when we’re dealing with the choice, admittedly not a great one, between uncritical positivity and negativity, I’d have to say that the former is the lesser of two evils. That’s what Eggers is saying in the interview quoted above: he isn’t proposing, as Garner would have it, “mass intellectual suicide,” but an extreme solution to what he rightly sees as an extreme problem, which is the ease with which we can fall back into dismissive snark—a point he made long before “snark” had even attained its current meaning. It’s best, of course, to make nuanced, perceptive, complex arguments, but if we don’t have the time for it—and being a good critic takes time—then it’s marginally better, at least for our own souls, to be enthusiastic bores. I’ve argued before, and I still believe, that every worthwhile critic builds his or her work on a foundation of genuine enthusiasm for the art in question. Hard intellectual engagement comes later, as a sort of refinement of joy, and when it doesn’t, that’s the worst kind of intellectual suicide, which disguises itself as its opposite. Dwight Garner is a really good critic. But to get where Garner is now, you need to pass through Eggers first.

Written by nevalalee

August 23, 2012 at 10:13 am
