Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘The New York Times’

The year of magical thinking


Elon Musk at Trump Tower

Maybe if I’m part of that mob, I can help steer it in wise directions.

—Homer Simpson, “Whacking Day”

Yesterday, Tesla founder Elon Musk defended his decision to remain on President Trump’s economic advisory council, stating on Twitter: “My goals are to accelerate the world’s transition to sustainable energy and to help make humanity a multi-planet civilization.” A few weeks earlier, Peter Thiel, another member of the PayPal mafia and one of Trump’s most prominent defenders, said obscurely to the New York Times: “Even if there are aspects of Trump that are retro and that seem to be going back to the past, I think a lot of people want to go back to a past that was futuristic—The Jetsons, Star Trek. They’re dated but futuristic.” Musk and Thiel both tend to speak using the language of science fiction, in part because it’s the idiom that they know best. Musk includes Asimov’s Foundation series among his favorite books, and he’s a recipient of the Heinlein Prize for accomplishments in commercial space activities. Thiel is a major voice in the transhumanist movement, and he’s underwritten so much research into seasteading that I’m indebted to him for practically all the technical background of my novella “The Proving Ground.” As Thiel said to The New Yorker several years ago, in words that have a somewhat different ring today:

One way you can describe the collapse of the idea of the future is the collapse of science fiction. Now it’s either about technology that doesn’t work or about technology that’s used in bad ways. The anthology of the top twenty-five sci-fi stories in 1970 was, like, “Me and my friend the robot went for a walk on the moon,” and in 2008 it was, like, “The galaxy is run by a fundamentalist Islamic confederacy, and there are people who are hunting planets and killing them for fun.”

Despite their shared origins at PayPal, Musk and Thiel aren’t exactly equivalent here: Musk has been open about his misgivings toward Trump’s policy on refugees, while Thiel, who seems to have little choice but to double down, had a spokesperson issue the bland statement: “Peter doesn’t support a religious test, and the administration has not imposed one.” Yet it’s still striking to see two of our most visible futurists staking their legacies on a relationship with Trump, even if they’re coming at it from different angles. As far as Musk is concerned, I don’t agree with his reasoning, but I understand it. His decision to serve in an advisory capacity to Trump seems to come down to his relative weighting of two factors, which aren’t mutually exclusive, but are at least inversely proportional. The first is the possibility that his presence will allow him to give advice that will affect policy decisions to some incremental but nontrivial extent. It’s better, this argument runs, to provide a reasonable voice than to allow Trump to be surrounded by nothing but manipulative Wormtongues. The second possibility is that his involvement with the administration will somehow legitimize or enable its policies, and that this risk far exceeds his slight chance of influencing the outcome. It’s a judgment call, and you can assign whatever values you like to those two scenarios. Musk has clearly thought long and hard about it. But I’ll just say that if it turns out that there’s even the tiniest chance that an occasional meeting with Musk—who will be sharing the table with eighteen others—could possibly outweigh the constant presence of Steve Bannon, a Republican congressional majority, and millions of angry constituents in any meaningful way, I’ll eat my copy of the Foundation trilogy.

Donald Trump and Peter Thiel

Musk’s belief that his presence on the advisory council might have an impact on a president who has zero incentive to appeal to anyone but his own supporters is a form of magical thinking. In a way, though, I’m not surprised, and it’s possible that everything I admire in Musk is inseparable from the delusion that underlies this decision. Whatever you might think of them personally, Musk and Thiel are undoubtedly imaginative. In his New Yorker profile, Thiel blamed many of this country’s problems on “a failure of imagination,” and his nostalgia for vintage science fiction is rooted in a longing for the grand gestures that it embodied: the flying car, the seastead, the space colony. Achieving such goals requires not only vision, but a kind of childlike stubbornness that chases a vanishingly small chance of success in the face of all evidence to the contrary. What makes Musk and Thiel so fascinating is their shared determination to take a fortune built on something as prosaic as an online payments system and to turn it into a spaceship. So far, Musk has been much more successful at translating his dreams into reality, and Thiel’s greatest triumph to date has been the destruction of Gawker Media. But they’ve both seen their gambles pay off to an extent that might mislead them about their ability to make it happen again. It’s this sort of indispensable naïveté that underlies Musk’s faith in his ability to nudge Trump in the right direction, and, on a more sinister level, Thiel’s eagerness to convince us to sign up for a grand experiment with high volatility in both directions—even if most of us don’t have the option of fleeing to New Zealand if it all goes up in flames.

This willingness to submit involuntary test subjects to a hazardous cultural project isn’t unique to science fiction fans. It’s the same attitude that led Norman Mailer, when asked about his support of the killer Jack Henry Abbott, to state: “I’m willing to gamble with a portion of society to save this man’s talent. I am saying that culture is worth a little risk.” (And it’s worth remembering that the man whom Abbott stabbed to death, Richard Adan, was the son of Cuban immigrants.) But when Thiel advised us before the election not to take Trump “literally,” it felt like a symptom of the suspension of disbelief that both science fiction writers and startup founders have to cultivate:

I think a lot of the voters who vote for Trump take Trump seriously but not literally. And so when they hear things like the Muslim comment or the wall comment or things like that, the question is not “Are you going to build a wall like the Great Wall of China?” or, you know, “How exactly are you going to enforce these tests?” What they hear is “We’re going to have a saner, more sensible immigration policy.”

We’ll see how that works out. But in the meantime, the analogy to L. Ron Hubbard is a useful one. Plenty of science fiction writers, including John W. Campbell, A.E. van Vogt, and Theodore Sturgeon, were persuaded by dianetics, in part because it struck them as a risky idea with an unlimited upside. Yet whatever psychological benefits dianetics provided—and it probably wasn’t any less effective than many forms of talk therapy—were far outweighed by the damage that Hubbard and his followers inflicted. It might help to mentally replace the name “Trump” with “Hubbard” whenever an ethical choice needs to be made. What would it mean to take Hubbard “seriously, but not literally”? And if Hubbard asked you to join his board of advisors, would it seem likely that you could have a positive influence, even if it meant adding your name to the advisory council of the Church of Scientology? Or would it make more sense to invest the same energy into helping those whose lives the church was destroying?

The grand projects


The Lisle Letters

Thirty-five years ago, on October 18, 1981, the New York Times published a long article by the critic D.J.R. Bruckner. Titled “The Grand Projects,” it was a survey of what Bruckner called “the big books or projects that need decades to finish,” and which only a handful of academic publishers in the country are equipped to see from beginning to end. I first came across it in a photocopy tucked into the first volume of one of the books that it mentions, The Plan of St. Gall, the enormous study of monastic life that I bought a few years ago after dreaming about it for decades. At the time, I was just starting to collect rare and unusual books for their own sake, and I found myself using Bruckner’s article—which I recently discovered was the first piece that he ever published for the Times—as a kind of map of the territory. I purchased a copy of Howard Adelmann’s massive Marcello Malpighi and the Evolution of Embryology mostly because Bruckner said: “Go to a library and see it one day; it is wonderful just to look at.” And last week, as a treat for myself after a rough month, I finally got my hands on the six volumes of Muriel St. Clare Byrne’s The Lisle Letters, which Bruckner mentions alongside The Plan of St. Gall as one of the great triumphs of the university press. For the moment, I have everything on my list, although I suppose that Costa Rican Natural History by Daniel Janzen is beckoning from the wings.

But I’ve also found that my motives for collecting these books have changed—or at least they’ve undergone a subtle shift of emphasis. I was initially drawn to these beautiful sets, frankly, for aesthetic reasons. As the product of years or decades of collaborative work, they’re invariably gorgeous in design, typography, printing, and construction. These are books that are meant to last forever. I don’t have as much time to read for my own pleasure as I once did, so I’ve begun to treasure what I’ve elsewhere called tomes, or books so large that their unread pages feel comforting, rather than accusatory. It’s unlikely that I’ll ever have the chance to work through Marcello Malpighi from the first folio page to the last, but I’m happy just to be living in the same house with it. When I’m honest with myself, I acknowledge that it has something to do with a middlebrow fondness for how those uniform sets look when lined up on my bookshelves: it’s the same impulse that led me to pick up books as different as William T. Vollmann’s Rising Up and Rising Down and the sixteen volumes of Richard Francis Burton’s translation of The Arabian Nights. At some point, it amounts to buying books as furniture. I can’t totally defend myself from this charge, except by saying that the pleasure that they give me is one that encompasses all the senses. I like to look at them, but also to handle them, leaf through them, and sometimes even smell them. And I’ll occasionally even read them for an hour.

Marcello Malpighi and the Evolution of Embryology

Over the last year or so, however, I’ve begun to see them in another light. Now they represent an investment of time, which is invisible, but no less vast than the amount of space that they physically occupy. (You could even say that the resulting book is a projection, in three-dimensional space, of the temporal process that produced it. A big book is invariably the product of a big life.) The undisputed champion here has to be The Lisle Letters, which was the end result of fifty years of work by Muriel St. Clare Byrne. She was in her thirties when she began the project, and it was published on her eighty-sixth birthday. It’s an edited and wonderfully annotated selection of the correspondence of Arthur Plantagenet, 1st Viscount Lisle, the illegitimate son of Edward IV. The surviving letters, which encompass one of the most eventful periods in Tudor history, were an important source for the novelist Hilary Mantel in the writing of Wolf Hall. Like most of the tomes that I love, it uses its narrow subject as an entry point into a much larger era, and I especially like Byrne’s explanation of why these particular letters are so useful. Lisle wasn’t even in England for most of it—he was Lord Deputy of Calais, on the northern coast of France. Yet he still had to manage his affairs back home, mostly through letters, which means that the correspondence preserves countless details of daily life that otherwise wouldn’t have been committed to writing. The letters had long been known to historians, but no one had ever gone through them systematically and considered them as a whole. Byrne saw that somebody had to do it, and she did. And it only took her five decades.

It’s the time and effort involved that fascinates me now, even more than the tangible pleasures of the books themselves. In some ways, these are just different aspects of the same thing: the academic presses, which can afford to break even or even lose money on monumental projects, can provide scholars with the time they need, and they can publish works intended for only a few thousand readers with the resources they deserve. Occasionally, you see the same impulse in mainstream publishing: Robert Caro’s biography of Lyndon Johnson sometimes seems less like a commercial enterprise than a public service. (When asked in that wonderful profile by Charles McGrath if Caro’s books were profitable, Sonny Mehta, the head of Knopf, paused and said: “They will be, because there is nothing like them.”) In the end, Caro will have spent as much time on Johnson as Byrne did on Lisle, and the fact that he did it outside the university system is equally remarkable. It’s no accident, of course, that I’ve begun to think in these terms after embarking on a big nonfiction project of my own. Astounding can’t compare to any of these books in size: it’s supposed to appeal to a wide audience, and there are certain constraints in length that are written right into the contract. I don’t have decades to write it, either. When all is said and done, I’ll probably end up devoting three years to it, which isn’t trivial, but it isn’t a lifetime. But I keep these books around to remind me of the devotion and obsessiveness that such projects require. We desperately need authors and publishers like this. And whenever I feel overwhelmed by the work that lies ahead, I just have to ask myself what Caro—or Muriel St. Clare Byrne—would do.

The science fiction election


Donald Trump

On July 18, 2015, Nate Cohn of The New York Times published a blog post titled “The Trump Campaign’s Turning Point.” Here are the first three paragraphs, which I suspect Cohn himself might prefer we forget:

Donald Trump’s surge in the polls has followed the classic pattern of a media-driven surge. Now it will most likely follow the classic pattern of a party-backed decline.

Mr. Trump’s candidacy probably reached an inflection point on Saturday after he essentially criticized John McCain for being captured during the Vietnam War. Republican campaigns and elites quickly moved to condemn his comments—a shift that will probably mark the moment when Trump’s candidacy went from boom to bust.

His support will erode as the tone of coverage shifts from publicizing his anti-establishment and anti-immigration views, which have some resonance in the party, to reflecting the chorus of Republican criticism of his most outrageous comments and the more liberal elements of his record.

Needless to say, Cohn was slightly off here, and he recently wrote a long mea culpa that attempted to explain why he got it so wrong. But I remember being surprised by the tone of the post even at the time. Statements like “Mr. Trump’s candidacy probably reached an inflection point” and “his support will erode as the tone of coverage shifts” seemed weirdly overconfident in advance of any hard numbers, particularly for a blog that was openly designed to mimic the data-driven approach pioneered by Nate Silver. The phrase “inflection point,” in particular, seemed odd: Cohn was describing a graph that didn’t exist, as if the curve were already before his eyes. But I also understand the impulse. The desire to predict the future is central to all political coverage, even if it’s unstated: the twists and turns of a campaign inevitably come down to the outcome of a few binary moments, and whenever a journalist reports on an incident, the implication is that it matters in ways that will translate into real votes—otherwise, why bother? Unfortunately, amid the noise of the primary and general elections, it can be hard to figure out which events are truly significant. If there’s one thing that we’ve learned this year, it’s that the issues, controversies, and personality traits that the media thought would have an impact ended up not mattering much at all. But the fact that so many journalists—who have a huge incentive to at least appear to be right—were so mistaken about Trump won’t stop them from continuing to make predictions. It certainly hasn’t so far.

The Simpsons episode "Citizen Kang"

Yet there’s another, equally strong inclination that serves, at least in theory, to counterbalance the incentive to predict: the need to create a compelling narrative. A few days ago, David Roberts of Vox wrote a depressing but, I think, fundamentally accurate piece on how media coverage of the upcoming election is likely to unfold. Here are his main points:

There will be a push to lift Donald Trump up and bring Hillary Clinton down, until they are at least something approximating two equivalent choices. It’s not a conspiracy; it won’t be coordinated. It doesn’t need to be. It’s just a process of institutions, centers of power and influence, responding to the incentive structure that’s evolved around them. The U.S. political ecosystem needs this election to be competitive…

The campaign press requires, for its ongoing health and advertising revenue, a real race. It needs controversies. “Donald Trump is not fit to be president” may be the accurate answer to pretty much every relevant question about the race, but it’s not an interesting answer. It’s too final, too settled. No one wants to click on it.

I think he’s right, and that there’s going to be significant pressure in the media to turn this election into a case of Kang vs. Kodos. But it’s also worth pointing out that the two impulses we’re discussing here—to predict the future with apparent accuracy and to create a narrative of equivalency where none exists—are fundamentally incompatible. So how can a journalist who needs to crank out a story on a daily basis for the next six months possibly manage to do both?

As it happens, there’s a literary genre that depends on writers being able to navigate that very contradiction. Science fiction has always prided itself on its predictive abilities, and with as little justification as most political pundits: when it’s right about the future, it’s usually by accident. But the need to seem prescient is still there, if only as a narrative strategy. The genre also needs to tell engaging stories, however, and you’ll often find cases in which one impulse, excuse me, trumps the other, as Jack Williamson notes in an observation that I never tire of quoting:

The average author is more stage magician, a creator of convincing illusions, than scientist or serious prophet. In practice, once you’re into the process of actually writing a work of fiction, the story itself gets to be more important than futurology. You become more involved in following the fictional logic you’ve invented for your characters, the atmosphere, the rush of action; meanwhile, developing real possibilities recedes. You may find yourself even opting for the least probable event rather than the most probable, simply because you want the unexpected.

Replace a few of the relevant nouns, and this is as good a description of political journalism as any I’ve ever seen. In both cases, writers feel obliged to cobble together an implausible but exciting narrative with what seems like predictive accuracy. It’s something that both readers and voters ought to keep in mind over the next year. Because if you think that this election already seems like science fiction, you’re even more right than you know.

Written by nevalalee

May 10, 2016 at 8:40 am

The watchful protectors


Ben Affleck in Batman V. Superman: Dawn Of Justice

In the foreword to his new book Better Living Through Criticism, the critic A.O. Scott imagines a conversation with a hypothetical interlocutor who asks: “Would it be accurate to say that you wrote this whole book to settle a score with Samuel L. Jackson?” “Not exactly,” Scott replies. The story, in case you’ve forgotten, is that after reading Scott’s negative review of The Avengers, Jackson tweeted that it was time to find the New York Times critic a job “he can actually do.” As Scott recounts:

Scores of his followers heeded his call, not by demanding that my editors fire me but, in the best Twitter tradition, by retweeting Jackson’s outburst and adding their own vivid suggestions about what I was qualified to do with myself. The more coherent tweets expressed familiar, you might even say canonical, anticritical sentiments: that I had no capacity for joy; that I wanted to ruin everyone else’s fun; that I was a hater, a square, and a snob; even—and this was kind of a new one—that the nerdy kid in middle school who everybody picked on because he didn’t like comic books had grown up to be me.

Before long, it all blew over, although not before briefly turning Scott into “both a hissable villain and a make-believe martyr for a noble and much-maligned cause.” And while he says that he didn’t write his book solely as a rebuttal to Jackson, he implies that the kerfuffle raised a valuable question: what, exactly, is the function of a critic these days?

It’s an issue that seems worth revisiting after this weekend, when a movie openly inspired by the success of The Avengers rode a tide of fan excitement to a record opening, despite a significantly less positive response from critics. (Deadline quotes an unnamed studio executive: “I don’t think anyone read the reviews!”) By some measures, it’s the biggest opening in history for a movie that received such a negative critical reaction, and if anything, the disconnect between critical and popular reaction is even more striking this time around. But it doesn’t seem to have resulted in the kind of war of words that blindsided Scott four years ago. Part of this might be due to the fact that fans seem much more mixed on the movie itself, or that the critical consensus was uniform enough that no single naysayer stood out. You could even argue—as somebody inevitably does whenever a critically panned movie becomes a big financial success—that the critical reaction is irrelevant for this kind of blockbuster. To some extent, you’d be right: the only tentpole series that seems vulnerable to reviews is the Bond franchise, which skews older, and for the most part, the moviegoers who lined up to see Dawn of Justice were taking something other than the opinions of professional critics into account. This isn’t a superpower on the movie’s part: it simply reflects a different set of concerns. And you might reasonably ask whether this kind of movie has rendered the role of a professional critic obsolete.

A.O. Scott

But I would argue that such critics are more important than ever, and for reasons that have a lot to do with the “soulless corporate spectacle” that Scott decried in The Avengers. I’ve noted here before that the individual installments in such franchises aren’t designed to stand on their own: when you’ve got ten more sequels on the release schedule, it’s hard to tell a self-contained, satisfying story, and even harder to change the status quo. (As Joss Whedon said in an interview with Mental Floss: “You’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant.”) You could be cynical and say that no particular film can be allowed to interfere with the larger synergies at stake, or, if you’re in a slightly more generous mood, you could note that this approach is perfectly consistent with the way in which superhero stories have always been told. For the most part, no one issue of Batman is meant to stand as a definitive statement: it’s a narrative that unfolds month by month, year by year, and the character of Batman himself is far more important than any specific adventure. Sustaining that situation for decades on end involves a lot of artistic compromises, as we see in the endless reboots, resets, spinoffs, and alternate universes that the comic book companies use to keep their continuities under control. Like a soap opera, a superhero comic has to create the illusion of forward momentum while remaining more or less in the same place. It’s no surprise that comic book movies would employ the same strategy, which also implies that we need to start judging them by the right set of standards.

But you could say much the same thing about a professional critic. What A.O. Scott says about any one movie may not have an impact on what the overall population of moviegoers—even the ones who read the New York Times—will pay to see, and a long string of reviews quickly blurs together. But a critic who writes thoughtfully about the movies from week to week is gradually building up a narrative, or at least a voice, that isn’t too far removed from what we find in the comics. Critics are usually more concerned with meeting that day’s deadline than with adding another brick to their life’s work, but when I think of Roger Ebert or Pauline Kael, it’s sort of how I think of Batman: it’s an image or an attitude created by its ongoing interactions with the minds of its readers. (Reading Roger Ebert’s memoirs is like revisiting a superhero’s origin story: it’s interesting, but it only incidentally touches the reasons that Ebert continues to mean so much to me.) The career of a working critic these days naturally unfolds in parallel with the franchise movies that will dominate studio filmmaking for the foreseeable future, and if the Justice League series will be defined by our engagement with it for years to come, a critic whose impact is meted out over the same stretch of time is better equipped to talk about it than almost anyone else—as long as he or she approaches it as a dialogue that never ends. If franchises are fated to last forever, we need critics who can stick around long enough to see larger patterns, to keep the conversation going, and to offer some perspective to balance out the hype. These are the critics we deserve. And they’re the ones we need right now.

You are here


Adam Driver in Star Wars: The Force Awakens

Remember when you were watching Star Wars: The Force Awakens and Adam Driver took off his mask, and you thought you were looking at some kind of advanced alien? You don’t? That’s strange, because it says you did, right here in Anthony Lane’s review in The New Yorker:

So well is Driver cast against type here that evil may turn out to be his type, and so extraordinary are his features, long and quiveringly gaunt, that even when he removes his headpiece you still believe that you’re gazing at some form of advanced alien.

I’m picking on Lane a little here, because the use of the second person is so common in movie reviews and other types of criticism—including this blog—that we hardly notice it, any more than we notice the “we” in this very sentence. Film criticism, like any form of writing, evolves its own language, and using that insinuating “you,” as if your impressions had melded seamlessly with the critic’s, is one of its favorite conventions. (For instance, in Manohla Dargis’s New York Times review of the same film, she says: “It also has appealingly imperfect men and women whose blunders and victories, decency and goofiness remind you that a pop mythology like Star Wars needs more than old gods to sustain it.”) But who is this “you,” exactly? And why has it started to irk me so much?

The second person has been used by critics for a long time, but in its current form, it almost certainly goes back to Pauline Kael, who employed it in the service of images or insights that could have occurred to no other brain on the planet, as when she wrote of Madeline Kahn in Young Frankenstein: “When you look at her, you see a water bed at just the right temperature.” This tic of Kael’s has been noted and derided for almost four decades, going back to Renata Adler’s memorable takedown in the early eighties, in which she called it “the intrusive ‘you'” and noted shrewdly: “But ‘you’ is most often Ms. Kael’s ‘I,’ or a member or prospective member of her ‘we.'” Adam Gopnik later said: “It wasn’t her making all those judgments. It was the Pop Audience there beside her.” And “the second-person address” clearly bugged Louis Menand, too, although his dislike of it was somewhat undermined by the fact that he internalized it so completely:

James Agee, in his brief service as movie critic of The Nation, reviewed many nondescript and now long-forgotten pictures; but as soon as you finish reading one of his pieces, you want to read it again, just to see how he did it…You know what you think about Bonnie and Clyde by now, though, and so [Kael’s] insights have lost their freshness. On the other hand, she is a large part of the reason you think as you do.

Pauline Kael

Kael’s style was so influential—I hear echoes of it in almost everything I write—that it’s no surprise that her intrusive “you” has been unconsciously absorbed by the generations of film critics that followed. If it bothers you as it does me, you can quietly replace it throughout with “I” without losing much in the way of meaning. But that’s part of the problem. The “you” of film criticism conceals a neurotic distrust of the first person that prevents critics from honoring their opinions as their own. Kael said that she used “you” because she didn’t like “one,” which is fair enough, but there’s also nothing wrong with “I,” which she wasn’t shy about using elsewhere. To a large extent, Kael was forging her own language, and I’m willing to forgive that “you,” along with so much else, because of the oceanic force of the sensibilities to which it was attached. But separating the second person from Kael’s unique voice and turning it into a crutch to be indiscriminately employed by critics everywhere yields a more troubling result. It becomes a tactic that distances the writer slightly from his or her own judgments, creating an impression of objectivity and paradoxical intimacy that has no business in a serious review. Frame these observations in “I,” and the critic would feel more of an obligation to own them and make sense of them; stick them in a convenient “you,” and they’re just one more insight to be tossed off, as if the critic happened to observe it unfolding in your brain and can record it here without comment.

Obviously, there’s nothing wrong with wanting to avoid the first person in certain kinds of writing. It rarely has a place in serious reportage, for instance, despite the efforts of countless aspiring gonzo journalists who try to do what Norman Mailer, Hunter S. Thompson, and only a handful of others have ever done well. (It can even plague otherwise gifted writers: I was looking forward to Ben Lerner’s recent New Yorker piece about art conservation, but I couldn’t get past his insistent use of the first person.) But that “I” absolutely belongs in criticism, which is fundamentally a record of a specific viewer, listener, or reader’s impressions of his or her encounter with a piece of art. All great critics, whether they use that “you” or not, are aware of this, and it can be painful to read a review by an inexperienced writer that labors hard to seem “objective.” But if our best critics so often fall into the “you” trap, it’s a sign that even they aren’t entirely comfortable with giving us all of themselves, and I’ve started to see it as a tiny betrayal—meaningful or not—of what ought to be the critic’s intensely personal engagement with the work. And if it’s only a tic or a trick, then we sacrifice nothing by losing it. Replace that “you” with “I” throughout, making whatever other adjustments seem necessary, and the result is heightened and clarified, with a much better sense of who was really sitting there in the dark, feeling emotions that no other human being would ever feel in quite the same way.

A writer’s climate


Elizabeth Kolbert

Yesterday, the Pulitzer Prize for General Nonfiction was awarded to Elizabeth Kolbert’s excellent, sobering book The Sixth Extinction. As it happens, I finished reading it the other week, and it’s lying on my desk as I write this, which may be the first time I’ve ever gotten in on a Pulitzer winner on the ground floor. Recently, I’ve worked my way through a stack of books on climate change, including This Changes Everything by Naomi Klein, Windfall by McKenzie Funk, and Don’t Even Think About It by George Marshall. I also read Jonathan Franzen’s infamous article in The New Yorker, of course. And for a while, they provided a lens through which I saw almost everything else. There was the New York Times piece on Royal Dutch Shell’s acquisition of BG Group, for instance, which doesn’t mention climate change once; or their writeup, a few days later, on the imposition of new rules for offshore oil and gas exploration, even as the Atlantic Coast is being opened up for drilling. The Times describes this latter development as “a decision that has infuriated environmentalists”—which, when you think about it, is an odd statement. Climate change affects everybody, and if you believe, as many do, that the problem starts at the wellhead, pigeonholing it as an environmental issue only makes it easier to ignore.

I don’t mean to turn this into a post on the problem of climate change itself, which is a topic on which my own thoughts are still evolving. But like any great social issue—and it’s hard to see it as anything else—the way in which we choose to talk about it inevitably affects our responses. Franzen touches on this in his essay, in which he contrasts the “novelistic” challenge of conservation with the tweetable logic, terrifying in its simplicity, of global warming. I happen to think he’s wrong, but it’s still crucial for writers in general, and journalists especially, to think hard about how to cover an issue that might be simple in its outlines but dauntingly complex in its particulars. It may be the only thing we’re qualified to do. And Kolbert’s approach feels a lot like one that both Franzen and I can agree is necessary: novelistic, detailed, with deeply reported chapters on the author’s own visits to locations from Panama to Iceland to the Great Barrier Reef. Reading her book, we’re painlessly educated and entertained on a wide range of material, and while its message may be bleak, her portraits of the scientists she encounters leave us with a sense of possibility, however qualified it may be. (It helps that Kolbert has a nice dry sense of humor, as when she describes one researcher’s work as performing “handjobs on crows.”)

Naomi Klein

And in its focus on the author’s firsthand experiences, I suspect that it will live longer in my imagination than a work like Klein’s This Changes Everything, which I read around the same time. Klein’s book is worthy and important, but it suffers a little in its determination to get everything in, sometimes to the detriment of the argument itself. Nuclear power, for instance, deserves to be at the center of any conversation about our response to climate change, whether or not you see it as a viable part of the solution, but Klein dismisses it in a footnote. And occasionally, as in her discussion of agroecology—or the use of small, diverse farms as an alternative to industrial agriculture—it feels as if she’s basing her opinion on a single article from National Geographic. (It doesn’t help that she quotes one expert as saying that the Green Revolution didn’t really save the world from hunger, since starvation still exists, which is a little like saying that modern medicine has failed because disease hasn’t been totally eradicated. There’s also no discussion of the possibility that industrial agriculture has substantially decreased greenhouse emissions by reducing the total land area that needs to be converted to farming. Whatever your feelings on the subject, these issues can’t simply be swept aside.)

But there’s no one right way to write about climate change, and Klein’s global perspective, as a means of organizing our thoughts on the subject, is useful, even if it needs to be supplemented by more nuanced takes. (I particularly loved Funk’s book Windfall, which is loaded with as many fascinating stories as Kolbert’s.) Writers, as I’ve said elsewhere, tend to despair over how little value their work seems to hold in the face of such challenges. But if these books demonstrate one thing, it’s that the first step toward meaningful action, whatever form it assumes, lies in describing the world with the specificity, clarity, and diligence it demands. It doesn’t always call for jeremiads or grand plans, and it’s revealing that Kolbert’s book is both the best and the least political of the bunch. And it’s safe to say that talented writers will continue to be drawn to the subject: truly ambitious authors will always be tempted to tackle the largest themes possible, if only out of the “real egotism” that Albert Szent-Györgyi identifies as a chief characteristic of a great researcher. Writers, in fact, are the least likely of any of us to avoid confronting the unthinkable, simply because they have a vested interest in shaping the conversation about our most difficult issues. It’s fine for them to dream big; we need people who will. But they’ll make the greatest impact by telling one story at a time.

The Reddit Wedding


The front page of Reddit

Early last Sunday, after giving my daughter a bottle, putting on the kettle for coffee, and glancing over the front page of the New York Times, I moved on to the next stop in my morning routine: I went to Reddit. Like many of us, I’ve started to think of Reddit as a convenient curator of whatever happens to be taking place online that day, and after customizing the landing page to my tastes—unsubscribing from the meme factories, keeping the discussions of news and politics well out of view—it has gradually turned into the site where I spend most of my time. (It’s also started to leave a mark on my home life: I have a bad habit of starting conversations with my wife with “There was a funny thread on Reddit today…”) That morning, I was looking over the top posts when I noticed a link to an article about the author George R.R. Martin and his use of the antiquated word processor WordStar to write all of his fiction, including A Song of Ice and Fire, aka Game of Thrones. At first, I was amused, because I’d once thought about submitting that very tidbit myself. A second later, I realized why the post looked so familiar. It was linked to this blog.

At that point, my first thought, and I’m not kidding, was, “Hey, I wonder if I’ll get a spike in traffic.” And I did. In fact, if you’re curious about what it means to end up on the front page of Reddit, as of this writing, that post—which represented about an hour’s work from almost a year ago—has racked up close to 300,000 hits, more than doubling the lifetime page views for this entire blog. At its peak, it was the third most highly ranked post on Reddit that morning, a position it held very briefly: within a few hours, it had dropped off the front page entirely, although not before inspiring well over 1,500 comments. Most of the discussion revolved around WordStar, the merits of different word processing platforms, and about eighty variations on the joke of “Oh, so that’s why it’s taking Martin so long to finish.” The source of the piece was mentioned maybe once or twice, and several commenters seemed to think that this was Martin’s blog. And the net impact on this site itself, after the initial flurry of interest, was minimal. A few days later, traffic has fallen to its usual modest numbers, and only a handful of new arrivals seem to have stuck around. (If you’re one of them, I thank you.) And it’s likely that none of this site’s regular readers noticed that anything out of the ordinary was happening at all.

My blog stats

In short, because of one random link, this blog received an influx of visitors equivalent to the population of Cincinnati, and not a trace remains—I might as well have dreamed it. But then again, this isn’t surprising, given how most people, including me, tend to browse content these days. When I see an interesting link on Reddit, I’ll click on it, skim the text, then head back to the post for the comments. (For a lot of articles, particularly on science, I’ll read the comments first to make sure the headline wasn’t misleading.) I’ll rarely, if ever, pause to see what else the destination site has to offer; it’s just too easy to go back to Reddit or Digg or Twitter to find the next interesting article from somewhere else. In other words, I’m just one of the many guilty parties in what has been dubbed the death of the homepage. The New York Times landing page has lost eighty million visitors over the last two years, and it isn’t hard to see why. We’re still reading the Times, but we’re following links from elsewhere, which not only changes the way we read news, but the news we’re likely to read: less hard reporting, more quizzes, infographics, entertainment and self-help items, as well as the occasional diverting item from a site like this.

And it’s a reality that writers and publishers, including homegrown operations like mine, need to confront. The migration of content away from homepages and into social media isn’t necessarily a bad thing; comments on Reddit, for instance, are almost invariably more capably ranked and moderated, more active, and more interesting than the wasteland of comments on even major news sites. (Personally, I’d be fine if most newspapers dropped commenting altogether, as Scientific American and the Chicago Sun-Times recently did, and left the discussion to take place wherever the story gets picked up.) But it also means that we need to start thinking of readers less as a proprietary asset retained over time than as something we have to win all over again with every post, while getting used to the fact that none of it will last. Or almost none of it. A few days after my post appeared on Reddit, George R.R. Martin was interviewed by Conan O’Brien, who asked him about his use of WordStar—leading to another burst of coverage, even though Martin’s preferences in word processing have long been a matter of record. And while I can’t say for sure, between you and me, I’m almost positive that it wouldn’t have come up if someone on Conan’s staff hadn’t seen my post. It isn’t much. But it’s nice.

Written by nevalalee

May 19, 2014 at 9:52 am

The seductions of structure


Structure of an essay by John McPhee

Learning about a writer’s outlining methods may not be as interesting as reading about his or her sex life, but it exercises a peculiar fascination of its own—at least for other writers. Everyone else probably feels a little like I did while reading Shawn McGrath’s recent appreciation of the beautiful source code behind Doom 3: I understood what he was getting at, but the article itself read like a dispatch from a parallel universe of lexical analyzers and rigid parameters. Still, the rules of good structure are surprisingly constant across disciplines. You don’t want more parts than you need; the parts you do have should be arranged in a logical form; and endless tinkering is usually required before the result has the necessary balance and beauty. And for the most part, the underlying work ought to remain invisible. The structure of a good piece of fiction is something like the structure of a comfortable chair. You don’t necessarily want to think about it while you’re in it, but if the structure has been properly conceived, your brain, or your rear end, will thank you.

In recent weeks, I’ve been lucky enough to read two enjoyable pieces of structure porn. The first is John McPhee’s New Yorker essay on the structure of narrative nonfiction; the second is Aaron Hamburger’s piece in the New York Times on outlining in reverse. McPhee’s article goes into his methods in great, sometimes laborious detail, and there’s something delightful in hearing him sing the praises of his outlining and text editing software. His tools may be computerized, but they only allow him to streamline what he’d always done with a typewriter and scissors:

After reading and rereading the typed notes and then developing the structure and then coding the notes accordingly in the margins and then photocopying the whole of it, I would go at the copied set with the scissors, cutting each sheet into slivers of varying size…One after another, in the course of writing, I would spill out the sets of slivers, arrange them ladderlike on a card table, and refer to them as I manipulated the Underwood.

Regular readers will know that this is the kind of thing I love. Accounts of how a book is written tend to dwell on personal gossip or poetic inspiration, and while such stories can be inspiring or encouraging, as a working writer, I’d much rather hear more about those slivers of paper.

Scene cards on the author's desk

And the reason I love them so much is that they get close to the heart of writing as a profession, which has surprising affinities with more technical or mechanical trades. Writing a novel, in particular, hinges partially on a few eureka moments, but it also presents daunting organizational and logistical challenges. A huge amount of material needs to be kept under control, and a writer’s brain just isn’t large or flexible enough to handle it all at once. Every author develops his or her own strategies for corralling ideas, and for most of us, it boils down to taking good notes, which I’ve compared elsewhere to messages that I’ve left, a la Memento, for my future self to rediscover. By putting our thoughts on paper—or, like McPhee does, in a computerized database—we make them easier to sort and retrieve. It looks like little more than bookkeeping, but it liberates us. McPhee says it better than I ever could: “If this sounds mechanical, the effect was absolutely the reverse…The procedure eliminated all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.”

This kind of organization can also take place closer to the end of the project, as Hamburger notes in his Times piece. Hamburger says that he dislikes using outlines to plan a writing project, and prefers to work more organically, but also observes that it can be useful to view the resulting material with a more objective, even mathematical eye. What he describes is similar to what I’ve called writing by numbers: you break the story down to individual scenes, count the pages or paragraphs, and see how each piece fits in with the shape of the story as a whole. Such an analysis often reveals hidden weaknesses or asymmetries, and the solution can often be as simple as the ten percent rule:

In [some] stories, I found that most of the scenes were roughly equal in length, and so cutting became as easy as an across-the-board budget cut. I dared myself to try to cut ten percent from each scene, and then assessed what was left. Happily, I didn’t always achieve my goal—because let’s face it, writing is not math and never should be. Yet what I learned about my story along the way proved invaluable.

I agree with this wholeheartedly, with one caveat: I believe that writing often is math, although not exclusively, and only as a necessary prop for emotion and intuition. Getting good ideas, as every writer knows, is the easy part. It’s the structure that makes them dance.

Written by nevalalee

January 28, 2013 at 9:50 am

“If there are ten readers out there…”


George Saunders

I want to be more expansive. If there are ten readers out there, let’s assume I’m never going to reach two of them. They’ll never be interested. And let’s say I’ve already got three of them, maybe four. If there’s something in my work that’s making numbers five, six and seven turn off to it, I’d like to figure out what that is. I can’t change who I am and what I do, but maybe there’s a way to reach those good and dedicated readers that the first few books might not have appealed to. I’d like to make a basket big enough that it included them.

George Saunders, to The New York Times

Written by nevalalee

January 5, 2013 at 9:50 am

“Two hundred European cities have bus links with Frankfurt”


Let’s say you’re reading a novel, perhaps a thriller, and while you wouldn’t say it’s a great book, you’re reasonably engaged by the plot and characters. The story is clocking along nicely, the author’s prose is clean and unobtrusive, and suddenly you’re brought up short by something like this:

He was sitting all alone in the enormous cabin of a Falcon 2000EX corporate jet as it bounced its way through turbulence. In the background, the dual Pratt & Whitney engines hummed evenly.

Hold on. What do those Pratt & Whitney engines have to do with anything? Is this a novel or an aircraft catalog? Well, it’s neither, at least not at the moment: rather, it’s an instance of a novelist being reluctant to part with a laboriously acquired piece of research. Suspense novelists are especially guilty of this sort of thing—the above example is from Dan Brown’s The Lost Symbol, admittedly not the most original target in the world—but it’s something that every writer needs to beware: the temptation to overload one’s fiction with factual detail, especially detail that was the result of a long and painful research process.

This tendency is easy to understand in historical and science fiction, in which so much energy has gone into researching a story set in another time and place, but it’s less obvious why it should also be so common in thrillers, which in other respects have become ever more streamlined. Anthony Lane, in an amusing article on the top ten books on the New York Times bestseller list of May 15, 1994, quotes a sentence from Allan Folsom’s thriller The Day After Tomorrow (the one about the Frankfurt bus lines), which he claims is the most boring clause in any of the books he’s read for his essay. He then says:

The odd thing about pedantry, however, is that it can’t be trusted. Many of the writers on this list are under the impression that if they do the factual spadework, the fiction will dig itself in and hunker down, solid and secure. The effect, unfortunately, is quite the opposite. It suggests that the writers are hanging on for grim life to what they know for fear of unleashing what they don’t know; they are frightened, in other words, of their own imagination…When Flaubert studied ancient Carthage for Salammbô, or the particulars of medieval falconry for “The Legend of St. Julien Hospitalier,” he was furnishing and feathering a world that had already taken shape within his mind; when Allan Folsom looks at bus timetables, his book just gets a little longer.

True enough. Lane is mistaken, though, when he blames this tendency, elsewhere in his article, on the work of James Michener, which consists of “gathering more research than any book could possibly need, then refusing to jettison a particle of it for the sake of dramatic form.” Michener is probably to blame for such excesses in historical fiction, but as far as thrillers are concerned, there’s another, more relevant culprit: Frederick Forsyth. Much of the pleasure of The Day of the Jackal (which Lane elsewhere claims to read once a year) comes from Forsyth’s expertise, real or cunningly feigned, in such matters as identity theft and the construction of an assassin’s rifle, which makes the less plausible elements of his novel all the more convincing. He’s so good at this, in fact, that legions of inferior writers have been seduced by his example. (Even Forsyth himself, in his later novels, isn’t entirely immune.)

Here, then, is the novelist’s dilemma: an appropriate amount of research will lure readers into the fictional dream, but too much will yank them out. So what’s a writer to do? The answer here, as in most other places, is that good habits of writing in general will trim away the worst of these particular excesses. For instance, Stephen King’s invaluable advice to cut all your drafts by ten percent applies twice as much to expository or factual passages. We haven’t discussed point of view yet, but by restricting each scene to the point of view of a particular character, you’re less likely to introduce extraneous information. And the endless labor of rereading, editing, and revision, once time has given you sufficient detachment from your own work, will gradually alert you to places where the research has begun to interfere with the underlying story.

There’s another place where excessive research can also be dangerous, and that’s in the writing process itself. Nearly every novel requires some degree of background material, but how much is too much? It’s always hard to say when research turns into procrastination, but here’s my own rule of thumb: two or three months of research is probably enough for the beginning of any project. Later on, you can always take a break to do more, and should certainly go back and check your facts once the novel is done, but any more than three months at the start, and you risk losing the momentum that encouraged you to write the novel in the first place. And once that momentum is gone, not even a Pratt & Whitney engine will get it back.
