Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘New York Times’

Burrowing into The Tunnel


Last fall, it occurred to me that someone should write an essay on the parallels between the novel The Tunnel by William H. Gass, which was published in 1995, and the contemporary situation in America. Since nobody else seemed to be doing it, I figured that it might as well be me, although it was a daunting project even to contemplate—Gass’s novel is over six hundred pages long and famously impenetrable, and I knew that doing it justice would take at least three weeks of work. Yet it seemed like something that had to exist, so I wrote it up at the end of last year. For various reasons, it took a long time to see print, but it’s finally out now in the New York Times Book Review. It isn’t the kind of thing that I normally do, but it felt like a necessary piece, and I’m pretty proud of how it turned out. And if the intervening seven months don’t seem to have dated it at all, it only puts me in mind of what the radio host on The Simpsons once said about the DJ 3000 computer: “How does it keep up with the news like that?”

Written by nevalalee

July 12, 2019 at 2:35 pm

Outside the Wall


On Thursday, I’m heading out to the fortieth annual International Conference on the Fantastic in the Arts in Orlando, Florida, where I’ll be participating in two events. One will be a reading at 8:30am featuring Jeanne Beckwith, James Patrick Kelly, Rachel Swirsky, and myself, moderated by Marco Palmieri. (I’m really looking forward to meeting Jim Kelly, who had an unforgettable story, “Monsters,” in the issue of Asimov’s Science Fiction that changed my life.) The other will be the panel “The Changing Canon of SF” at 4:15pm, moderated by James Patrick Kelly, at which Mary Anne Mohanraj, Rich Larson, and Erin Roberts will also be appearing.

In other news, I’m scheduled to speak next month at the Windy City Pulp and Paper Convention in Lombard, Illinois, where I’ll be giving a talk on Friday, April 12, at 7pm. (Hugo nominations close soon, by the way, and if you’re planning to fill out a ballot, I’d be grateful if you’d consider nominating Astounding for Best Related Work.) And if you haven’t already seen it, please check out my recent review in the New York Times of John Lanchester’s dystopian novel The Wall. I should have a few more announcements here soon—please stay tuned for more!

Visions of tomorrow


As I’ve mentioned here before, one of my few real regrets about Astounding is that I wasn’t able to devote much room to discussing the artists who played such an important role in the evolution of science fiction. (The more I think about it, the more it seems to me that their collective impact might be even greater than that of any of the writers I discuss, at least when it comes to how the genre influenced and was received by mainstream culture.) Over the last few months, I’ve done my best to address this omission, with a series of posts on such subjects as Campbell’s efforts to improve the artwork, his deliberate destruction of the covers of Unknown, and his surprising affection for the homoerotic paintings of Alejandro Cañedo. And I can reveal now that this was all in preparation for a more ambitious project that has been in the works for a while—a visual essay on the art of Astounding and Unknown that has finally appeared online in the New York Times Book Review, with the highlights scheduled to be published in the print edition this weekend. It took a lot of time and effort, especially on the part of my editors, to put it together, and I’m very proud of the result, which honors the visions of such artists as H.W. Wesso, Howard V. Brown, Hubert Rogers, Manuel Rey Isip, Frank Kelly Freas, and many others. It stands on its own, but I’ve come to think of it as a lost chapter from my book that deserves to be read alongside its longer companion. As I note in the article, it took years for the stories inside the magazine to catch up to the dreams of its readers, but the artwork was often remarkable from the beginning. And if you want to know what the fans of the golden age really saw when they imagined the future, the answer is right here.

Written by nevalalee

January 11, 2019 at 7:25 am

The last resolution


By just about any measure, this was the most rewarding year of my professional life. My group biography Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction was released by HarperCollins in October. I published one novelette, “The Spires,” in Analog, with another, “At the Fall,” scheduled to come out sometime next year. My novella “The Proving Ground” was anthologized and reprinted in several places, including in the final edition of the late Gardner Dozois’s The Year’s Best Science Fiction. I wrote a few new pieces of nonfiction, including an essay on Isaac Asimov and psychohistory for the New York Times, and I saw John W. Campbell’s Frozen Hell, based on the original manuscript of “Who Goes There?” that I rediscovered at Harvard, blow past all expectations on Kickstarter. (The book, which will include introductions by me and Robert Silverberg, is scheduled to appear in June.) My travels brought me to conventions and conferences in San Jose, Chicago, New Orleans, and Boston. Perhaps best of all, I’ve confirmed I’ll be spending the next three years writing the book of my dreams, a big biography of Buckminster Fuller, which is something that I couldn’t have imagined a decade ago. Even as the world falls apart in other ways, I’ve been lucky enough to spend much of my time thinking about what matters most to me, even if it makes me feel like the narrator of Borges’s “Tlön, Uqbar, Orbis Tertius,” who continues to work quietly in his hotel room as the civilization around him enters its long night.

In good times and bad, I’ve also found consolation on this blog, where I’ve posted something every day—and I have trouble believing this myself—for more than eight years. (My posts on science fiction alone add up to a longer book than Astounding, and they account for only a fraction of what I’ve written here.) At the moment, however, it doesn’t look like I’ll be able to keep up my streak. I won’t stop posting here entirely, but I can’t maintain the same pace that I have in the past, and I’ve resolved to take an extended break. For a long time, I assumed that I would eventually just skip a day without any advance notice, but it seems more appropriate to step away deliberately now, at the end of this very eventful year. I expect that this blog will go silent for a week or two, followed by occasional posts thereafter when anything grabs my attention, and I may well miss my morning routine enough to return eventually to something approximating my old schedule. In the meantime, though, I want to thank everyone who has hung in there, whether you’re a longtime reader or a recent visitor. Eight years ago, I started this blog without any thought about what it might become, but it unexpectedly turned into the place where I’ve tried to figure out what I think and who I am, at least as a writer, during some of the best and worst years of my life. I’m no longer as optimistic as I once was about what comes next, but I’ve managed to become something like the writer I wanted to be. And a lot of it happened right here.

The unfinished lives


Yesterday, the New York Times published a long profile of Donald Knuth, the legendary author of The Art of Computer Programming. Knuth is eighty now, and the article by Siobhan Roberts offers an evocative look at an intellectual giant in twilight:

Dr. Knuth usually dresses like the youthful geek he was when he embarked on this odyssey: long-sleeved T-shirt under a short-sleeved T-shirt, with jeans, at least at this time of year…Dr. Knuth lives in Stanford, and allowed for a Sunday visitor. That he spared an entire day was exceptional—usually his availability is “modulo nap time,” a sacred daily ritual from 1 p.m. to 4 p.m. He started early, at Palo Alto’s First Lutheran Church, where he delivered a Sunday school lesson to a standing-room-only crowd.

This year marks the fiftieth anniversary of the publication of the first volume of Knuth’s most famous work, which is still incomplete. Knuth is busy writing the fourth installment, one fascicle at a time, although its most recent piece has been delayed “because he keeps finding more and more irresistible problems that he wants to present.” As Roberts writes: “Dr. Knuth’s exacting standards, literary and otherwise, may explain why his life’s work is nowhere near done. He has a wager with Sergey Brin, the co-founder of Google and a former student…over whether Mr. Brin will finish his Ph.D. before Dr. Knuth concludes his opus…He figures it will take another twenty-five years to finish The Art of Computer Programming, although that time frame has been a constant since about 1980.”

Knuth is a prominent example, although far from the most famous, of a literary and actuarial phenomenon that has grown increasingly familiar—an older author with a projected work of multiple volumes, published one book at a time, that seems ever less likely to see completion. On the fiction side, the most noteworthy case has to be that of George R.R. Martin, who has been fielding anxious inquiries from fans for most of the last decade. (In an article that appeared seven long years ago in The New Yorker, Laura Miller quotes Martin, who was only sixty-two at the time: “I’m still getting e-mail from assholes who call me lazy for not finishing the book sooner. They say, ‘You better not pull a Jordan.’”) Robert A. Caro is still laboring over what he hopes will be the final volume of his biography of Lyndon Johnson, and mortality has become an issue not just for him, but for his longtime editor, as we read in Charles McGrath’s classic profile in the Times:

Robert Gottlieb, who signed up Caro to do The Years of Lyndon Johnson when he was editor in chief of Knopf, has continued to edit all of Caro’s books, even after officially leaving the company. Not long ago he said he told Caro: “Let’s look at this situation actuarially. I’m now eighty, and you are seventy-five. The actuarial odds are that if you take however many more years you’re going to take, I’m not going to be here.”

That was six years ago, and both men are still working hard. But sometimes a writer has no choice but to face the inevitable. When asked about the concluding fifth volume of his life of Picasso, with the fourth one still on the way, the biographer John Richardson said candidly: “Listen, I’m ninety-one—I don’t think I have time for that.”

I don’t have the numbers to back this up, but such cases—or at least the public attention that they inspire—seem to be growing more common these days, on account of some combination of lengthening lifespans, increased media coverage of writers at work, and a greater willingness from publishers to agree to multiple volumes in the first place. The subjects of such extended commitments tend to be monumental in themselves, in order to justify the total investment of the writer’s own lifetime, and expanding ambitions are often to blame for blown deadlines. Martin, Caro, and Knuth all increased the prospective number of volumes after their projects were already underway, or as Roberts puts it: “When Dr. Knuth started out, he intended to write a single work. Soon after, computer science underwent its Big Bang, so he reimagined and recast the project in seven volumes.” And this “recasting” seems particularly common in the world of biographies, as the author discovers more material that he can’t bear to cut. The first few volumes may have been produced with relative ease, but as the years pass and anticipation rises, the length of time it takes to write the next installment grows, until it becomes theoretically infinite. Such a radical change of plans, which can involve extending the writing process for decades, or even beyond the author’s natural lifespan, requires an indulgent publisher, university, or other benefactor. (John Richardson’s book has been underwritten by nothing less than the John Richardson Fund for Picasso Research, which reminds me of what Homer Simpson said after being informed that he suffered from Homer Simpson syndrome: “Oh, why me?”) And it may not be an accident that many of the examples that first come to mind are white men, who have the cultural position and privilege to take their time.

It isn’t hard to understand a writer’s reluctance to let go of a subject, the pressures on a book being written in plain sight, or the tempting prospect of working on the same project forever. And the image of such authors confronting their mortality in the face of an unfinished book is often deeply moving. One of the most touching examples is that of Joseph Needham, whose Science and Civilisation in China may have undergone the most dramatic expansion of them all, from an intended single volume to twenty-seven and counting. As Kenneth Girdwood Robinson writes in a concluding posthumous volume:

The Duke of Edinburgh, Chancellor of the University of Cambridge, visited The Needham Research Institute, and interested himself in the progress of the project. “And how long will it take to finish it?” he enquired. On being given a rather conservative answer, “At least ten years,” he exclaimed, “Good God, man, Joseph will be dead before you’ve finished,” a very true appreciation of the situation…In his closing years, though his mind remained lucid and his memory astonishing, Needham had great difficulty even in moving from one chair to another, and even more difficulty in speaking and in making himself understood, due to the effect of the medicines he took to control Parkinsonism. But a secretary, working closely with him day by day, could often understand what he had said, and could read what he had written, when others were baffled.

Needham’s decline eventually became impossible to ignore by those who knew him best, as his biographer Simon Winchester writes in The Man Who Loved China: “It was suggested that, for the first time in memory, he take the day off. It was a Friday, after all: he could make it a long weekend. He could charge his batteries for the week ahead. ‘All right,’ he said. ‘I’ll stay at home.’” He died later that day, with his book still unfinished. But it had been a good life.

Go set a playwright


If you follow theatrical gossip as avidly as I do, you’re probably aware of the unexpected drama that briefly surrounded the new Broadway adaptation of Harper Lee’s To Kill a Mockingbird, which was written for the stage by Aaron Sorkin. In March, Lee’s estate sued producer Scott Rudin, claiming that the production was in breach of contract for straying drastically from the book. According to the original agreement, the new version wasn’t supposed to “depart in any manner from the spirit of the novel nor alter its characters,” which Sorkin’s interpretation unquestionably did. (Rudin says just as much on the record: “I can’t and won’t present a play that feels like it was written in the year the book was written in terms of its racial politics. It wouldn’t be of interest. The world has changed since then.”) But the question isn’t quite as straightforward as it seems. As a lawyer consulted by the New York Times explains:

Does “spirit” have a definite and precise meaning, or could there be a difference of opinion as to what is “the spirit” of the novel? I do not think that a dictionary definition of “spirit” will resolve that question. Similarly, the contract states that the characters should not be altered. In its pre-action letter, Harper Lee’s estate repeatedly states that the characters “would never have” and “would not have” done numerous things; unless as a matter of historical fact the characters would not have done something…who is to say what a creature of fiction “would never have” or “would not have” done?

Now that the suit has been settled and the play is finally on Broadway, this might all seem beside the point, but there’s one aspect of the story that I think deserves further exploration. Earlier this week, Sorkin spoke to Greg Evans of Deadline about his writing process, noting that he took the initial call from Rudin for good reasons: “The last three times Scott called me and said ‘I have something very exciting to talk to you about,’ I ended up writing Social Network, Moneyball, and Steve Jobs, so I was paying attention.” His first pass was a faithful version of the original story, which took him about six months to write: “I had just taken the greatest hits of the book, the most important themes, the most necessary themes. I stood them up and dramatized them. I turned them into dialogue.” When he was finished, he had a fateful meeting with Rudin:

He had two notes. The first was, “We’ve got to get to the trial sooner.” That’s a structural note. The second was the note that changed everything. He said, “Atticus can’t be Atticus for the whole play. He’s got to become Atticus,” and of course, he was right. A protagonist has to change. A protagonist has to be put through something and change as a result, and a protagonist has to have a flaw. And I wondered how Harper Lee had gotten away with having Atticus be Atticus for the whole book, and it’s because Atticus isn’t the protagonist in the book. Scout is. But in the play, Atticus was going to be the protagonist, and I threw out that first draft. I started all over again, but this time the goal wasn’t to be as much like the book as possible. The goal wasn’t to swaddle the book in bubble wrap and then gently transfer it to a stage. I was going to write a new play.

This is fascinating stuff, but it’s worth emphasizing that while Rudin’s first piece of feedback was “a structural note,” the second one was as well. The notions that “a protagonist has to change” and “a protagonist has to have a flaw” are narrative conventions that have evolved over time, and for good reason. Like the idea of building the action around a clear sequence of objectives, they’re basically artificial constructs that have little to do with the accurate representation of life. Some people never change for years, and while we’re all flawed in one way or another, our faults aren’t always reflected in dramatic terms in the situations in which we find ourselves. These rules are useful primarily for structuring the audience’s experience, which comes down to the ability to process and remember information delivered over time. (As Kurt Vonnegut, who otherwise might not seem to have much in common with Harper Lee, once said to The Paris Review: “I don’t praise plots as accurate representations of life, but as ways to keep readers reading.”) Yet they aren’t essential, either, as the written and filmed versions of To Kill a Mockingbird make clear. The original novel, in particular, has a rock-solid plot and supporting characters who can change and surprise us in ways that Atticus can’t. Unfortunately, it’s hard for plot alone to carry a play, which is largely a form about character, and Atticus is obviously the star part. Sorkin doesn’t shy away from using the backbone that Lee provides—the play does indeed get to the jury trial, which is still the most reliable dramatic convention ever devised, more quickly than the book does—but he also grasped the need to turn the main character into someone who could give shape to the audience’s experience of watching the play. It was this consideration, and not the politics, that turned out to be crucial.

There are two morals to this story. One is how someone like Sorkin, who can fall into traps of his own as a writer, benefits from feedback from even stronger personalities. The other is how a note on structure, which Sorkin takes seriously, forced him to engage more deeply with the play’s real material. As all writers know, it’s harder than it looks to sequence a story as a series of objectives or to depict a change in the protagonist, but simply by thinking about such fundamental units of narrative, a writer will come up with new insights, not just about the hero, but about everyone else. As Sorkin says of his lead character in an interview with Vulture:

He becomes Atticus Finch by the end of the play, and while he’s going along, he has a kind of running argument with Calpurnia, the housekeeper, which is a much bigger role in the play I just wrote. He is in denial about his neighbors and his friends and the world around him, that it is as racist as it is, that a Maycomb County jury could possibly put Tom Robinson in jail when it’s so obvious what happened here. He becomes an apologist for these people.

In other words, Sorkin’s new perspective on Atticus also required him to rethink the roles of Calpurnia and Tom Robinson, which may turn out to be the most beneficial change of all. (This didn’t sit well with the Harper Lee estate, which protested in its complaint that black characters who “knew their place” wouldn’t behave this way at the time.) As Sorkin says of their lack of agency in the original novel: “It’s noticeable, it’s wrong, and it’s also a wasted opportunity.” That’s exactly right—and I like the last reason the best. In theater, as in any other form of narrative, the technical considerations of storytelling are more important than doing the right thing. But to any experienced writer, it’s also clear that they’re usually one and the same.

Written by nevalalee

December 14, 2018 at 8:39 am

The Great Man and the WASP


Last week, the New York Times opinion columnist Ross Douthat published a piece called “Why We Miss the WASPs.” Newspaper writers don’t get to choose their own headlines, and it’s possible that if the essay had run under a different title, it might not have attracted the same degree of attention, which was far from flattering. Douthat’s argument—which was inspired by the death of George H.W. Bush and his obvious contrast with the current occupant of the White House—can be summarized concisely:

Bush nostalgia [is] a longing for something America used to have and doesn’t really any more—a ruling class that was widely (not universally, but more widely than today) deemed legitimate, and that inspired various kinds of trust (intergenerational, institutional) conspicuously absent in our society today. Put simply, Americans miss Bush because we miss the WASPs—because we feel, at some level, that their more meritocratic and diverse and secular successors rule us neither as wisely nor as well.

Douthat ostentatiously concedes one point to his critics in advance: “The old ruling class was bigoted and exclusive and often cruel, it had failures aplenty, and as a Catholic I hold no brief for its theology.” But he immediately adds that “building a more democratic and inclusive ruling class is harder than it looks, and even perhaps a contradiction in terms,” and he suggests that one solution would be a renewed embrace of the idea that “a ruling class should acknowledge itself for what it really is, and act accordingly.”

Not surprisingly, Douthat’s assumptions about the desirable qualities of “a ruling class” were widely derided. He responded with a follow-up piece in which he lamented the “misreadings” of those who saw his column as “a paean to white privilege, even a brief for white supremacy,” while never acknowledging any flaws in his argument’s presentation. But what really sticks with me is the language of the first article, which is loaded with rhetorical devices that both skate lightly over its problems and make it difficult to deal honestly with the issues that it raises. One strategy, which may well have been unconscious, is a familiar kind of distancing. As Michael Harriot writes in The Root:

I must applaud opinion writer Ross Douthat for managing to put himself at an arms-length distance from the opinions he espoused. Douthat employed the oft-used Fox News, Trumpian “people are saying…” trick, essentially explaining that some white people think like this. Not him particularly—but some people.

It’s a form of evasiveness that resembles the mysterious “you” of other sorts of criticism, and it enables certain opinions to make it safely into print. Go back and rewrite the entire article in the first person, and it becomes all but unreadable. For instance, it’s hard to imagine Douthat writing a sentence like this: “I miss Bush because I miss the WASPs—because I feel, at some level, that their more meritocratic and diverse and secular successors rule us neither as wisely nor as well.”

But even as Douthat slips free from the implications of his argument on one end, he’s ensnared at the other by his own language. We can start with the term “ruling class” itself, which appears in the article no fewer than five times, along with a sixth instance in a quotation from the critic Helen Andrews. The word “establishment” appears seventeen times. If asked, Douthat might explain that he’s using both of these terms in a neutral sense, simply to signify the people who end up in political office or in other positions of power. But like the “great man” narrative of history or the “competent man” of science fiction, these words lock us into a certain set of assumptions, by evoking an established class that rules rather than represents, and they beg the rather important question of whether we need a ruling class at all. Even more insidiously, Douthat’s entire argument rests on the existence of the pesky but convenient word “WASP” itself. When the term appeared half a century ago, it was descriptive and slightly pejorative. (According to the political scientist Andrew Hacker, who first used it in print, it originated in “the cocktail party jargon of the sociologists,” and the first letter originally stood for “wealthy.” As it stands, the term is slightly redundant, although it still describes exactly the same group of people, and foregrounding their whiteness isn’t necessarily a bad idea.) Ultimately, however, it turned into a tag that allows us to avoid spelling out everything that it includes, which makes it easier to let such attitudes slip by unexamined. Let’s rework that earlier sentence one more time: “I miss Bush because I miss the white Anglo-Saxon Protestants—because I feel, at some level, that their more meritocratic and diverse and secular successors rule us neither as wisely nor as well.” And this version, at least, is much harder to “misread.”

At this point, I should probably confess that I take a personal interest in everything that Douthat writes. Not only are we both Ivy Leaguers, but we’re members of the same college class, although I don’t think we ever crossed paths. In most other respects, we don’t have a lot in common, but I can relate firsthand to the kind of educational experience—which John Stuart Mill describes in today’s quotation—that leads public intellectuals to become more limited in their views than they might realize. Inspired by a love of the great books and my summer at St. John’s College, I spent most of my undergraduate years reading an established canon of writers, in part because I was drawn to an idea of elitism in its most positive sense. What I didn’t see for a long time was that I was living in an echo chamber, one that takes certain forms of privilege and status for granted and makes it hard to talk about these matters in the real world without a conscious effort of will. (In his original article, Douthat’s sense of the possible objections to his thesis is remarkably blinkered in itself. After acknowledging the old ruling class’s bigotry, exclusivity, and cruelty, he adds: “And don’t get me started on its Masonry.” That was fairly low on my list of concerns, but now I’m frankly curious.) I understand where Douthat is coming from, because I came from it, too. But that isn’t an excuse for looking at the WASPs, or a dynasty that made a fortune in the oil business, and feeling “nostalgic for their competence,” which falls apart the second we start to examine it. If they did rule us once, then they bear responsibility for the destruction of our planet and the perpetuation of attitudes that put democracy itself at risk. If they’ve managed to avoid much of the blame, it’s only because it took decades for us to see the full consequences of their actions, which have emerged more clearly in the generation that they raised in their image. It might well be true, as Douthat wrote, that they trained their children “for service, not just success.” But they also failed miserably.

Written by nevalalee

December 11, 2018 at 9:13 am
