Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Google’

Optimizing the future

On Saturday, an online firestorm erupted over a ten-page memo written by James Damore, a twenty-eight-year-old software engineer at Google. Titled “Google’s Ideological Echo Chamber,” it led to its author being fired a few days later, and the furor is far from over—I have the uncomfortable feeling that it’s just getting started. (Damore has said that he intends to sue, and his case has already become a cause célèbre in exactly the circles that you’d expect.) In his memo, Damore essentially argues that the acknowledged gender gap in management and engineering roles at tech firms isn’t due to bias, but to “the distribution of preferences and abilities of men and women in part due to biological causes.” In women, these differences include “openness directed towards feelings and aesthetics rather than ideas,” “extraversion expressed as gregariousness rather than assertiveness,” “higher agreeableness,” and “neuroticism,” while men have a “higher drive for status” that leads them to take positions demanding “long, stressful hours that may not be worth it if you want a balanced and fulfilling life.” He summarizes:

I’m not saying that all men differ from women in the following ways or that these differences are “just.” I’m simply stating that the distribution of preferences and abilities of men and women differ in part due to biological causes and that these differences may explain why we don’t see equal representation of women in tech and leadership. Many of these differences are small and there’s significant overlap between men and women, so you can’t say anything about an individual given these population level distributions.

Damore quotes a decade-old research paper, which I suspect that he first encountered through the libertarian site Quillette, stating that as “society becomes more prosperous and more egalitarian, innate dispositional differences between men and women have more space to develop and the gap that exists between men and women in their personality becomes wider.” And he concludes: “We need to stop assuming that gender gaps imply sexism.”

I wasn’t even going to write about this here, but it rang a bell. Back in 1968, a science fiction fan named Ron Stoloff attended the World Science Fiction Convention in Berkeley, where he was disturbed both by the lack of diversity and by the presence of at least one fan costumed as Lt. Uhura in blackface. He wrote up his thoughts in an essay titled “The Negro and Science Fiction,” which was published the following year in the fanzine The Vorpal Sword. (I haven’t been able to track down the full issue, but you can find the first page of his article here.) On May 1, 1969, the editor John W. Campbell wrote Stoloff a long letter objecting to the argument and to the way that he had been characterized. It’s a fascinating document that I wish I could quote in full, but the most revealing section comes after Campbell asks rhetorically: “Look guy—do some thinking about this. How many Negro authors are there in science fiction?” He then answers his own question:

Now consider what effect a biased, anti-Negro editor could have on that. Manuscripts come in by mail from all over the world…I haven’t the foggiest notion what most of the authors look like—and I never yet heard of an editor who demanded a photograph of an author before he’d print his work! Nor demanded a notarized document proving he was white.

If Negro authors are extremely few—it’s solely because extremely few Negroes both wish to, and can, write in open competition. There isn’t any possible field of endeavor where race, religion, and sex make less difference. If there aren’t any individuals of a particular group in the authors’ column—it’s because either they didn’t want to, or weren’t able to. It’s got to be unbiased by the very nature of the process of submission.

Campbell’s argument is fundamentally the same as Damore’s. It states that the lack of black writers in the pages of Analog, like the underrepresentation of women in engineering roles at Google, isn’t due to bias, but because “either they didn’t want to, or weren’t able to.” (Campbell, like Damore, makes a point of insisting elsewhere that he’s speaking of the statistical description of the group as a whole, not of individuals, which strikes him as a meaningful distinction.) Earlier in the letter, however, Campbell inadvertently suggests another explanation for why “Negro authors are extremely few,” and it doesn’t have anything to do with ability:

Think about it a bit, and you’ll realize why there is so little mention of blacks in science fiction; we see no reason to go saying “Lookee lookee lookee! We’re using blacks in our stories! See the Black Man! See him in a spaceship!”

It is my strongly held opinion that any Black should be thrown out of any story, spaceship, or any other place—unless he’s a black man. That he’s got no business there just because he’s black, but every right there if he’s a man. (And the masculine embraces the feminine; Lt. Uhura is portrayed as no clinging vine, and not given to the whimper, whinny, and whine type behavior. She earned her place by competence—not by having a black skin.)

There are two implications here. The first is that all protagonists should be white males by default, a stance that Campbell might not even have seen as problematic—and it’s worth noting that even if race wasn’t made explicit in the story, the magazine’s illustrations overwhelmingly depicted its characters as white. There’s also the clear sense that black heroes have to “earn” their presence in the magazine, which, given the hundreds of cardboard “competent men” that Campbell cheerfully featured over the years, is laughable in itself. In fiction, as in life, if you’re black, you’ve evidently got to be twice as good to justify yourself.

It never seems to have occurred to Campbell that the dearth of minority writers in the genre might have been caused by a lack of characters who looked like them, as well as by similar issues in the fandom, and he never believed that he had the ability or the obligation to address the situation as an editor. (Elsewhere in the same letter, he writes: “What I am against—and what has been misinterpreted by a number of people—is the idea that any member of any group has any right to preferential treatment because he is a member.”) Left to itself, the scarcity of minority voices and characters was a self-perpetuating cycle that made it easy to argue that interest and ability were to blame. The hard part about encouraging diversity in science fiction, or anywhere else, is that it doesn’t happen by accident. It requires systematic, conscious effort, and the payoff might not be visible for years. That’s as hard and apparently unrewarding for a magazine that worries mostly about managing its inventory from one month to the next as it is for a technology company focused on yearly or quarterly returns. If Campbell had really wanted to see more black writers in Analog in the late sixties, he should have put more black characters in the magazine in the early forties. You could excuse this by saying that he had different objectives, and that it’s unfair to judge him in retrospect, but it’s equally true that it was a choice that he could have made, but didn’t. And science fiction was the poorer for it. In his memo, Damore writes:

Philosophically, I don’t think we should do arbitrary social engineering of tech just to make it appealing to equal portions of both men and women. For each of these changes, we need principled reasons for why it helps Google; that is, we should be optimizing for Google—with Google’s diversity being a component of that.

Replace “tech” with “science fiction,” “men and women” with “black and white writers,” and “Google” with “Analog,” and you have a fairly accurate representation of Campbell’s position. He clearly saw his job as the optimization of science fiction. A diverse roster of writers, which would have resulted in far more interesting “analog simulations” of reality of the kind that he loved, would have gone a long way toward improving it. He didn’t make the effort, and the entire genre suffered as a result. Google, to its credit, seems to understand that diversity also offers competitive advantages when you aren’t just writing about the future, but inventing it. And anything else would be suboptimal.

Written by nevalalee

August 10, 2017 at 9:15 am

“Where all other disguises fell away…”

Note: This post is the forty-sixth installment in my author’s commentary for Eternal Empire, covering Chapter 45. You can read the previous installments here.

Occasionally, a piece of technology appears in the real world that fits the needs of fiction so admirably that authors rush to adopt it in droves. My favorite example is the stun gun. The ability to immobilize characters without killing or permanently incapacitating them is one that most genre writers eventually require. It allows the hero to dispatch a henchman or two while removing the need to murder them in cold blood, which is essential if your protagonist is going to remain likable, and it also lets the villain temporarily disable the hero while still keeping him alive for future plot purposes. Hence the ubiquitous blow to the back of the head that causes unconsciousness, which was a cliché long before movies like Conspiracy Theory ostentatiously drew attention to it. The beauty of the stun gun is that it produces all of the necessary effects—instantaneous paralysis with no lasting consequences—that the convention requires, while remaining comfortably within the bounds of plausibility. In my case, it was the moment when Mathis is conveniently dispatched toward the end of Casino Royale that woke me up to its possibilities, and I didn’t hesitate to use it repeatedly in The Icon Thief. By now, though, it’s become so overused that writers are already seeking alternatives, and even so meticulous an entertainment as the David Fincher version of The Girl With the Dragon Tattoo falls back on the even hoarier device of knockout gas. But the stun gun is here to stay.

Much the same principle applies to the two most epochal technological developments of our time, which have affected fiction as much as they’ve transformed everyday life: the cell phone and the Internet. Even the simple flip phone was a game changer, instantly rendering obsolete all stories that depend on characters being unable to contact one another or the police—which is why service outages and spotty coverage seem so common in horror movies. It’s hard to watch movies or television from earlier in this century without reflecting on how so many problems could be solved by a simple phone call. (I’m catching up on The People v. O.J. Simpson, and I find myself thinking about the phones they’re using, or the lack thereof, as much as the story itself.) And the smartphone, with the instant access it provides to all the world’s information, generates just as many new problems and solutions, particularly for stories that hinge on the interpretation of obscure facts. Anyone writing conspiracy fiction these days has felt this keenly: there isn’t much call for professional symbologists when ordinary bystanders can solve the mystery by entering a couple of search terms. In City of Exiles, there’s a dramatic moment when Wolfe asks Ilya: “What is the Dyatlov Pass?” On reading it, my editor noted, not unreasonably: “Doesn’t anybody there have a cell phone?” In the end, I kept the line, and I justified it to myself by compressing the timeline: Wolfe has just been too busy to look it up herself. But I’m not sure if it works.

Search engines are a particularly potent weapon of storytelling, to the point where they’ve almost become dangerous. At their best, they can provide a neat way of getting the story from one plot point to the next: hence the innumerable movie scenes in which someone like Jason Bourne stops in an Internet café and conducts a few searches, cut into an exciting montage, that propel him to the next stage of his journey. Sometimes, it seems too easy, but as screenwriter Tony Gilroy has said on more than one occasion, for a complicated action movie, you want to get from one sequence to the next with the minimum number of intermediate steps—and the search engine was all but designed to provide such shortcuts. More subtly, a series of queries can be used to provide a glimpse into a character’s state of mind, while advancing the plot at the same time. (My favorite example is when Bella looks up vampires in the first Twilight movie.) Google itself was ahead of the curve in understanding that a search can provide a stealth narrative, in brilliant commercials like “Parisian Love.” We’re basically being given access to the character’s interior monologue, which is a narrative tool of staggering usefulness. Overhearing someone’s thoughts is easy enough in prose fiction, but not in drama or film, and conventions like the soliloquy and the voiceover have been developed to address the problem, not always with complete success. Showing us a series of search queries is about as nifty a solution as exists, to the point where it starts to seem lazy.

And an additional wrinkle is that our search histories don’t dissipate as our thoughts do: they linger, which means that other characters, as well as the viewer or reader, have potential access to them as well. (This isn’t just a convention of fiction, either: search histories have become an increasingly important form of evidence in criminal prosecutions. This worries me a bit, since anyone looking without the proper context at my own searches, which are often determined by whatever story I’m writing at the time, might conclude that I’m a total psychopath.) I made good use of this in Chapter 45 of Eternal Empire, in which Wolfe manages to access Asthana’s search history on her home computer and deduces that she was looking into Maddy Blume. It’s a crucial moment in the narrative, which instantly unites two widely separated plotlines, and this was the most efficient way I could devise of making the necessary connection. In fact, it might be a little too efficient: it verges on unbelievable that Asthana, who is so careful in all other respects, would fail to erase her search history. I tried to make it more acceptable by adding an extra step with a minimum of technical gobbledegook—Asthana has cleared her browser history, so Wolfe checks the contents of the disk and memory caches, which are saved separately to her hard drive—but it still feels like something of a cheat. But as long as search histories exist, authors will use them as a kind of trace evidence, like the flecks of cigarette ash that Sherlock Holmes uses to identify a suspect. And unlike most clues, they’re written for all to see…

The droid you’re looking for

A few weeks ago, I replaced my iPhone 5 with a cheap Android device. Yesterday, Apple reported that its phone sales had slowed to their lowest growth rate ever. Clearly, the two facts are connected—and I’m not entirely kidding about this. My phone had been giving me problems for about six months, ever since the display was cracked in an accident last year, but I managed to soldier on with a replacement screen until shortly after Christmas. A gentle fall from my sofa to the carpeted floor was enough to finish it off, and dead lines appeared on the touchscreen, forcing me to rotate the phone repeatedly to perform even the simplest of tasks. (For a while, I seriously considered trying to write all of my texts without using the letter “P,” which was the only one that was permanently disabled.) Finally, I gave in. I was still several months away from my next upgrade, so I went to what my daughter called “the boring cell phone store” to evaluate my options. After about five minutes of consideration, I ended up shelling out forty bucks for a Chinese-made Android model, reasoning that it would make a serviceable interim phone, if nothing else, until I could shop around for a more lasting replacement. But as it turns out, I love this goddamned little phone. I like it so much, in fact, that it’s caused me to question my lifelong devotion to Apple, which has begun to seem like an artifact of a set of assumptions about consumer technology that no longer apply.

My new phone is a ZTE Maven that runs Android 5.1. Its specs aren’t particularly impressive: eight gigs of storage, 4.5-inch screen, 1.2GHz quad-core processor. Online reviews give it an average of three stars out of five. But I’ve been consistently delighted by it. The camera is significantly worse than the one on my old phone—a real issue for the parent of a three-year-old, since nearly every shot I take of my daughter, who refuses to sit still, ends up smeared into an indecipherable blur. But I’ve learned to live with it. And in every other respect, it either matches or exceeds my iPhone’s performance. Google apps, which I use for email, maps, and web browsing, load much faster and more smoothly than before. Notifications are so seamless that they take much of the fun out of checking my email: I simply know at a glance whether or not I’ve got a new message, while my old phone kept me in delightful suspense as it slowly spun its wheels. Eight gigabytes doesn’t leave me with any room for media, but between Google Photos, which I now use to store all of my old pictures in the cloud, and streaming music from Amazon and other sources, I don’t need to keep much of anything on the phone itself. And while it might seem unfair to compare a newish Android device to an iPhone that is several product cycles behind, the fact that the latter cost ten times as much is a big part of the reason that I held onto it for so long.

And to repeat: this phone cost forty dollars. It doesn’t matter if I drop it in the toilet or lose it or break it, or if my eye is caught by something new. There’s nothing stored on it that can’t be replaced. I have close to zero connection to it as a consumer fetish item, which, paradoxically, makes me regard it with even more affection than my old iPhone. If an iPhone lags, it feels like a betrayal; if this phone stalls for a few seconds, which happens rarely, I want to give it an encouraging pat on the head and say that I know it’s doing its best. And it constantly surprises me on the upside. Part of this is due to the relentless march of technology, in which a phone that would have seemed miraculous ten years ago is now being all but given away, but it more subtly reflects the fact that the actual phone is no longer a significant object in itself. Instead, it’s a tiny aperture that provides a window between two huge collections of information to either side. You could think of our online existence as an hourglass, with one large bulb encompassing the data and activities of our personal lives and the other embracing everything in the cloud. The slender neck connecting those two gigantic spheres is your phone—it’s the channel through which one passes into the other. And we’re at a stage where it doesn’t really matter what physical device provides the interface where those larger masses press together, like the film that forms between two soap bubbles. You could even argue that the cheaper the device, the more it fulfills its role as an intermediary, rather than as a focal point.

As a result, my crummy little Android phone—which, once again, isn’t even all that crummy—feels more like the future than whatever Apple is currently marketing. There’s something slightly disingenuous in the way Apple keeps pushing users to the cloud, to the extent of removing all of its old ports and optical drives, while still insisting that this incorporeal universe of information can best be accessed through a sleek, expensive machine. Elsewhere, I’ve written that Apple seems to use thinness or lightness as a quantifiable proxy for good design, and it could theoretically do the same with price, although the odds of this actually happening seem close to zero. Apple’s business model depends so much on charging a premium that it doesn’t feel much pressure to innovate below the line, at least not in ways that are visible to consumers. But this isn’t just about cheap phones: it’s about the question of whether we need any particular phone, rather than a series of convenient, provisional, and ultimately disposable lenses through which to see the content on the other side. The truth, I think, is that we don’t need much of a phone at all, or that it only has to be the phone we need for the moment—and it should be priced at a point at which we have no qualms about casually replacing it when necessary, any more than we’d think twice about buying a new light bulb when the old one burns out. If the Internet, as people never tire of telling us, is a utility like heat or running water, the phone isn’t even the fuse box: it’s the fuse. And it took a forty-dollar phone to get me to realize this.

Written by nevalalee

January 27, 2016 at 9:48 am

Posted in Writing

The lost library

“The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents,” H.P. Lovecraft writes in “The Call of Cthulhu.” He continues:

We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little, but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.

Lovecraft’s narrator would be relieved, I think, by the recent blog post by Tim Wu of The New Yorker on the sorry state of Google Books. As originally conceived, this was a project that could have had the most lasting impact of any development of the information revolution—an accurate, instantaneous search of all the books ever published, transforming every page into metadata. Instead, it became mired in a string of lawsuits, failed settlements, and legislative inaction, and it limps on as a shadow of what it might have been.

And while the result might have saved us from going mad in the Lovecraftian sense, it’s an incalculable loss to those of us who believe that we’d profit more than we’d suffer from that kind of universal interconnectedness. I don’t mean to minimize what Google has done: even in its stunted, incomplete form, this is still an amazing tool for scholars and curious characters of all kinds, and we shouldn’t take it for granted. I graduated from college a few years before comprehensive book search—initially developed by Amazon—was widely available, and when I contemplate the difference between the way I wrote my senior thesis and what would be possible now, it feels like an incomprehensible divide. It’s true that easy access to search results can be a mixed blessing: there’s a sense in which the process of digging in libraries and leafing through physical books for a clue purifies the researcher’s brain, preparing it to recognize and act upon that information when it sees it. This isn’t always the case when a search result is just one click away. But for those who have the patience to use a search as a starting point, or as a map of the territory, it’s precious beyond reckoning. Making it fully accessible should be the central intellectual project of our time. Instead, it has stalled, perhaps forever, as publishers and authors dicker over rights issues that pale in comparison to the benefits to be gained from global access to ideas.

I’m not trying to dismiss the fears of authors who are worried about the financial consequences of their work being available for free: these are valid concerns, and a solution that would wipe out any prospect of making a living from writing books—as it already threatens to do with journalism and criticism—would outweigh any possible gain. But if we just focus on books that are out of print and no longer profit any author or publisher in their present form, we’re talking about an enormous step forward. There’s no earthly reason why books that are currently impossible to buy should remain that way. Once something goes out of print, it should be fair game, at least until the copyright holder decides to do something about it. Inhibiting free access to books that can’t possibly do any good to their rights holders now, with an eye to some undefined future time when those rights might have value again, doesn’t help anybody. (If anything, a book that exists in searchable form is of greater potential value than a copy moldering unread on a library shelf.) Any solution to the problem of authors’ rights is inevitably going to be built out of countless compromises and workarounds, so we may as well approach it from a baseline of making everything accessible until we can figure out a way forward, rather than keeping these books out of sight until somebody legislates a solution. If nothing else, opening up those archives more fully would create real pressure to come up with a workable arrangement with authors. As it stands, it’s easier to do nothing.

And the fact that we’ve been waiting so long for an answer, even as Google, Amazon, and others devote their considerable resources to other forms of search, suggests that our priorities are fundamentally out of whack. Enabling a search of libraries is qualitatively different from doing the same for online content: instead of focusing solely on material that has been generated over the last few decades, and in which recent content outweighs the old by orders of magnitude, we’d be opening up the accumulated work of centuries. Not all of it is worth reading, any more than the vast majority of content produced every day deserves our time and attention, but ignoring that huge trove of information—thirty million books or more, with all their potential connections—is an act of appalling shortsightedness. A comprehensive search of books that were otherwise inaccessible, and which didn’t relegate most of the results to a snippet view for no discernible reason, would have a far greater impact on how we think, feel, and innovate than most of the technological projects that suck up money and regulatory attention. It might only appeal to a small slice of readers and researchers, but it happens to be a slice that is disproportionately likely to come up with works and ideas that affect the rest of us. But it requires a voice in its favor as loud as, or louder than, the writers and publishers who object to it. The books are there. They need to be searchable and readable. Anything else just doesn’t scan.

Written by nevalalee

September 15, 2015 at 9:04 am

Quote of the Day

When an object enters the frame, ensure it’s moving at its peak velocity. This behavior emulates natural movement: a person entering the frame of vision does not begin walking at the edge of the frame but well before it. Similarly, when an object exits the frame, have it maintain its velocity, rather than slowing down as it exits the frame. Easing in when entering and slowing down when exiting draw the user’s attention to that motion, which, in most cases, isn’t the effect you want.

Google Material Design
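The guideline above boils down to two easing curves: a decelerating curve for entrances, where velocity peaks at the start, and an accelerating curve for exits, where it peaks at the end. Here is a minimal numerical sketch of that idea (the quadratic curves and function names are my own illustration, not anything from Google's actual implementation):

```python
def decelerate(t: float) -> float:
    """Position from 0 to 1 for an entering element; velocity is highest at the start."""
    return 1.0 - (1.0 - t) ** 2

def accelerate(t: float) -> float:
    """Position from 0 to 1 for an exiting element; velocity is highest at the end."""
    return t ** 2

def velocity(f, t: float, h: float = 1e-6) -> float:
    """Numerical derivative: how fast the element is moving at time t."""
    return (f(t + h) - f(t - h)) / (2 * h)
```

Checking the endpoints makes the point: velocity(decelerate, 0.0) is about 2, so the element arrives already at peak speed, while velocity(decelerate, 1.0) is about 0, so it settles to rest inside the frame; accelerate is the mirror image for exits.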

Written by nevalalee

October 22, 2014 at 7:12 am

Posted in Quote of the Day

Googling the rise and fall of literary reputations

Note: To celebrate the third anniversary of this blog, I’ll be spending the week reposting some of my favorite pieces from early in its run. This post originally appeared, in a somewhat different form, on December 17, 2010.

As the New York Times recently pointed out, Google’s new online book database, which allows users to chart the evolving frequency of words and short phrases over 5.2 million digitized volumes, is a wonderful toy. You can look at the increasing frequency of George Carlin’s seven dirty words, for example—not surprisingly, they’ve all become a lot more common over the past few decades—or chart the depressing ascent of the word “alright.” Most seductively of all, perhaps, you can see at a glance how literary reputations have risen or fallen over time.

Take the five in the graph above, for instance. It’s hard not to see that, for all the talk of the death of Freud, he’s doing surprisingly well, and even passed Shakespeare in the mid-’70s (around the same time, perhaps not coincidentally, as Woody Allen’s creative peak). Goethe experienced a rapid fall in popularity in the mid-’30s, though he had recovered nicely by the end of World War II. Tolstoy, by contrast, saw a modest spike sometime around the Big Three conference in Tehran, and a drop as soon as the Soviet Union detonated its first atomic bomb. And Kafka, while less popular during the satisfied ’50s, saw a sudden surge in the paranoid decades thereafter:

Obviously, it’s possible to see patterns anywhere, and I’m not claiming that these graphs reflect real historical cause and effect. But it’s fun to think about. Even more fun is to look at the relative popularity of five leading American novelists of the last half of the twentieth century:

The most interesting graph is that for Norman Mailer, who experiences a huge ascent up to 1970, when his stature as a cultural icon was at its peak (just after his run for mayor of New York). Eventually, though, his graph—like those of Gore Vidal, John Updike, Philip Roth, and Saul Bellow—follows the trajectory that we’d suspect for that of an established, serious author: a long, gradual rise followed by a period of stability, as the author enters the official canon. Compare this to a graph of four best-selling novelists of the 1970s:

For Harold Robbins, Jacqueline Susann, Irving Wallace, and Arthur Hailey—and if you don’t recognize their names, ask your parents—we see a rapid rise in popularity followed by an equally rapid decline, which is what we might expect for authors who were once hugely popular but had no lasting value. And it’ll be interesting to see what this graph will look like in fifty years for, say, Stephenie Meyer or Dan Brown, and in which category someone like Jonathan Franzen or J.K. Rowling will appear. Only time, and Google, will tell.
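Under the hood, a chart like this rests on a simple ratio: the number of times a phrase appears in books printed in a given year, divided by the total number of words printed that year. A minimal sketch of that computation, using invented counts rather than the real downloadable ngram data:

```python
# Invented, purely illustrative counts; the real per-year numbers come from
# the downloadable Google Books ngram datasets.
author_mentions = {1968: 120, 1969: 180, 1970: 260, 1971: 240}
total_tokens = {1968: 2.0e9, 1969: 2.1e9, 1970: 2.2e9, 1971: 2.2e9}

def relative_frequency(mentions, totals):
    """Yearly frequency of a phrase as a fraction of all words printed that year."""
    return {year: mentions[year] / totals[year] for year in sorted(mentions)}

freq = relative_frequency(author_mentions, total_tokens)
peak_year = max(freq, key=freq.get)
```

With these numbers the curve peaks in 1970, and it is the shape of that curve, not the raw counts, that the graphs above compare: normalizing by yearly totals is what keeps the sheer growth of publishing from making every author look ascendant.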

Googling the rise and fall of literary reputations: the sequel

After playing over the weekend with the new word frequency tool in Google Books, I quickly came to realize that last week’s post barely scratched the surface. It’s fun to compare novelists against other writers in the same category, for example, but what happens when we look at authors in different categories altogether? Here’s what we get, for instance, when we chart two of the most famous literary authors of the latter half of the century against their counterparts on the bestseller list:

The results may seem surprising at first, but they aren’t hard to understand. Books by Philip Roth and John Updike might be outsold by Harold Robbins and Jacqueline Susann in their initial run (the occasional freak like Couples or Portnoy’s Complaint aside), but as they enter the canon, they’re simply talked about more often, by other writers, than their bestselling contemporaries. (Robbins and Susann, by contrast, probably aren’t cited very often outside their own books.) Compared to the trajectory of a canonical author, the graph of a bestseller begins to look less like a mountain and more like a molehill—or a speed bump. But now look here:

Something else altogether seems to be at work in this chart, and it’s only a reminder of the singularity of Stephen King’s career. Soon after his debut—Carrie, ‘Salem’s Lot, The Shining, and The Stand were all published within the same five years—King had overtaken the likes of Robbins and Susann both on the bestseller lists and in terms of cultural impact. Then something even stranger happened: he became canonical. He was prolific, popular, and wrote books that were endlessly referenced within the culture. As a result, his graph looks like no other—an appropriately monstrous hybrid of the bestselling author and serious novelist.

So what happens when we extend the graph beyond the year 2000, which is where the original numbers end? Here’s what we see:

A number of interesting things begin to happen in the last decade. Robbins and Susann look more like speed bumps than ever before. King’s popularity begins to taper off just as he becomes officially canonical—right when he receives lifetime achievement honors from the National Book Awards. And Roth and Updike seem to have switched places in 2004, or just after the appearance of The Plot Against America, which marks the peak, so far, of Roth’s late resurgence.

Of course, the conclusions I’ve drawn here are almost certainly flawed. There’s no way of knowing, at least not without looking more closely at the underlying data, whether the number of citations of a given author reflects true cultural prominence or something else. And it’s even harder to correlate any apparent patterns—if they’re actually there at all—with particular works or historical events, especially given the lag time of the publishing process. But there’s one chart, which I’ve been saving for last, which is so striking that I can’t help but believe that it represents something real:

This is a chart of the novelists who, according to a recent New York Times poll, wrote the five best American novels of the past twenty-five years: Toni Morrison (Beloved), Don DeLillo (Underworld), John Updike (Rabbit Angstrom), Cormac McCarthy (Blood Meridian), and Philip Roth (American Pastoral). The big news here, obviously, is Morrison’s amazing ascent around 1987, when Beloved was published. It isn’t hard to see why: Beloved was the perfect storm of literary fiction, a bestselling, critically acclaimed novel that also fit beautifully into the college curriculum. Morrison’s decline in recent years has less to do, I expect, with any real fall in her reputation than with a natural settling to more typical levels. (Although it’s interesting to note that the drop occurs shortly after Morrison received the Nobel Prize, thus locking her into the canon. Whether or not this drop is typical of officially canonized authors is something I hope to explore in a later post.)

It might be argued, and rightly so, that it’s unfair to turn literary reputation into such a horse race. But such numbers are going to be an inevitable part of the conversation from now on, and not just in terms of citations. It’s appropriate that Google unveiled this new search tool just as Amazon announced that it was making BookScan sales numbers available to its authors, allowing individual writers to do what I’m doing here, on a smaller and more personal scale. And if there’s any silver lining, it’s this: as the cases of Robbins and Susann remind us, in the end, sales don’t matter. After all, looking at the examples given above, which of these graphs would you want?
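One more caveat about the underlying data: year-to-year counts are noisy, and as I understand it, the viewer smooths each curve with a centered moving average over a window of years. A short sketch, assuming that behavior:

```python
def smooth(series, window=3):
    """Centered moving average over a list of yearly values. Near the edges,
    where a full window does not fit, the average is taken over whatever
    years are available."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo = max(0, i - half)
        hi = min(len(series), i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out
```

A single noisy spike, smooth([0, 0, 9, 0, 0]), flattens into [0, 3, 3, 3, 0], which is one reason an apparent peak in these charts can drift a year or two away from the event that caused it.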
