Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Washington Post’

Wounded Knee and the Achilles heel

with one comment

On February 27, 1973, two hundred Native American activists occupied the town of Wounded Knee in South Dakota. They were protesting against the unpopular tribal president of the Oglala Lakota Sioux, along with the federal government’s failure to honor its treaties, and the ensuing standoff—which resulted in two deaths, a serious casualty, and a disappearance—lasted for over seventy days. It also galvanized many of those who watched it unfold, including the author Paul Chaat Smith, who writes in his excellent book Everything You Know About Indians Is Wrong:

Lots occurred over the next two and a half months, including a curious incident in which some of the hungry, blockaded Indians attempted to slaughter a cow. Reporters and photographers gathered to watch. Nothing happened. None of the Indians—some urban activists, some from Sioux reservations—actually knew how to butcher cattle. Fortunately, a few of the journalists did know, and they took over, ensuring dinner for the starving rebels. That was a much discussed event during and after Wounded Knee. The most common reading of this was that basically we were fakes. Indians clueless about butchering livestock were not really Indians.

Smith dryly notes that the protesters “lost points” with observers after this episode, which overshadowed many of the more significant aspects of the occupation, and he concludes: “I myself know nothing about butchering cattle, and would hope that doesn’t invalidate my remarks about the global news media and human rights.”

I got to thinking about this passage in the aftermath of Elizabeth Warren’s very bad week. More specifically, I was reminded of it by a column by the Washington Post opinion writer Dana Milbank, who focuses on Warren’s submissions to the cookbook Pow Wow Chow: A Collection of Recipes from Families of the Five Civilized Tribes, which was edited by her cousin three decades ago. One of the recipes that Warren contributed was “Crab with Tomato Mayonnaise Dressing,” which leads Milbank to crack: “A traditional Cherokee dish with mayonnaise, a nineteenth-century condiment imported by settlers? A crab dish from landlocked Oklahoma? This can mean only one thing: canned crab. Warren is unfit to lead.” He’s speaking with tongue partially in cheek—a point that probably won’t be caught by thousands of people who are just browsing the headlines—but when I read these words, I thought immediately of these lines from Smith’s book:

It presents the unavoidable question: Are Indian people allowed to change? Are we allowed to invent completely new ways of being Indian that have no connection to previous ways we have lived? Authenticity for Indians is a brutal measuring device that says we are only Indian as long as we are authentic. Part of the measurement is about percentage of Indian blood. The more, the better. Fluency in one’s Indian language is always a high card. Spiritual practices, living in one’s ancestral homeland, attending powwows, all are necessary to ace the authenticity test. Yet many of us believe taking the authenticity tests is like drinking the colonizer’s Kool-Aid—a practice designed to strengthen our commitment to our own internally warped minds. In this way, we become our own prison guards.

And while there may be other issues with Warren’s recipe, it’s revealing that we often act as if the Cherokee Nation somehow ceased to evolve—or cook for itself—after the introduction of mayonnaise.

This may seem like a tiny point, but it’s also an early warning of a monstrous cultural reckoning lurking just around the corner, at a time when we might have thought that we had exhausted every possible way to feel miserable and divided. If Warren runs for president, which I hope she does, we’re going to be plunged into what Smith aptly describes as a “snake pit” that terrifies most public figures. As Smith writes in a paragraph that I never tire of quoting:

Generally speaking, smart white people realize early on, probably even as children, that the whole Indian thing is an exhausting, dangerous, and complicated snake pit of lies. And…the really smart ones somehow intuit that these lies are mysteriously and profoundly linked to the basic construction of the reality of daily life, now and into the foreseeable future. And without it ever quite being a conscious thought, these intelligent white people come to understand that there is no percentage, none, in considering the Indian question, and so the acceptable result is to, at least subconsciously, acknowledge that everything they are likely to learn about Indians in school, from books and movies and television programs, from dialogue with Indians, from Indian art and stories, from museum exhibits about Indians, is probably going to be crap, so they should be avoided.

This leads him to an unforgettable conclusion: “Generally speaking, white people who are interested in Indians are not very bright.” But that’s only because most of the others are prudent enough to stay well away—and even Warren, who is undeniably smart, doesn’t seem to have realized that this was a fight that she couldn’t possibly win.

One white person who seems unquestionably interested in Indians, in his own way, is Donald Trump. True to form, he may not be very bright, but he also displays what Newt Gingrich calls a “sixth sense,” in this case for finding a formidable opponent’s Achilles heel and hammering at it relentlessly. Elizabeth Warren is one of the most interesting people to consider a presidential run in a long time, but Trump may have already hamstrung her candidacy by zeroing in on what might look like a trivial vulnerability. And the really important point here is that if Warren’s claims about her Native American heritage turn out to be her downfall, it’s because the rest of us have never come to terms with our guilt. The whole subject is so unsettling that we’ve collectively just agreed not to talk about it, and Warren made the unforgivable mistake, a long time ago, of folding it into her biography. If she’s being punished for it now, it’s because it precipitates something that was invisibly there all along, and this may only be the beginning. Along the way, we’re going to run up against a lot of unexamined assumptions, like Milbank’s amusement at that canned crab. (As Smith reminds us: “Indians are okay, as long as they meet non-Indian expectations about Indian religious and political beliefs. And what it really comes down to is that Indians are okay as long as we don’t change too much. Yes, we can fly planes and listen to hip-hop, but we must do these things in moderation and always in a true Indian way.” And mayonnaise is definitely out.) Depending on your point of view, this issue is either irrelevant or the most important problem imaginable, and like so much else these days, it may take a moronic quip from Trump—call it the Access Hollywood principle—to catalyze a debate that more reasonable minds have postponed. In his discussion of Wounded Knee, Smith concludes: “Yes, the news media always want the most dramatic story. But I would argue there is an overlay with Indian stories that makes it especially difficult.” And we might be about to find out how difficult it really is.

Written by nevalalee

October 19, 2018 at 8:44 am

The difference engine

leave a comment »

Earlier this month, within the space of less than a day, two significant events occurred in the life of Donna Strickland, an associate professor at the University of Waterloo. She won the Nobel Prize in Physics, and she finally got her own Wikipedia page. As the biologist and Wikipedia activist Dawn Bazely writes in an excellent opinion piece for the Washington Post:

The long delay was not for lack of trying. Last May, an editor had rejected a submitted entry on Strickland, saying the subject did not meet Wikipedia’s notability requirement. Strickland’s biography went up shortly after her award was announced. If you click on the “history” tab to view the page’s edits, you can replay the process of a woman scientist finally gaining widespread recognition, in real time.

And it isn’t an isolated problem, as Bazely points out: “According to the Wikimedia Foundation, as of 2016, only 17 percent of the reference project’s biographies were about women.” When Bazely asked some of her students to create articles on women in ecology or the sciences, she found that their efforts frequently ran headlong into Wikipedia’s editing culture: “Many of their contributions got reversed almost immediately, in what is known as a ‘drive-by deletion’…I made an entry for Kathy Martin, current president of the American Ornithological Society and a global authority on arctic and alpine grouse. Almost immediately after her page went live, a flag appeared over the top page: ‘Is this person notable enough?’”

Strickland’s case is an unusually glaring example, but it reflects a widespread issue that extends far beyond Wikipedia itself. In a blog post about the incident, Ed Erhart, a senior editorial associate at the Wikimedia Foundation, notes that the original article on Strickland was rejected by an editor who stated that it lacked “published, reliable, secondary sources that are independent of the subject.” But he also raises a good point about the guidelines used to establish academic notability: “Academics may be writing many of the sources volunteer Wikipedia editors use to verify the information on Wikipedia, but they are only infrequently the subject of those same sources. And when it does occur, they usually feature men from developed nations—not women or other under-represented groups.” Bazely makes a similar observation:

We live in a world where women’s accomplishments are routinely discounted and dismissed. This occurs at every point in the academic pipeline…Across disciplines, men cite their own research more often than women do. Men give twice as many academic talks as women—engagements which give scholars a chance to publicize their work, find collaborators and build their resumes for potential promotions and job offers. Female academics tend to get less credit than males for their work on a team. Outside of academia, news outlets quote more male voices than female ones—another key venue for proving “notability” among Wikipedia editors. These structural biases have a ripple effect on our crowdsourced encyclopedia.

And this leads to an undeniable feedback effect, in which the existing sources used to establish notability are used to create Wikipedia articles, which then serve as evidence of notability in the future.

Bazely argues that articles on male subjects don’t seem to be held to the same high standards as those for women, which reflects the implicit biases of its editors, the vast majority of whom are men. She’s right, but I also think that there’s a subtle historical element at play. Back during the wild west days of Wikipedia, when the community was still defining itself, the demographics of its most prolific editors were probably even less diverse than they are now. During those formative years, thousands of pages were generated under a looser set of standards, and much of that material has been grandfathered into the version that exists today. I should know, because I was a part of it. While I may not have been a member of the very first generation of Wikipedia editors—one of my friends still takes pride in the fact that he created the page for “knife”—I was there early enough to originate a number of articles that I thought were necessary. I created pages for such people as Darin Morgan and Julee Cruise, and when I realized that there wasn’t an entry for “mix tape,” I spent the better part of two days at work putting one together. By the standards of the time, I was diligent and conscientious, but very little of what I did would pass muster today. My citations were erratic, I included my own subjective commentary and evaluations along with verifiable facts, and I indulged in original research, which the site rightly discourages. Multiply this by a thousand, and you get a sense of the extent to which the foundations of Wikipedia were laid by exactly the kind of editor in his early twenties for whom writing a cultural history of the mix tape took priority over countless other deserving subjects. (It isn’t an accident that I had started thinking about mix tapes again because of Nick Hornby’s High Fidelity, which provides a scathing portrait of a certain personality type, not unlike my own, that I took for years at face value.)

And I don’t even think that I was wrong. Wikipedia is naturally skewed in favor of the enthusiasms of its users, and articles that are fun to research, write, and discuss will inevitably get more attention. But the appeal of a subject to a minority of active editors isn’t synonymous with notability, and it takes a conscious effort to correct the result, especially when it comes to the older strata of contributions. While much of what I wrote fifteen years ago has been removed or revised by other hands, a lot of it still persists, because it’s easier to monitor new edits than to systematically check pages that have been around for years. And it leaves behind a residue of the same kinds of unconscious assumptions that I’ve identified elsewhere in other forms of canonization. Wikipedia is part of our cultural background now, invisible and omnipresent, and we tend to take it for granted. (Like Google, Wikipedia can be hard to research online because its name has become a synonym for information itself. Googling “Google,” or keywords associated with it, is a real headache, and looking for information about Wikipedia—as opposed to information presented in a Wikipedia article—presents many of the same challenges.) And nudging such a huge enterprise back on course, even by a few degrees, doesn’t happen by accident. One way is through the “edit-a-thons” that often occur on Ada Lovelace Day, which is named after the mathematician whose posthumous career incidentally illustrates how historical reputations can be shaped by whoever happens to be telling the story. We think of Lovelace, who worked with Charles Babbage on the difference engine and its successor, the Analytical Engine, as a feminist hero, but as recently as the early sixties, one writer could cite her as an example of genetic mediocrity: “Lord Byron’s surviving daughter, Ada, what did she produce in maturity? A system for betting on horse races that was a failure, and she died at thirty-six, shattered and deranged.” The writer was the popular novelist Irving Wallace, who is now deservedly forgotten. And the book was a bestseller about the Nobel Prize.

Written by nevalalee

October 15, 2018 at 9:04 am

The Machine of Lagado

with one comment

Yesterday, my wife wrote to me in a text message: “Psychohistory could not predict that Elon [Musk] would gin up a fraudulent stock buyback price based on a pot joke and then get punished by the SEC.” This might lead you to wonder about our texting habits, but more to the point, she was right. Psychohistory—the fictional science of forecasting the future developed by Isaac Asimov and John W. Campbell in the Foundation series—is based on the assumption that the world will change in the future more or less as it has in the past. Like all systems of prediction, it’s unable to foresee black swans, like the Mule or Donald Trump, that make nonsense of our previous assumptions, and it’s useless for predicting events on a small scale. Asimov liked to compare it to the kinetic theory of gases, “where the individual molecules in the gas remain as unpredictable as ever, but the average person is completely predictable.” This means that you need a sufficiently large number of people, such as the population of the galaxy, for it to work, and it also means that it grows correspondingly less useful as it becomes more specific. On the individual level, human behavior is as unforeseeable as the motion of particular molecules, and the shape of any particular life is impossible to predict, even if we like to believe otherwise. The same is true of events. Just as a monkey or a dartboard might do an equally good job of picking stocks as a qualified investment advisor, the news these days often seems to have been generated by a bot, like the Subreddit Simulator, that automatically cranks out random combinations of keywords and trending terms. (My favorite recent example is an actual headline from the Washington Post: “Border Patrol agent admits to starting wildfire during gender-reveal party.”)

And the satirical notion that combining ideas at random might lead to useful insights or predictions is a very old one. In Gulliver’s Travels, Jonathan Swift describes an encounter with a fictional machine—located in the academy of Lagado, the capital city of the island of Balnibarbi—by which “the most ignorant person, at a reasonable charge, and with a little bodily labour, might write books in philosophy, poetry, politics, laws, mathematics, and theology, without the least assistance from genius or study.” The narrator continues:

[The professor] then led me to the frame, about the sides, whereof all his pupils stood in ranks. It was twenty feet square, placed in the middle of the room. The superficies was composed of several bits of wood, about the bigness of a die, but some larger than others. They were all linked together by slender wires. These bits of wood were covered, on every square, with paper pasted on them; and on these papers were written all the words of their language, in their several moods, tenses, and declensions; but without any order…The pupils, at his command, took each of them hold of an iron handle, whereof there were forty fixed round the edges of the frame; and giving them a sudden turn, the whole disposition of the words was entirely changed. He then commanded six-and-thirty of the lads, to read the several lines softly, as they appeared upon the frame; and where they found three or four words together that might make part of a sentence, they dictated to the four remaining boys, who were scribes.

And Gulliver concludes: “Six hours a day the young students were employed in this labour; and the professor showed me several volumes in large folio, already collected, of broken sentences, which he intended to piece together, and out of those rich materials, to give the world a complete body of all arts and sciences.”

Two and a half centuries later, an updated version of this machine figured in Umberto Eco’s novel Foucault’s Pendulum, which is where I first encountered it. The book’s three protagonists, who work as editors for a publishing company in Milan, are playing in the early eighties with their new desktop computer, which they’ve nicknamed Abulafia, after the medieval cabalist. One speaks proudly of Abulafia’s usefulness in generating random combinations: “All that’s needed is the data and the desire. Take, for example, poetry. The program asks you how many lines you want in the poem, and you decide: ten, twenty, a hundred. Then the program randomizes the line numbers. In other words, a new arrangement each time. With ten lines you can make thousands and thousands of random poems.” This gives the narrator an idea:

What if, instead, you fed it a few dozen notions taken from the works of [occult writers]—for example, the Templars fled to Scotland, or the Corpus Hermeticum arrived in Florence in 1460—and threw in a few connective phrases like “It’s obvious that” and “This proves that?” We might end up with something revelatory. Then we fill in the gaps, call the repetitions prophecies, and—voila—a hitherto unpublished chapter of the history of magic, at the very least!

Taking random sentences from unpublished manuscripts, they enter such lines as “Who was married at the feast of Cana?” and “Minnie Mouse is Mickey’s fiancee.” When strung together, the result, in one of Eco’s sly jokes, is a conspiracy theory that exactly duplicates the thesis of Holy Blood, Holy Grail, which later provided much of the inspiration for The Da Vinci Code. “Nobody would take that seriously,” one of the editors says. The narrator replies: “On the contrary, it would sell a few hundred thousand copies.”
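It takes surprisingly little machinery to build your own Abulafia. Here’s a minimal sketch in Python—with a handful of stand-in notions and connective phrases, a few borrowed from Eco’s examples and the rest my own invention, rather than anything from the novel itself—of how a simple shuffle, glued together with the right phrases, starts to sound like an argument:

```python
import random

# A toy Abulafia: stock notions plus connective glue, shuffled into an "argument."
# The notions and phrases below are illustrative stand-ins, not Eco's actual data.
notions = [
    "the Templars fled to Scotland",
    "the Corpus Hermeticum arrived in Florence in 1460",
    "Minnie Mouse is Mickey's fiancee",
    "the manuscript was never meant to be read",
]
connectives = ["It's obvious that", "This proves that", "From which it follows that"]

def generate(lines=3, seed=None):
    rng = random.Random(seed)
    picks = rng.sample(notions, k=min(lines, len(notions)))
    # The connective phrases do all the rhetorical work; the notions are random.
    return " ".join(f"{rng.choice(connectives)} {p}." for p in picks)

print(generate(lines=3, seed=42))
```

The output is gibberish with good posture, which is more or less Eco’s point.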

When I first read this as a teenager, I thought it was one of the great things in the world, and part of me still does. I immediately began to look for similar connections between random ideas, which led me to some of my best story ideas, and I still incorporate aspects of randomness into just about everything that I do. Yet there’s also a pathological element to this form of play that I haven’t always acknowledged. What makes it dangerous, as Eco understood, is the inclusion of such seemingly innocent expressions as “it’s obvious that” and “this proves that,” which instantly transforms a scenario into an argument. (On the back cover of the paperback edition of Foucault’s Pendulum, the promotional copy describes Abulafia as “an incredible computer capable of inventing connections between all their entries,” which is both a great example of hyping a difficult book and a reflection of how credulous we can be when it comes to such practices in real life.) We may not be able to rule out any particular combination of events, but not every explanatory system is equally valid, even if all it takes is a modicum of ingenuity to turn it into something convincing. I used to see the creation of conspiracy theories as a diverting game, or as a commentary on how we interpret the world around us, and I devoted an entire novel to exorcising my fascination with this idea. More recently, I’ve realized that this attitude was founded on the assumption that it was still possible to come to some kind of cultural consensus about the truth. In the era of InfoWars, Pizzagate, and QAnon, it no longer seems harmless. Not all patterns are real, and many of the horrors of the last century were perpetrated by conspiracy theorists who arbitrarily seized on one arrangement of the facts—and then acted on it accordingly. Reality itself can seem randomly generated, but our thoughts and actions don’t need to be.

Written by nevalalee

October 2, 2018 at 9:36 am

The paper of record

leave a comment »

One of my favorite conventions in suspense fiction is the trope known as Authentication by Newspaper. It’s the moment in a movie, novel, or television show—and sometimes even in reality—when the kidnapper sends a picture of the victim holding a copy of a recent paper, with the date and headline clearly visible, as a form of proof of life. (You can also use it with piles of illicit cash, to prove that you’re ready to send payment.) The idea frequently pops up in such movies as Midnight Run and Mission: Impossible 2, and it also inspired a classic headline from The Onion: “Report: Majority Of Newspapers Now Purchased By Kidnappers To Prove Date.” It all depends on the fact that a newspaper is a datable object that is widely available and impossible to fake in advance, which means that it can be used to definitively establish the earliest possible day in which an event could have taken place. And you can also use the paper to verify a past date in subtler ways. A few weeks ago, Motherboard had a fascinating article on a time-stamping service called Surety, which provides the equivalent of a dated seal for digital documents. To make it impossible to change the date on one of these files, every week, for more than twenty years, Surety has generated a public hash value from its internal client database and published it in the classified ad section of the New York Times. As the company notes: “This makes it impossible for anyone—including Surety—to backdate timestamps or validate electronic records that were not exact copies of the original.”
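The mechanics are worth spelling out, because they’re simpler than they sound. A toy version in Python—purely illustrative, with hypothetical record names and a made-up prior digest, and not Surety’s actual scheme or format—might look like this:

```python
import hashlib

def weekly_digest(records, previous_published_hash):
    """Fold this week's records into a single value linked to last week's ad.

    A toy illustration of hash-linked timestamping: the printed digest
    depends on every record and on the prior published value, so nothing
    can be quietly backdated or revised after the fact.
    """
    h = hashlib.sha256()
    h.update(previous_published_hash.encode())
    for record in sorted(records):  # canonical order keeps the digest reproducible
        h.update(hashlib.sha256(record.encode()).digest())
    return h.hexdigest()  # the string that would go in the classified ad

# Hypothetical example: last week's published value plus two new documents.
last_week = "c3ab8ff13720e8ad9047dd39466b3c8974e592c2fa383d4a3960714caef0c4f2"
print(weekly_digest(["doc-001: contract.pdf", "doc-002: will.pdf"], last_week))
```

Because each week’s published value depends on the previous one, altering or backdating any record would change every digest printed afterward, in full view of anyone with a stack of old newspapers.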

I was reminded of all this yesterday, after the Times posted an anonymous opinion piece titled “I Am Part of the Resistance Inside the Trump Administration.” The essay, which the paper credits to “a senior official,” describes what amounts to a shadow government within the White House devoted to saving the president—and the rest of the country—from his worst impulses. And while the author may prefer to remain nameless, he certainly doesn’t suffer from a lack of humility:

Many of the senior officials in [Trump’s] own administration are working diligently from within to frustrate parts of his agenda and his worst inclinations. I would know. I am one of them…It may be cold comfort in this chaotic era, but Americans should know that there are adults in the room. We fully recognize what is happening. And we are trying to do what’s right even when Donald Trump won’t.

The result, he claims, is “a two-track presidency,” with a group of principled advisors doing their best to counteract Trump’s admiration for autocrats and contempt for international relations: “This isn’t the work of the so-called deep state. It’s the work of the steady state.” He even reveals that there was early discussion among cabinet members of using the Twenty-Fifth Amendment to remove Trump from office, although the idea was scuttled by concerns about precipitating a crisis somehow worse than the one in which we’ve found ourselves.

Not surprisingly, the piece has generated a firestorm of speculation about the author’s identity, both online and in the White House itself, which I won’t bother covering here. What interests me are the writer’s reasons for publishing it in the first place. Over the short term, it can only destabilize an already volatile situation, and everyone involved will suffer for it. This implies that the author has a long game in mind, and it had better be pretty compelling. On Twitter, Nate Silver proposed one popular theory: “It seems like the person’s goal is to get outed and secure a very generous advance on a book deal.” He may be right—although if that’s the case, the plan has quickly gone sideways. Reaction on both sides has been far more critical than positive, with Erik Wemple of the Washington Post perhaps putting it best:

Like most anonymous quotes and tracts, this one is a PR stunt. Mr. Senior Administration Official gets to use the distributive power of the New York Times to recast an entire class of federal appointees. No longer are they enablers of a foolish and capricious president. They are now the country’s most precious and valued patriots. In an appearance on Wednesday afternoon, the president pronounced it all a “gutless” exercise. No argument here.

Or as the political blogger Charles P. Pierce says even more savagely in his response on Esquire: “Just shut up and quit.”

But Wemple’s offhand reference to “the distributive power” of the Times makes me think that the real motive is staring us right in the face. It’s a form of Authentication by Newspaper. Let’s say that you’re a senior official in the Trump administration who knows that time is running out. You’re afraid to openly defy the president, but you also want to benefit—or at least to survive—after the ship goes down. In the aftermath, everyone will be scrambling to position themselves for some kind of future career, even though the events of the last few years have left most of them irrevocably tainted. By the time it falls apart, it will be too late to claim that you were gravely concerned. But the solution is a stroke of genius. You plant an anonymous piece in the Times, like the founders of Surety publishing its hash value in the classified ads, except that your platform is vastly more prominent. And you place it there precisely so that you can point to it in the future. After Trump is no longer a threat, you can reveal yourself, with full corroboration from the paper of record, to show that you had the best interests of the country in mind all along. You were one of the good ones. The datestamp is right there. That’s your endgame, no matter how much pain it causes in the meantime. It’s brilliant. But it may not work. As nearly everyone has realized by now, the fact that a “steady state” of conservatives is working to minimize the damage of a Trump presidency to achieve “effective deregulation, historic tax reform, a more robust military and more” is a scandal in itself. This isn’t proof of life. It’s the opposite.

Written by nevalalee

September 6, 2018 at 8:59 am

From Montgomery to Bilbao

leave a comment »

On August 16, 2016, the Equal Justice Initiative, a legal rights organization, unveiled its plans for the National Memorial for Peace and Justice, which would be constructed in Montgomery, Alabama. Today, less than two years later, it opens to the public, and the timing could hardly seem more appropriate, in ways that even those who conceived of it might never have imagined. As Campbell Robertson writes for the New York Times:

At the center is a grim cloister, a walkway with eight hundred weathered steel columns, all hanging from a roof. Etched on each column is the name of an American county and the people who were lynched there, most listed by name, many simply as “unknown.” The columns meet you first at eye level, like the headstones that lynching victims were rarely given. But as you walk, the floor steadily descends; by the end, the columns are all dangling above, leaving you in the position of the callous spectators in old photographs of public lynchings.

And the design represents a breakthrough in more ways than one. As the critic Philip Kennicott points out in the Washington Post: “Even more remarkable, this memorial…was built on a budget of only $15 million, in an age when major national memorials tend to cost $100 million and up.”

Of course, if the memorial had been more costly, it might not exist at all, and certainly not with the level of independence and the clear point of view that it expresses. Yet if there’s one striking thing about the coverage of the project, it’s the absence of the name of any one architect or designer. Neither of these two words even appears in the Times article, and in the Post, we only read that the memorial was “designed by [Equal Justice Initiative founder Bryan] Stevenson and his colleagues at EJI in collaboration with the Boston-based MASS Design Group.” When you go to the latter’s official website, twelve people are credited as members of the project design team. This is markedly different from the way in which we tend to talk about monuments, museums, and other architectural works that are meant to invite our attention. In many cases, the architect’s identity is a selling point in itself, as it invariably is with Frank Gehry, whose involvement in a project like the Guggenheim Museum Bilbao is consciously intended to rejuvenate an entire city. In Montgomery, by contrast, the designer is essentially anonymous, or part of a collaboration, which seems like an aesthetic choice as conscious as the design of the space itself. The individual personality of the architect departs, leaving the names and events to testify on their own behalf. Which is exactly as it should be.

And it’s hard not to compare this to the response to the design of the Vietnam Veterans Memorial in 1981. The otherwise excellent documentary by Ken Burns and Lynn Novick alludes to the firestorm that it caused, but it declines to explore how much of the opposition was personal in nature. As James Reston, Jr. writes in the definitive study A Rift in the Earth:

After Maya Lin’s design was chosen and announced, the public reaction was intense. Letters from outraged veterans poured into the Memorial Fund office. One claimed that Lin’s design had “the warmth and charm of an Abyssinian dagger.” “Nihilistic aesthetes” had chosen it…Predictably, the names of incendiary antiwar icons, Jane Fonda and Abbie Hoffman, were invoked as cheering for a design that made a mockery of the Vietnam dead…As for the winner with Chinese ancestry, [donor H. Ross] Perot began referring to her as “egg roll.”

If anything, the subject matter of the National Memorial for Peace and Justice is even more fraught, and the decision to place the designers in the background seems partially intended to focus the conversation on the museum itself, and not on those who made it.

Yet there’s a deeper lesson here about architecture and its creators. At first, you might think that a building with a singular message would need to arise from—or be identified with—an equally strong personality, but if anything, the trend in recent years has gone the other way. As Reinier de Graaf notes in Four Walls and a Roof, one of the more curious developments over the last few decades is the way in which celebrity architects, like Frank Gehry, have given up much of their own autonomy for the sake of unusual forms that no human hand or brain could properly design:

In partially delegating the production of form to the computer, the antibox has seemingly boosted the production of extravagant shapes beyond any apparent limits. What started as a deliberate meditation on the notion of form in the early antiboxes has turned into a game of chance. Authorship has become relative: with creation now delegated to algorithms, the antibox’s main delight is the surprise it causes to the designers.

Its opposite number is the National Memorial for Peace and Justice, which was built with simple materials and techniques that rely for their impact entirely on the insight, empathy, and ingenuity of the designer, who then quietly fades away. The architect can afford to disappear, because the work speaks for those who are unable to speak for themselves. And that might be the most powerful message of all.

Checks and balances

with one comment

About a third of the way through my upcoming book, while discussing the May 1941 issue of Astounding Science Fiction, I include the sentence: “The issue also featured Heinlein’s ‘Universe,’ which was based on Campbell’s premise about a lost generation starship.” My copy editor amended this to “a lost-generation starship,” to which I replied: “This isn’t a ‘lost-generation’ starship, but a generation starship that happens to be lost.” And the exchange gave me a pretty good idea for a story that I’ll probably never write. (I don’t really have a plot for it yet, but it would be about Hemingway and Fitzgerald on a trip to Alpha Centauri, and it would be called The Double Sun Also Rises.) But it also reminded me of one of the benefits of a copy edit, which is its unparalleled combination of intense scrutiny and total detachment. I sent drafts of the manuscript to some of the world’s greatest nitpickers, who saved me from horrendous mistakes, and the result wouldn’t be nearly as good without their advice. But there’s also something to be said for engaging the services of a diligent reader who doesn’t have any connection to the subject. I deliberately sought out feedback from a few people who weren’t science fiction fans, just to make sure that it remained accessible to a wider audience. And the ultimate example is the copy editor, who is retained to provide an impartial consideration of every semicolon without any preconceived notions outside the text. It’s what Heinlein might have had in mind when he invented the Fair Witness, who said when asked about the color of a nearby house: “It’s white on this side.”

But copy editors are human beings, not machines, and they occasionally get their moment in the spotlight. Recently, their primary platform has been The New Yorker, which has been quietly highlighting the work of its copy editors and fact checkers over the last few years. We can trace this tendency back to Between You & Me, a memoir by Mary Norris that drew overdue attention to the craft of copy editing. In “Holy Writ,” a delightful excerpt in the magazine, Norris writes of the supposed objectivity and rigor of her profession: “The popular image of the copy editor is of someone who favors rigid consistency. I don’t usually think of myself that way. But, when pressed, I do find I have strong views about commas.” And she says of their famous detachment:

There is a fancy word for “going beyond your province”: “ultracrepidate.” So much of copy editing is about not going beyond your province. Anti-ultracrepidationism. Writers might think we’re applying rules and sticking it to their prose in order to make it fit some standard, but just as often we’re backing off, making exceptions, or at least trying to find a balance between doing too much and doing too little. A lot of the decisions you have to make as a copy editor are subjective. For instance, an issue that comes up all the time, whether to use “that” or “which,” depends on what the writer means. It’s interpretive, not mechanical—though the answer often boils down to an implicit understanding of commas.

In order to be truly objective, in other words, you have to be a little subjective. Which is equally true of writing as a whole.

You could say much the same of the fact checker, who resembles the copy editor’s equally obsessive cousin. As a rule, books aren’t fact-checked, which is a point that we only seem to remember when the system breaks down. (Astounding was given a legal read, but I was mostly on my own when it came to everything else, and I’m grateful that some of the most potentially contentious material—about L. Ron Hubbard’s writing career—drew on an earlier article that was brilliantly checked by Matthew Giles of Longreads.) As John McPhee recently wrote of the profession:

Any error is everlasting. As Sara [Lippincott] told the journalism students, once an error gets into print it “will live on and on in libraries carefully catalogued, scrupulously indexed…silicon-chipped, deceiving researcher after researcher down through the ages, all of whom will make new errors on the strength of the original errors, and so on and on into an exponential explosion of errata.” With drawn sword, the fact-checker stands at the near end of this bridge. It is, in part, why the job exists and why, in Sara’s words, a publication will believe in “turning a pack of professional skeptics loose on its own galley proofs.”

McPhee continues: “Book publishers prefer to regard fact-checking as the responsibility of authors, which, contractually, comes down to a simple matter of who doesn’t pay for what. If material that has appeared in a fact-checked magazine reappears in a book, the author is not the only beneficiary of the checker’s work. The book publisher has won a free ticket to factual respectability.” And its absence from the publishing process feels like an odd evolutionary vestige of the book industry that ought to be fixed.

As a result of such tributes, the copy editors and fact checkers of The New Yorker have become cultural icons in themselves, and when an error does make it through, it can be mildly shocking. (Last month, the original version of a review by Adam Gopnik casually stated that Andrew Lloyd Webber was the composer of Chess, and although I knew perfectly well that this was wrong, I had to look it up to make sure that I hadn’t strayed over into a parallel universe.) And their emergence at this particular moment may not be an accident. The first installment of “Holy Writ” appeared on February 23, 2015, just a few months before Donald Trump announced that he was running for president, plunging us all into a world in which good grammar and factual accuracy can seem less like matters of common decency than obstacles to be obliterated. Even though the timing was a coincidence, it’s tempting to read our growing appreciation for these unsung heroes as a statement about the importance of the truth itself. As Alyssa Rosenberg writes in the Washington Post:

It’s not surprising that one of the persistent jokes from the Trump era is the suggestion that we’re living in a bad piece of fiction…Pretending we’re all minor characters in a work of fiction can be a way of distancing ourselves from the seeming horror of our time or emphasizing our own feelings of powerlessness, and pointing to “the writers” often helps us deny any responsibility we may have for Trump, whether as voters or as journalists who covered the election. But whatever else we’re doing when we joke about Trump and the swirl of chaos around him as fiction, we’re expressing a wish that this moment will resolve in a narratively and morally comprehensible fashion.

Perhaps we’re also hoping that reality itself will have a fact checker after all, and that the result will make a difference. We don’t know if it will yet. But I’m hopeful that we’ll survive the exponential explosion of errata.

Astounding Stories #21: Black Man’s Burden

with 3 comments

Note: With less than half a year to go until the publication of Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction, I’m returning, after a long hiatus, to the series in which I highlight works of science fiction that deserve to be rediscovered, reappraised, or simply enjoyed by a wider audience. You can read the earlier installments here.

“This never gets old,” T’Challa says in Black Panther, just before we see the nation of Wakanda in its full glory for the first time. It’s perhaps the most moving moment in this often overwhelmingly emotional film, and it speaks to how much of its power hinges on the idea of Wakanda itself. Most fictional countries in the movies—a disproportionate number of which seem to be located in Africa, South America, or the Middle East—are narrative evasions, but not here. As Ishaan Tharoor wrote recently in the Washington Post:

Wakanda, like many places in Africa, is home to a great wealth of natural resources. But unlike most places in Africa, it was able to avoid European colonization. Shielded by the powers of vibranium, the element mined beneath its surface that enabled the country to develop the world’s most advanced technology, Wakanda resisted invaders while its rulers constructed a beautiful space-age kingdom.

Or as the writer Evan Narcisse observed elsewhere to the Post: “Wakanda represents this unbroken chain of achievement of black excellence that never got interrupted by colonialism.” It’s imaginary, yes, but that’s part of the point. In his review, Anthony Lane of The New Yorker delivered a gentle rebuke: “I wonder what weight of political responsibility can, or should, be laid upon anything that is accompanied by buttered popcorn. Vibranium is no more real than the philosopher’s stone…Are 3-D spectacles any more reliable than rose-tinted ones, when we seek to imagine an ideal society?” But the gap between dreams and reality is precisely how the best science fiction—and Black Panther, along with so much else, is a kickass science fiction movie—compels us to see the world with new eyes.

The fiction published by the editor John W. Campbell rarely tackled issues of race directly, and the closest that it ever came was probably a series that began with Black Man’s Burden, the first installment of which ran in the December 1961 issue of Analog. It revolves around a coalition of African-American academics working undercover to effect social and political change in North Africa, with the ultimate goal of uniting the region in the scientific and cultural values of the West. The protagonist is a sociologist named Homer Crawford, who explains:

The distrust of the European and the white man as a whole was prevalent, especially here in Africa. However, and particularly in Africa, the citizens of the new countries were almost unbelievably uneducated, untrained, incapable of engineering their own destiny…We of the Reunited Nations teams are here because we are Africans racially but not nationally, we have no affiliations with clan, tribe, or African nation. We are free to work for Africa’s progress without prejudice. Our job is to remove obstacles wherever we find them. To break up log jams. To eliminate prejudices against the steps that must be taken if Africa is to run down the path of progress, rather than to crawl.

All of this is explained to the reader at great length. There’s some effective action, but much of the story consists of the characters talking, and if these young black intellectuals all end up sounding a lot like John W. Campbell, that shouldn’t be surprising—the author, Mack Reynolds, later said that the story and its sequels “were written at a suggestion of John Campbell’s and whole chunks of them were based on his ideas.” Many sections are taken verbatim from the editor’s letters and editorials, ranging from his musings on judo, mob psychology, and the virtues of the quarterstaff to blanket statements that border on the unforgivable: “You know, with possibly a few exceptions, you can’t enslave a man if he doesn’t want to be a slave…The majority of Jefferson’s slaves wanted to be slaves.”

We’re obviously a long way from Wakanda here—but although Black Man’s Burden might seem easy to hate, oddly enough, it isn’t. Mack Reynolds, who had lived in North Africa, was a talented writer, and the serial as a whole is intelligent, restrained, consistently interesting, and mindful of the problems with its own premise. To encourage the locals to reject tribalism in favor of modern science, medicine, and education, for instance, the team attributes many of its ideas to a fictional savior figure, El Hassan, on the theory that such societies “need a hero,” and by the end, Homer Crawford has reluctantly assumed the role himself. (There are shades not just of T.E. Lawrence but of Paul Atreides, whose story would appear in the magazine just two years later.) But he has few illusions about the nature of his work. As one of his colleagues puts it in the sequel:

Monarchies are of the past, and El Hassan is the voice of the future, something new. We won’t admit he’s just a latter-day tyrant, an opportunist seizing power because it’s there crying to be seized. Actually, El Hassan is in the tradition of Genghis Khan, Tamerlane, or, more recently, Napoleon. But he’s a modern version, and we’re not going to hang the old labels on him.

Crawford mordantly responds: “As a young sociologist, I never expected to wind up a literal tyrant.” And Reynolds doesn’t pretend to offer easy solutions. The sequel, Border, Breed, Nor Birth, closes with a bleak denial of happy endings, while the concluding story, “Black Sheep Astray,” ends with Crawford, overthrown after a long rule as El Hassan, returning to start a new revolution among the younger generation, at the likely cost of his life. The leads are drawn with considerable care—even if Reynolds has a bad habit of saying that they look “surprisingly like” Joe Louis or Lena Horne—and their mere presence in Analog is striking enough that one prominent scholar has used it to question Samuel R. Delany’s claim that Campbell rejected one of his stories because “his readership would be able to relate to a black main character.”

Yet this overlooks the fact that an ambitious, messy, uncategorizable novel like Delany’s Nova is worlds apart from a serial that was commissioned and written to Campbell’s specifications. And its conceptual and literary limitations turn out to be closely related. Black Man’s Burden is constructed with diligence and real craft, but this doesn’t make its basic premise any more tenable. It interrogates many of its assumptions, but it doesn’t really question the notion of a covert operation to shape another country’s politics through propaganda, guerrilla action, and the assimilation of undercover agents into the local population. This isn’t science fiction. It’s what intelligence agencies on both sides were doing throughout the Cold War. (If anything, the whisper campaign for El Hassan seems primitive by contemporary standards. These days, the plan would include data analysis, viral messaging in support of favored policies or candidates, and the systematic weaponization of social media on the part of foreign nationals. What would be wrong with that?) By the story’s own logic, the project has to be run by black activists because the locals are suspicious of white outsiders, but there’s no suggestion that their underlying goals are any different—and if the same story would be unthinkable with a white protagonist, it implies that it has problems here that can’t be addressed with a change of race. It’s also characteristically evasive when it comes to how psychohistory actually works. Reading it again, I found myself thinking of what William Easterly writes in The White Man’s Burden:

A Planner thinks he already knows the answers; he thinks of poverty as a technical engineering problem that his answers will solve. A Searcher admits he doesn’t know the answers in advance…A Planner believes outsiders know enough to impose solutions. A Searcher believes only insiders have enough knowledge to find solutions, and that most solutions must be homegrown.

Planners still exist in foreign aid—but they can also edit magazines. Campbell was one of them. Black Man’s Burden was his idea of how to deal with race in Analog, even as he failed to make any effort to look for black writers who knew about the subject firsthand. And it worked about as well here as it did anywhere else.
