Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

From Montgomery to Bilbao

On August 16, 2016, the Equal Justice Initiative, a legal rights organization, unveiled its plans for the National Memorial for Peace and Justice, which would be constructed in Montgomery, Alabama. Today, less than two years later, it opens to the public, and the timing could hardly seem more appropriate, in ways that even those who conceived of it might never have imagined. As Campbell Robertson writes for the New York Times:

At the center is a grim cloister, a walkway with eight hundred weathered steel columns, all hanging from a roof. Etched on each column is the name of an American county and the people who were lynched there, most listed by name, many simply as “unknown.” The columns meet you first at eye level, like the headstones that lynching victims were rarely given. But as you walk, the floor steadily descends; by the end, the columns are all dangling above, leaving you in the position of the callous spectators in old photographs of public lynchings.

And the design represents a breakthrough in more ways than one. As the critic Philip Kennicott points out in the Washington Post: “Even more remarkable, this memorial…was built on a budget of only $15 million, in an age when major national memorials tend to cost $100 million and up.”

Of course, if the memorial had been more costly, it might not exist at all, and certainly not with the level of independence and the clear point of view that it expresses. Yet if there’s one striking thing about the coverage of the project, it’s the absence of the name of any one architect or designer. Neither of these two words even appears in the Times article, and in the Post, we only read that the memorial was “designed by [Equal Justice Initiative founder Bryan] Stevenson and his colleagues at EJI in collaboration with the Boston-based MASS Design Group.” When you go to the latter’s official website, twelve people are credited as members of the project design team. This is markedly different from the way in which we tend to talk about monuments, museums, and other architectural works that are meant to invite our attention. In many cases, the architect’s identity is a selling point in itself, as it invariably is with Frank Gehry, whose involvement in a project like the Guggenheim Museum Bilbao is consciously intended to rejuvenate an entire city. In Montgomery, by contrast, the designer is essentially anonymous, or part of a collaboration, which seems like an aesthetic choice as conscious as the design of the space itself. The individual personality of the architect departs, leaving the names and events to testify on their own behalf. Which is exactly as it should be.

And it’s hard not to compare this to the response to the design of the Vietnam Veterans Memorial in 1981. The otherwise excellent documentary by Ken Burns and Lynn Novick alludes to the firestorm that it caused, but it declines to explore how much of the opposition was personal in nature. As James Reston, Jr. writes in the definitive study A Rift in the Earth:

After Maya Lin’s design was chosen and announced, the public reaction was intense. Letters from outraged veterans poured into the Memorial Fund office. One claimed that Lin’s design had “the warmth and charm of an Abyssinian dagger.” “Nihilistic aesthetes” had chosen it…Predictably, the names of incendiary antiwar icons, Jane Fonda and Abbie Hoffman, were invoked as cheering for a design that made a mockery of the Vietnam dead…As for the winner with Chinese ancestry, [donor H. Ross] Perot began referring to her as “egg roll.”

If anything, the subject matter of the National Memorial for Peace and Justice is even more fraught, and the decision to place the designers in the background seems partially intended to focus the conversation on the museum itself, and not on those who made it.

Yet there’s a deeper lesson here about architecture and its creators. At first, you might think that a building with a singular message would need to arise from—or be identified with—an equally strong personality, but if anything, the trend in recent years has gone the other way. As Reinier de Graaf notes in Four Walls and a Roof, one of the more curious developments over the last few decades is the way in which celebrity architects, like Frank Gehry, have given up much of their own autonomy for the sake of unusual forms that no human hand or brain could properly design:

In partially delegating the production of form to the computer, the antibox has seemingly boosted the production of extravagant shapes beyond any apparent limits. What started as a deliberate meditation on the notion of form in the early antiboxes has turned into a game of chance. Authorship has become relative: with creation now delegated to algorithms, the antibox’s main delight is the surprise it causes to the designers.

Its opposite number is the National Memorial for Peace and Justice, which was built with simple materials and techniques that rely for their impact entirely on the insight, empathy, and ingenuity of the designer, who then quietly fades away. The architect can afford to disappear, because the work speaks for those who are unable to speak for themselves. And that might be the most powerful message of all.

Checks and balances

About a third of the way through my upcoming book, while discussing the May 1941 issue of Astounding Science Fiction, I include the sentence: “The issue also featured Heinlein’s ‘Universe,’ which was based on Campbell’s premise about a lost generation starship.” My copy editor amended this to “a lost-generation starship,” to which I replied: “This isn’t a ‘lost-generation’ starship, but a generation starship that happens to be lost.” And the exchange gave me a pretty good idea for a story that I’ll probably never write. (I don’t really have a plot for it yet, but it would be about Hemingway and Fitzgerald on a trip to Alpha Centauri, and it would be called The Double Sun Also Rises.) But it also reminded me of one of the benefits of a copy edit, which is its unparalleled combination of intense scrutiny and total detachment. I sent drafts of the manuscript to some of the world’s greatest nitpickers, who saved me from horrendous mistakes, and the result wouldn’t be nearly as good without their advice. But there’s also something to be said for engaging the services of a diligent reader who doesn’t have any connection to the subject. I deliberately sought out feedback from a few people who weren’t science fiction fans, just to make sure that it remained accessible to a wider audience. And the ultimate example is the copy editor, who is retained to provide an impartial consideration of every semicolon without any preconceived notions outside the text. It’s what Heinlein might have had in mind when he invented the Fair Witness, who said when asked about the color of a nearby house: “It’s white on this side.”

But copy editors are human beings, not machines, and they occasionally get their moment in the spotlight. Recently, their primary platform has been The New Yorker, which has been quietly highlighting the work of its copy editors and fact checkers over the last few years. We can trace this tendency back to Between You & Me, a memoir by Mary Norris that drew overdue attention to the craft of copy editing. In “Holy Writ,” a delightful excerpt in the magazine, Norris writes of the supposed objectivity and rigor of her profession: “The popular image of the copy editor is of someone who favors rigid consistency. I don’t usually think of myself that way. But, when pressed, I do find I have strong views about commas.” And she says of their famous detachment:

There is a fancy word for “going beyond your province”: “ultracrepidate.” So much of copy editing is about not going beyond your province. Anti-ultracrepidationism. Writers might think we’re applying rules and sticking it to their prose in order to make it fit some standard, but just as often we’re backing off, making exceptions, or at least trying to find a balance between doing too much and doing too little. A lot of the decisions you have to make as a copy editor are subjective. For instance, an issue that comes up all the time, whether to use “that” or “which,” depends on what the writer means. It’s interpretive, not mechanical—though the answer often boils down to an implicit understanding of commas.

In order to be truly objective, in other words, you have to be a little subjective. Which is equally true of writing as a whole.

You could say much the same of the fact checker, who resembles the copy editor’s equally obsessive cousin. As a rule, books aren’t fact-checked, which is a point that we only seem to remember when the system breaks down. (Astounding was given a legal read, but I was mostly on my own when it came to everything else, and I’m grateful that some of the most potentially contentious material—about L. Ron Hubbard’s writing career—drew on an earlier article that was brilliantly checked by Matthew Giles of Longreads.) As John McPhee recently wrote of the profession:

Any error is everlasting. As Sara [Lippincott] told the journalism students, once an error gets into print it “will live on and on in libraries carefully catalogued, scrupulously indexed…silicon-chipped, deceiving researcher after researcher down through the ages, all of whom will make new errors on the strength of the original errors, and so on and on into an exponential explosion of errata.” With drawn sword, the fact-checker stands at the near end of this bridge. It is, in part, why the job exists and why, in Sara’s words, a publication will believe in “turning a pack of professional skeptics loose on its own galley proofs.”

McPhee continues: “Book publishers prefer to regard fact-checking as the responsibility of authors, which, contractually, comes down to a simple matter of who doesn’t pay for what. If material that has appeared in a fact-checked magazine reappears in a book, the author is not the only beneficiary of the checker’s work. The book publisher has won a free ticket to factual respectability.” And its absence from the publishing process feels like an odd evolutionary vestige of the book industry that ought to be fixed.

As a result of such tributes, the copy editors and fact checkers of The New Yorker have become cultural icons in themselves, and when an error does make it through, it can be mildly shocking. (Last month, the original version of a review by Adam Gopnik casually stated that Andrew Lloyd Webber was the composer of Chess, and although I knew perfectly well that this was wrong, I had to look it up to make sure that I hadn’t strayed over into a parallel universe.) And their emergence at this particular moment may not be an accident. The first installment of “Holy Writ” appeared on February 23, 2015, just a few months before Donald Trump announced that he was running for president, plunging us all into a world in which good grammar and factual accuracy can seem less like matters of common decency than obstacles to be obliterated. Even though the timing was a coincidence, it’s tempting to read our growing appreciation for these unsung heroes as a statement about the importance of the truth itself. As Alyssa Rosenberg writes in the Washington Post:

It’s not surprising that one of the persistent jokes from the Trump era is the suggestion that we’re living in a bad piece of fiction…Pretending we’re all minor characters in a work of fiction can be a way of distancing ourselves from the seeming horror of our time or emphasizing our own feelings of powerlessness, and pointing to “the writers” often helps us deny any responsibility we may have for Trump, whether as voters or as journalists who covered the election. But whatever else we’re doing when we joke about Trump and the swirl of chaos around him as fiction, we’re expressing a wish that this moment will resolve in a narratively and morally comprehensible fashion.

Perhaps we’re also hoping that reality itself will have a fact checker after all, and that the result will make a difference. We don’t know if it will yet. But I’m hopeful that we’ll survive the exponential explosion of errata.

Astounding Stories #21: Black Man’s Burden

Note: With less than half a year to go until the publication of Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction, I’m returning, after a long hiatus, to the series in which I highlight works of science fiction that deserve to be rediscovered, reappraised, or simply enjoyed by a wider audience. You can read the earlier installments here.

“This never gets old,” T’Challa says in Black Panther, just before we see the nation of Wakanda in its full glory for the first time. It’s perhaps the most moving moment in this often overwhelmingly emotional film, and it speaks to how much of its power hinges on the idea of Wakanda itself. Most fictional countries in the movies—a disproportionate number of which seem to be located in Africa, South America, or the Middle East—are narrative evasions, but not here. As Ishaan Tharoor wrote recently in the Washington Post:

Wakanda, like many places in Africa, is home to a great wealth of natural resources. But unlike most places in Africa, it was able to avoid European colonization. Shielded by the powers of vibranium, the element mined beneath its surface that enabled the country to develop the world’s most advanced technology, Wakanda resisted invaders while its rulers constructed a beautiful space-age kingdom.

Or as the writer Evan Narcisse observed elsewhere to the Post: “Wakanda represents this unbroken chain of achievement of black excellence that never got interrupted by colonialism.” It’s imaginary, yes, but that’s part of the point. In his review, Anthony Lane of The New Yorker delivered a gentle rebuke: “I wonder what weight of political responsibility can, or should, be laid upon anything that is accompanied by buttered popcorn. Vibranium is no more real than the philosopher’s stone…Are 3-D spectacles any more reliable than rose-tinted ones, when we seek to imagine an ideal society?” But the gap between dreams and reality is precisely how the best science fiction—and Black Panther, along with so much else, is a kickass science fiction movie—compels us to see the world with new eyes.

The fiction published by the editor John W. Campbell rarely tackled issues of race directly, and the closest that it ever came was probably a series that began with Black Man’s Burden, the first installment of which ran in the December 1961 issue of Analog. It revolves around a coalition of African-American academics working undercover to effect social and political change in North Africa, with the ultimate goal of uniting the region in the scientific and cultural values of the West. The protagonist is a sociologist named Homer Crawford, who explains:

The distrust of the European and the white man as a whole was prevalent, especially here in Africa. However, and particularly in Africa, the citizens of the new countries were almost unbelievably uneducated, untrained, incapable of engineering their own destiny…We of the Reunited Nations teams are here because we are Africans racially but not nationally, we have no affiliations with clan, tribe, or African nation. We are free to work for Africa’s progress without prejudice. Our job is to remove obstacles wherever we find them. To break up log jams. To eliminate prejudices against the steps that must be taken if Africa is to run down the path of progress, rather than to crawl.

All of this is explained to the reader at great length. There’s some effective action, but much of the story consists of the characters talking, and if these young black intellectuals all end up sounding a lot like John W. Campbell, that shouldn’t be surprising—the author, Mack Reynolds, later said that the story and its sequels “were written at a suggestion of John Campbell’s and whole chunks of them were based on his ideas.” Many sections are taken verbatim from the editor’s letters and editorials, ranging from his musings on judo, mob psychology, and the virtues of the quarterstaff to blanket statements that border on the unforgivable: “You know, with possibly a few exceptions, you can’t enslave a man if he doesn’t want to be a slave…The majority of Jefferson’s slaves wanted to be slaves.”

We’re obviously a long way from Wakanda here—but although Black Man’s Burden might seem easy to hate, oddly enough, it isn’t. Mack Reynolds, who had lived in North Africa, was a talented writer, and the serial as a whole is intelligent, restrained, consistently interesting, and mindful of the problems with its own premise. To encourage the locals to reject tribalism in favor of modern science, medicine, and education, for instance, the team attributes many of its ideas to a fictional savior figure, El Hassan, on the theory that such societies “need a hero,” and by the end, Homer Crawford has reluctantly assumed the role himself. (There are shades not just of T.E. Lawrence but of Paul Atreides, whose story would appear in the magazine just two years later.) But he has few illusions about the nature of his work. As one of his colleagues puts it in the sequel:

Monarchies are of the past, and El Hassan is the voice of the future, something new. We won’t admit he’s just a latter-day tyrant, an opportunist seizing power because it’s there crying to be seized. Actually, El Hassan is in the tradition of Genghis Khan, Tamerlane, or, more recently, Napoleon. But he’s a modern version, and we’re not going to hang the old labels on him.

Crawford mordantly responds: “As a young sociologist, I never expected to wind up a literal tyrant.” And Reynolds doesn’t pretend to offer easy solutions. The sequel, Border, Breed, Nor Birth, closes with a bleak denial of happy endings, while the concluding story, “Black Sheep Astray,” ends with Crawford, overthrown after a long rule as El Hassan, returning to start a new revolution among the younger generation, at the likely cost of his life. The leads are drawn with considerable care—even if Reynolds has a bad habit of saying that they look “surprisingly like” Joe Louis or Lena Horne—and their mere presence in Analog is striking enough that one prominent scholar has used it to question Samuel R. Delany’s claim that Campbell rejected one of his stories because he didn’t feel that “his readership would be able to relate to a black main character.”

Yet this overlooks the fact that an ambitious, messy, uncategorizable novel like Delany’s Nova is worlds apart from a serial that was commissioned and written to Campbell’s specifications. And its conceptual and literary limitations turn out to be closely related. Black Man’s Burden is constructed with diligence and real craft, but this doesn’t make its basic premise any more tenable. It interrogates many of its assumptions, but it doesn’t really question the notion of a covert operation to shape another country’s politics through propaganda, guerrilla action, and the assimilation of undercover agents into the local population. This isn’t science fiction. It’s what intelligence agencies on both sides were doing throughout the Cold War. (If anything, the whisper campaign for El Hassan seems primitive by contemporary standards. These days, the plan would include data analysis, viral messaging in support of favored policies or candidates, and the systematic weaponization of social media on the part of foreign nationals. What would be wrong with that?) By the story’s own logic, the project has to be run by black activists because the locals are suspicious of white outsiders, but there’s no suggestion that their underlying goals are any different—and if the same story would be unthinkable with a white protagonist, it implies that it has problems here that can’t be addressed with a change of race. It’s also characteristically evasive when it comes to how psychohistory actually works. Reading it again, I found myself thinking of what William Easterly writes in The White Man’s Burden:

A Planner thinks he already knows the answers; he thinks of poverty as a technical engineering problem that his answers will solve. A Searcher admits he doesn’t know the answers in advance…A Planner believes outsiders know enough to impose solutions. A Searcher believes only insiders have enough knowledge to find solutions, and that most solutions must be homegrown.

Planners still exist in foreign aid—but they can also edit magazines. Campbell was one of them. Black Man’s Burden was his idea of how to deal with race in Analog, even as he failed to make any effort to look for black writers who knew about the subject firsthand. And it worked about as well here as it did anywhere else.

The war of ideas

Over the last few days, I’ve been thinking a lot about a pair of tweets. One is from Susan Hennessey, an editor for the national security blog Lawfare, who wrote: “Much of my education has been about grasping nuance, shades of gray. Resisting the urge to oversimplify the complexity of human motivation. This year has taught me that, actually, a lot of what really matters comes down to good people and bad people. And these are bad people.” This is a remarkable statement, and in some ways a heartbreaking one, but I can’t disagree with it, and it reflects a growing trend among journalists and other commentators to simply call what we’re seeing by its name. In response to the lies about the students of Marjory Stoneman Douglas High School—including the accusation that some of them are actors—Margaret Sullivan of the Washington Post wrote:

When people act like cretins, should they be ignored? Does talking about their misdeeds merely give them oxygen? Maybe so. But the sliming—there is no other word for it—of the survivors of last week’s Florida high school massacre is beyond the pale…Legitimate disagreement over policy issues is one thing. Lies, conspiracy theories and insults are quite another.

And Paul Krugman went even further: “America in 2018 is not a place where we can disagree without being disagreeable, where there are good people and good ideas on both sides, or whatever other bipartisan homily you want to recite. We are, instead, living in a kakistocracy, a nation ruled by the worst, and we need to face up to that unpleasant reality.”

The other tweet that has been weighing on my mind was from Rob Goldman, a vice president of advertising for Facebook. It was just one of a series of thoughts—which is an important detail in itself—that he tweeted out on the day that Robert Mueller indicted thirteen Russian nationals for their roles in interfering in the presidential election. After proclaiming that he was “very excited” to see the indictments, Goldman said that he wanted to clear up a few points. He had seen “all of the Russian ads” that appeared on Facebook, and he stated: “I can say very definitively that swaying the election was not the main goal.” But his most memorable words, at least for me, were: “The majority of the Russian ad spend happened after the election. We shared that fact, but very few outlets have covered it because it doesn’t align with the main media narrative of Tump [sic] and the election.” This is an astounding statement, in part because it seems to defend Facebook by saying that it kept running these ads for longer than most people assume. But it’s also inexplicable. It may well be, as some observers have contended, that Goldman had a “nuanced” point to make, but he chose to express it on a forum that is uniquely vulnerable to being taken out of context, and to unthinkingly use language that was liable to be misinterpreted. As Josh Marshall wrote:

[Goldman] even apes what amounts to quasi-Trumpian rhetoric in saying the media distorts the story because the facts “don’t align with the main media narrative of Trump and the election.” This is silly. Elections are a big deal. It’s hardly surprising that people would focus on the election, even though it’s continued since. What is this about exactly? Is Goldman some kind of hardcore Trumper?

I don’t think he is. But it also doesn’t matter, at least not when his thoughts were retweeted approvingly by the president himself.

This all leads me to a point that the events of the last week have only clarified. We’re living in a world in which the lines between right and wrong seem more starkly drawn than ever, with anger and distrust rising to an unbearable degree on both sides. From where I stand, it’s very hard for me to see how we recover from this. When you can accurately say that the United States has become a kakistocracy, you can’t just go back to the way things used to be. Whatever the outcome of the next election, the political landscape has been altered in ways that would have been unthinkable even two years ago, and I can’t see it changing during my lifetime. But even though the stakes seem clear, the answer isn’t less nuance, but more. If there’s one big takeaway from the last eighteen months, it’s that the line between seemingly moderate Republicans and Donald Trump was so evanescent that it took only the gentlest of breaths to blow it away. It suggests that we were closer to the precipice than we ever suspected, and unpacking that situation—and its implications for the future—requires more nuance than most forms of social media can provide. Rob Goldman, who should have known better, didn’t grasp this. And while I hope that the students at Marjory Stoneman Douglas do better, I also worry about how effective they can really be. Charlie Warzel of Buzzfeed recently argued that the pro-Trump media has met its match in the Parkland students: “It chose a political enemy effectively born onto the internet and innately capable of waging an information war.” I want to believe this. But it may also be that these aren’t the weapons that we need. The information war is real, but the only way to win it may be to move it into another battlefield entirely.

Which brings us, in a curious way, back to Robert Mueller, who seems to have assumed the same role for many progressives that Nate Silver once occupied—the one man who was somehow going to tell us that everything was going to be fine. But their differences are also telling. Silver generated reams of commentary, but his reputation ultimately came down to his ability to provide a single number, updated in real time, that would indicate how worried we had to be. That trust is clearly gone, and his fall from grace is less about his own mistakes than it is an overdue reckoning for the promises of data journalism in general. Mueller, by contrast, does everything in private, avoids the spotlight, and emerges every few months with a mountain of new material that we didn’t even know existed. It’s nuanced, qualitative, and not easy to summarize. As the coverage endlessly reminds us, we don’t know what else the investigation will find, but that’s part of the point. At a time in which controversies seem to erupt overnight, dominate the conversation for a day, and then yield to the next morning’s outrage, Mueller embodies the almost anachronistic notion that the way to make something stick is to work on it diligently, far from the public eye, and release each piece only when you’re ready. (In the words of a proverbial saying attributed to everyone from Buckminster Fuller to Michael Schrage: “Never show fools unfinished work.” And we’re all fools these days.) I picture him fondly as the head of a monastery in the Dark Ages, laboriously preserving information for the future, or even as the shadowy overseer of Asimov’s Foundation. Mueller’s low profile allows him to mean whatever we want him to, of course, and for all I know, he may not be the embodiment of all the virtues that Ralph Waldo Emerson identified as punctuality, personal attention, courage, and thoroughness. I just know that he’s the only one left who might be. Mueller can’t save us by himself. But his example might just show us the way.

The Hedgehog, the Fox, and the Fatted Ram, Part 1

Over the long weekend, both the New York Times and the Washington Post published lead articles on the diminishing public profile of Jared Kushner. The timing may have been a coincidence, but the pieces had striking similarities. Both made the argument that Kushner’s portfolio, once so vast, has been dramatically reduced by the arrival on the scene of White House chief of staff John F. Kelly; both ran under a headline that included some version of the word “shrinking”; and both led off with memorable quotes from their subject. In the Times, it was Kushner’s response when asked by Reince Priebus what his Office of American Innovation would really do: “What do you care?” (The newspaper of record, proper as ever, added: “He emphasized his point with an expletive.”) Meanwhile, the Post, which actually scored an interview, came away with something even stranger. Here’s what Kushner said of himself:

During the campaign, I was more like a fox than a hedgehog. I was more of a generalist having to learn about and master a lot of skills quickly. When I got to D.C., I came with an understanding that the problems here are so complex—and if they were easy problems, they would have been fixed before—and so I became more like the hedgehog, where it was more taking issues you care deeply about, going deep and devoting the time, energy and resources to trying to drive change.

The Post merely noted that this is Kushner’s “version of the fable of the fox, who knows many things, and the hedgehog, who knows one important thing,” but as the Washington Examiner pointed out, the real source is Isaiah Berlin’s classic book The Hedgehog and the Fox, which draws its famous contrast between foxes and hedgehogs as a prelude to a consideration of Leo Tolstoy’s theory of history.

Berlin’s book, which is one of my favorites, is so unlike what I’d expect Jared Kushner to be reading that I can’t resist trying to figure out what this reference to it means. If I were conspiratorially minded, I’d observe that if Kushner had wanted to put together a reading list to quickly bring himself up to speed on the history and culture of Russia—I can’t imagine why—then The Hedgehog and the Fox, which can be absorbed in a couple of hours, would be near the top. But the truth, unfortunately, is probably more prosaic. If there’s a single book from the last decade that Kushner, who was briefly touted as the prodigy behind Trump’s data operation, can be assumed to have read, or at least skimmed, it’s Nate Silver’s The Signal and the Noise. And Silver talks at length about the supposed contrast between foxes and hedgehogs, courtesy of a professor of psychology and political science named Philip E. Tetlock, who conducted a study of predictions by experts in various fields:

Tetlock was able to classify his experts along a spectrum between what he called hedgehogs and foxes. The reference to hedgehogs and foxes comes from the title of an Isaiah Berlin essay on the Russian novelist Leo Tolstoy—The Hedgehog and the Fox…Foxes, Tetlock found, are considerably better at forecasting than hedgehogs. They had come closer to the mark on the Soviet Union, for instance. Rather than seeing the USSR in highly ideological terms—as an intrinsically “evil empire,” or as a relatively successful (and perhaps even admirable) example of a Marxist economic system—they instead saw it for what it was: an increasingly dysfunctional nation that was in danger of coming apart at the seams. Whereas the hedgehogs’ forecasts were barely any better than random chance, the foxes’ demonstrated predictive skill.

As intriguing as we might find this reference to Russia, which Kushner presumably read, it also means that in all likelihood, he never even opened Berlin’s book. (Silver annoyingly writes: “Unless you are a fan of Tolstoy—or of flowery prose—you’ll have no particular reason to read Berlin’s essay.”) But it doesn’t really matter where he encountered these classifications. As much as I love the whole notion of the hedgehog and the fox, it has one big problem—as soon as you read it, you’re immediately tempted to apply it to yourself, as Kushner does, when in fact its explanatory power applies only to geniuses. Like John Keats’s celebrated concept of negative capability, which is often used to excuse sloppy, inconsistent thinking, Berlin’s essay encourages us to think of ourselves as foxes or hedgehogs, when we’re really just dilettantes or suffering from tunnel vision. And this categorization has its limits even when applied to unquestionably exceptional personalities. Here’s how Berlin lays it out on the very first page of his book:

There exists a great chasm between those, on one side, who relate everything to a single central vision, one system less or more coherent or articulate, in terms of which they understand, think and feel—a single, universal, organizing principle in terms of which alone all that they are and say has significance—and, on the other side, those who pursue many ends, often unrelated and even contradictory, connected, if at all, only in some de facto way, for some psychological or physiological cause, related by no moral or aesthetic principle; these last lead lives, perform acts, and entertain ideas that are centrifugal rather than centripetal, their thought is scattered or diffused, moving on many levels…without, consciously or unconsciously, seeking to fit [experiences and objects] into, or exclude them from, any one unchanging, all-embracing, sometimes self-contradictory and incomplete, at times fanatical, unitary inner vision.

The contrast that Berlin draws here could hardly seem more stark, but it falls apart as soon as we apply it to, say, Kushner’s father-in-law. On the one hand, Trump has succeeded beyond his wildest dreams by harping monotonously on a handful of reliable themes, notably white nationalism, xenophobia, and resentment of liberal elites. Nothing could seem more like the hedgehog. On the other hand, from one tweet to the next, he’s nothing if not “centrifugal rather than centripetal,” driven by his impulses, embracing contradictory positions, undermining his own surrogates, and resisting all attempts to pin him down to a conventional ideology. It’s all very foxlike. The most generous reading would be to argue that Trump, as Berlin contends of Tolstoy, is “by nature a fox, but [believes] in being a hedgehog,” a comparison that seems ridiculous even as I type it. It’s far more plausible that Trump lacks the intellectual rigor, or even the basic desire, to assemble anything like a coherent politics out of his instinctive drives for power and revenge. Like most of us, he’s a mediocre thinker, and his confusions, which reflect those of his base, have gone a long way toward enabling his rise. Trump bears much the same relationship to his fans that Emerson saw in the man who obsessed Tolstoy so deeply:

Among the eminent persons of the nineteenth century, Bonaparte is far the best known and the most powerful; and owes his predominance to the fidelity with which he expresses the tone of thought and belief, the aims of the masses…If Napoleon is France, if Napoleon is Europe, it is because the people whom he sways are little Napoleons.

Faced with a Trump, little or big, Berlin’s categories lose all meaning—not out of any conceptual weakness, but because it wasn’t what they were designed to do. But that doesn’t mean that Berlin doesn’t deserve our attention. In fact, The Hedgehog and the Fox has more to say about our current predicament than any other book I know, and if Kushner ever bothered to read it, it might give him reason to worry. I’ll have more to say about this tomorrow.

Bringing up the bodies

For the last few weeks, my wife and I have been slowly working our way through Ken Burns and Lynn Novick’s devastating documentary series Vietnam. The other night, we finished the episode “Resolve,” which includes an extraordinary sequence—you can find it here around the twenty-five minute mark—about the war’s use of questionable metrics. As narrator Peter Coyote intones: “Since there was no front in Vietnam, as there had been in the first and second World Wars, since no ground was ever permanently won or lost, the American military command in Vietnam—MACV—fell back more and more on a single grisly measure of supposed success: counting corpses. Body count.” The historian and retired Army officer James Willbanks observes:

The problem with the war, as it often is, are the metrics. It is a situation where if you can’t count what’s important, you make what you can count important. So, in this particular case, what you could count was dead enemy bodies.

And as the horrifying images of stacked bodies fill the screen, we hear the quiet, reasonable voice of Robert Gard, a retired lieutenant general and former chairman of the board of the Center for Arms Control and Non-Proliferation: “If body count is the measure of success, then there’s the tendency to count every body as an enemy soldier. There’s a tendency to want to pile up dead bodies and perhaps to use less discriminate firepower than you otherwise might in order to achieve the result that you’re charged with trying to obtain.”

These days, we casually use the phrase “body count” to describe violence in movies and video games, and I was startled to realize how recent the term really is—the earliest reported instance is from 1962, and the oldest results that I can find in a Google Books search are from the early seventies. (Its first use as a book’s title, as far as I can determine, is for the memoir of William Calley, the officer convicted of murder for his involvement in the My Lai massacre.) Military metaphors have a way of seeping into everyday use, in part because of their vividness and, perhaps, because we all like to think of ourselves as fighting in one war or another, but after watching Vietnam, I think that “body count” ought to be forcibly restored to its original connotations. It doesn’t take a lot of introspection to see that it was a statistic that was only possible in a war in which the enemy could be easily dehumanized, and that it encouraged a lack of distinction between combatants and civilians. Like most faulty metrics, it created a toxic set of incentives from the highest levels of command to the soldiers on the ground. As the full extent of the war’s miscalculations grew more clear, these facts became hard to ignore, and the term itself came to encapsulate the mistakes and deceptions of the conflict as a whole. Writing in Playboy in 1982, Philip Caputo called it “one of the most hideous, morally corrupting ideas ever conceived by the military mind.” Yet most of its emotional charge has since been lost. Words matter, and as the phrase’s significance is obscured, the metric itself starts to creep back. And the temptation to fall back on it increases in response to a confluence of specific factors, as a country engages in military action in which the goals are unclear and victory is poorly defined.

As a result, it’s no surprise that we’re seeing a return to body count. As far back as 2005, Bradley Graham of the Washington Post reported: “The revival of body counts, a practice discredited during the Vietnam War, has apparently come without formal guidance from the Pentagon’s leadership.” More recently, Reed Richardson wrote on FAIR:

In the past few years, official body count estimates have made a notable comeback, as U.S. military and administration officials have tried to talk up the U.S. coalition’s war against ISIS in Syria and Iraq…For example, last August, the U.S. commander of the Syrian-Iraq war garnered a flurry of favorable coverage of the war when he announced that the coalition had killed 45,000 ISIS militants in the past two years. By December, the official ISIS body count number, according to an anonymous “senior U.S. official,” had risen to 50,000 and led headlines on cable news. Reading through that media coverage, though, one finds little skepticism about the figures or historical context about how these killed in action numbers line up with the official estimates of ISIS’s overall size, which have stayed stubbornly consistent year after year. In fact, the official estimated size of ISIS in 2015 and 2016 averaged 25,000 fighters, which means the U.S. coalition had supposedly wiped out the equivalent of its entire force over both years without making a dent in its overall size.

Richardson sums up: “As our not-too-distant past has clearly shown, enemy body counts are a handy, hard-to-resist tool that administrations of both parties often use for war propaganda to promote the idea we are ‘winning’ and to stave off dissent about why we’re fighting in the first place.”

It’s worth pointing out, as Richardson does, that such language isn’t confined to any one party, and it was equally prevalent during the Obama administration. But we should be even more wary of it now. (Richardson writes: “In February, Gen. Tony Thomas, the commander of US Special Operations Command, told a public symposium that 60,000 ISIS fighters had been killed. Thomas added this disingenuous qualifier to his evidence-free number: ‘I’m not that into morbid body count, but that matters.’”) Trump has spent his entire career inflating his numbers, from his net worth to the size of his inauguration crowds, and because he lacks a clear grasp of policy, he’s more inclined to gauge his success—and the lack thereof by his enemies—in terms that lend themselves to the most mindless ways of keeping score, like television ratings. He’s also fundamentally disposed to claim that everything that he does is the biggest and the best, in the face of all evidence to the contrary. This extends to areas that can’t be easily quantified, like international relations, so that every negotiation becomes a zero-sum game in which, as Joe Nocera put it a few years ago: “In every deal, he has to win and you have to lose.” It encourages Trump and his surrogates to see everything as a war, even if it leads them to inflict just as much damage on themselves, and the incentives that he imposes on those around him, in which no admission of error is possible, drag down even the best of his subordinates. And we’ve seen this pattern before. As the journalist Joe Galloway says in Vietnam: “You don’t get details with a body count. You get numbers. And the numbers are lies, most of ‘em. If body count is your success mark, then you’re pushing otherwise honorable men, warriors, to become liars.”

The art of obfuscation

In the book Obfuscation: A User’s Guide for Privacy and Protest, a long excerpt of which recently appeared on Nautilus, the academics Finn Brunton and Helen Nissenbaum investigate the ways in which short-term sources of distraction can be used to conceal or obscure the truth. One of their most striking examples is drawn from The Art of Political Murder by Francisco Goldman, which recounts the aftermath of the brutal assassination of the Guatemalan bishop Juan José Gerardi Conedera. Brunton and Nissenbaum write:   

As Goldman documented the long and dangerous process of bringing at least a few of those responsible within the Guatemalan military to justice for this murder, he observed that those threatened by the investigation didn’t merely plant evidence to conceal their role. Framing someone else would be an obvious tactic, and the planted evidence would be assumed to be false. Rather, they produced too much conflicting evidence, too many witnesses and testimonials, too many possible stories. The goal was not to construct an airtight lie, but rather to multiply the possible hypotheses so prolifically that observers would despair of ever arriving at the truth. The circumstances of the bishop’s murder produced what Goldman terms an “endlessly exploitable situation,” full of leads that led nowhere and mountains of seized evidence, each factual element calling the others into question. “So much could be made and so much would be made to seem to connect,” Goldman writes, his italics emphasizing the power of the ambiguity.

What interests me the most about this account is how the players took the existing features of this “endlessly exploitable situation,” which was already too complicated for any one person to easily understand, and simply turned up the volume. They didn’t need to create distractions out of nothing—they just had to leverage and intensify what was naturally there. It’s a clever strategy, because it only needs to last long enough to run out the clock until the possibility of any real investigation has diminished. Brunton and Nissenbaum draw a useful analogy to the concept of “chaff” in radar countermeasures:

During World War II, a radar operator tracks an airplane over Hamburg, guiding searchlights and anti-aircraft guns in relation to a phosphor dot whose position is updated with each sweep of the antenna. Abruptly, dots that seem to represent airplanes begin to multiply, quickly swamping the display. The actual plane is in there somewhere, impossible to locate owing to the presence of “false echoes.” The plane has released chaff—strips of black paper backed with aluminum foil and cut to half the target radar’s wavelength. Thrown out by the pound and then floating down through the air, they fill the radar screen with signals. The chaff has exactly met the conditions of data the radar is configured to look for, and has given it more “planes,” scattered all across the sky, than it can handle…That the chaff worked only briefly as it fluttered to the ground and was not a permanent solution wasn’t relevant under the circumstances. It only had to work well enough and long enough for the plane to get past the range of the radar.

The authors conclude: “Many forms of obfuscation work best as time-buying ‘throw-away’ moves. They can get you only a few minutes, but sometimes a few minutes is all the time you need.”

The book Obfuscation appeared almost exactly a year ago, but its argument takes on an additional resonance now, when the level of noise in our politics has risen to a degree that makes the culture wars of the past seem positively quaint. It can largely, but not entirely, be attributed to just one man, and there’s an ongoing debate over whether Trump’s use of the rhetorical equivalent of chaff is instinctive, like a squid squirting ink at its enemies, or a deliberate strategy. I tend to see it as the former, but that doesn’t mean that his impulsiveness can’t produce the same result—and perhaps even more effectively—as a considered program of disinformation and distraction. What really scares me is the prospect of such tricks becoming channeled and institutionalized in the hands of more capable surrogates, as soon as an “endlessly exploitable situation” comes to pass. In The Art of Political Murder, Goldman sums up “the seemingly irresistible logic behind so much of the suspicion, speculation, and tendentiousness” that enveloped the bishop’s death: “Something like this can seem to have a connection to a crime like that.” All you need is an event that produces a flood of data that can be assembled in any number of ways by selectively emphasizing certain connections while deemphasizing others. The great example here is the Kennedy assassination, which generated an unbelievable amount of raw ore for obsessive personalities to sift, like a bin of tesserae that could be put together into any mosaic imaginable. Compiling huge masses of documentation and testimony and placing it before the public is generally something that we only see in a governmental investigation, which has the time and resources to accumulate the information that will inevitably be used to undermine its own conclusions.

At the moment, there’s one obvious scenario in which this precise situation could arise. I’ve often found myself thinking of Robert Mueller in much the way that Quinta Jurecic of the Washington Post characterizes him in an opinion piece starkly titled “Robert Mueller Can’t Save Us”: “In the American imagination, Mueller is more than Trump’s adversary or the man who happens to be investigating him. He’s the president’s mythic opposite—the anti-Trump…Mueller is an avatar of our hope that justice and meaning will reassert themselves against Trumpian insincerity.” And I frequently console myself with the image of the Mueller investigation as a kind of bucket in which every passing outrage, briefly flaring up in the media only to be obscured by its successors, is filed away for later reckoning. As Jurecic points out, these hopes are misplaced:

There’s no way of knowing how long [Mueller’s] investigation will take and what it will turn up. It could be years before the probe is completed. It could be that Mueller’s team finds no evidence of criminal misconduct on the part of the president himself. And because the special counsel has no obligation to report his conclusions to the public—indeed, the special-counsel regulations do not give him the power to do so without the approval of Deputy Attorney General Rod J. Rosenstein—we may never know what he uncovers.

She’s right, but she also misses what I think is the most frightening possibility of all, which is that the Russia investigation will provide the exact combination of factors—“too many witnesses and testimonials, too many possible stories”—to create the situation that Goldman described in Guatemala. It’s hard to imagine a better breeding ground for conspiracy theories, alternate narratives, and false connections, and the likely purveyors are already practicing on a smaller scale. The Mueller investigation is necessary and important. But it will also provide the artists of obfuscation with the materials to paint their masterpiece.
