Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Life on the last mile

In telecommunications, there’s a concept called “the last mile,” which states that the final leg of a network—the one that actually reaches the user’s home, school or office—is the most difficult and expensive to build. It’s one thing to construct a massive trunkline, which is basically a huge but relatively straightforward feat of engineering, and quite another to deal with the tangle of equipment, wiring, and specifications on the level of thousands of individual households. More recently, the concept has been extended to public transportation, delivery and distribution services, and other fields that depend on connecting an industrial operation on the largest imaginable scale with specific situations on the retail side. (For instance, Amazon has been trying to cross the last mile through everything from its acquisition of Whole Foods to drone delivery, and the fact that these are seen as alternative approaches to the same problem points to how complicated it really is.) This isn’t just a matter of infrastructure, either, but of the difficulties inherent to any system in which a single pipeline has to split into many smaller branches, whether it’s carrying blood, water, mail, or data. Ninety percent of the wiring can be in that last mile, and success lies less in any overall principles than in the irritating particulars. It has to be solved on the ground, rather than in a design document, and you’ll never be able to anticipate all of the obstacles that you’ll face once those connections start to multiply. It’s literally about the ramifications.

I often feel the same way when it comes to writing. When I think back on how I’ve grown as a writer over the last decade or so, I see clear signs of progress. Thanks mostly to the guidelines that David Mamet presents in On Directing Film, it’s much easier for me to write a decent first draft than it was when I began. I rarely leave anything unfinished; I know how to outline and how to cut; and I’m unlikely to make any huge technical mistakes. In his book Which Lie Did I Tell?, William Goldman says something similar about screenwriting:

Stephen Sondheim once said this: “I cannot write a bad song. You begin it here, build, end there. The words will lay properly on the music so they can be sung, that kind of thing. You may hate it, but it will be a proper song.” I sometimes feel that way about my screenplays. I’ve been doing them for so long now, and I’ve attempted most genres. I know about entering the story as late as possible, entering each scene as late as possible, that kind of thing. You may hate it, but it will be a proper screenplay.

Craft, in other words, can take you most of the way—but it’s the final leg that kills you. As Goldman concludes of his initial pass on the script for Absolute Power: “This first draft was proper as hell—you just didn’t give a shit.” And sooner or later, most writers find that they spend most of their time on that last mile.

Like most other art forms, creative writing can indeed be taught—but only to the point that it still resembles an engineering problem. There are a few basic tricks of structure and technique that will improve almost anyone’s work, much like the skills that you learn in art books like Drawing on the Right Side of the Brain, and that kind of advancement can be enormously satisfying. When it comes to the last mile between you and your desired result, however, many of the rules start to seem useless. You aren’t dealing with the general principles that have gotten you this far, but with problems that arise on the level of individual words or sentences, each one of which needs to be tackled on its own. There’s no way of knowing whether or not you’ve made the right choice until you’ve looked at them all in a row, and even if something seems wrong, you may not know how to fix it. The comforting shape of the outline, which can be assembled in a reasonably logical fashion, is replaced by the chaos of the text, and the fact that you’ve done good work on this level before is no guarantee that you can do it right now. I’ve learned a lot about writing over the years, but to the extent that I’m not yet the writer that I want to be, it lies almost entirely in that last mile, where the ideal remains tantalizingly out of reach.

As a result, I end up revising endlessly, even at a late stage, and although the draft always gets better, it never reaches perfection. After a while, you have to decide that it’s as good as it’s going to get, and then move on to something else—which is why it helps to have a deadline. But you can take comfort in the fact that the last mile affects even the best of us. In a recent New York Times profile of the playwright Tony Kushner, Charles McGrath writes:

What makes Angels in America so complicated to stage is not just Mr. Kushner’s need to supervise everything, but that Perestroika, the second part, is to a certain extent a work in progress and may always be. The first part, Millennium Approaches, was already up and running in the spring of 1991, when, with a deadline looming, Mr. Kushner retreated to a cabin in Northern California and wrote most of Perestroika in a feverish eight-day stint, hardly sleeping and living on junk food. He has been tinkering with it ever since…Even during rehearsal last month he was still cutting, rewriting, restructuring.

If Tony Kushner is still revising Angels in America, it makes me feel a little better about spending my life on that last mile. Or as John McPhee says about knowing when to stop: “What I know is that I can’t do any better; someone else might do better, but that’s all I can do; so I call it done.”

Instant karma

Last year, my wife and I bought an Instant Pot. (If you’re already dreading the rest of this post, I promise in advance that it won’t be devoted solely to singing its praises.) If you somehow haven’t encountered one before, it’s basically a programmable pressure cooker. It has a bunch of other functions, including slow cooking and making yogurt, but aside from its sauté setting, I haven’t had a chance to use them yet. At first, I suspected that it would be another appliance, like our bread maker, that we would take out of the box once and then never touch again, but somewhat to my surprise, I’ve found myself using it on a regular basis, and not just as a reliable topic for small talk at parties. Its great virtue is that it allows you to prepare certain tasty but otherwise time-consuming recipes—like the butter chicken so famous that it received its own writeup in The New Yorker—with a minimum of fuss. As I write these lines, my Instant Pot has just finished a batch of soft-boiled eggs, which is its most common function in my house these days, and I might use it tomorrow to make chicken adobo. Occasionally, I’ll be mildly annoyed by its minor shortcomings, such as the fact that an egg set for four minutes at low pressure might have a perfect runny yolk one day and verge on hard-boiled the next. It saves time, but when you add in the waiting period to build and then release the pressure, which isn’t factored into most recipes, it can still take an hour or more to make dinner. But it still marks the most significant step forward in my life in the kitchen since Mark Bittman taught me how to use the broiler more than a decade ago.

My wife hasn’t touched it. In fact, she probably wouldn’t mind if I said that she was scared of the Instant Pot—and she isn’t alone in this. A couple of weeks ago, the Wall Street Journal ran a feature by Ellen Byron titled “America’s Instant-Pot Anxiety,” with multiple anecdotes about home cooks who find themselves afraid of their new appliance:

Missing from the enclosed manual and recipe book is how to fix Instant Pot anxiety. Debbie Rochester, an elementary-school teacher in Atlanta, bought an Instant Pot months ago but returned it unopened. “It was too scary, too complicated,” she says. “The front of the thing has so many buttons.” After Ms. Rochester’s friends kept raving about their Instant Pot meals, she bought another one…Days later, Ms. Rochester began her first beef stew. After about ten minutes of cooking, it was time to release the pressure valve, the step she feared most. Ms. Rochester pulled her sweater over her hand, turned her back and twisted the knob without looking. “I was praying that nothing would blow up,” she says.

Elsewhere, the article quotes Sharon Gebauer of San Diego, who just wanted to make beef and barley soup, only to be filled with sudden misgivings: “I filled it up, started it pressure cooking, and then I started to think, what happens when the barley expands? I just said a prayer and stayed the hell away.”

Not surprisingly, the article has inspired derision from Instant Pot enthusiasts, among whom one common response seems to be: “People are dumb. They don’t read instruction manuals.” Yet I can testify firsthand that the Instant Pot can be intimidating. The manual is thick and not especially organized, and it does a poor job of explaining such crucial features as the steam release and float valve. (I had to watch a video to learn how to handle the former, and I didn’t figure out what the latter was until I had been using the pot for weeks.) But I’ve found that you can safely ignore most of it and fall back on a few basic tricks—as soon as you manage to get through at least one meal. Once I successfully prepared my first dish, my confidence increased enormously, and I barely remember how it felt to be nervous around it. And that may be the single most relevant point about the cult that the Instant Pot has inspired, which rivals the most fervent corners of fan culture. As Kevin Roose noted in a recent article in the New York Times:

A new religion has been born…Its deity is the Instant Pot, a line of electric multicookers that has become an internet phenomenon and inspired a legion of passionate foodies and home cooks. These devotees—they call themselves “Potheads”—use their Instant Pots for virtually every kitchen task imaginable: sautéing, pressure-cooking, steaming, even making yogurt and cheesecakes. Then, they evangelize on the internet, using social media to sing the gadget’s praises to the unconverted.

And when you look at the Instant Pot from a certain angle, you realize that it has all of the qualities required to create a specific kind of fan community. There’s an initial learning curve that’s daunting enough to keep out the casuals, but not so steep that it prevents a critical mass of enthusiasts from forming. Once you learn the basics, you forget how intimidating it seemed when you were on the outside. And it has a huge body of associated lore that discourages newbies from diving in, even if it doesn’t matter much in practice. (In the months that I’ve been using the Instant Pot, I’ve never used anything except the manual pressure and sauté functions, and I’ve disregarded the rest of the manual, just as I draw a blank on pretty much every element of the mytharc on The X-Files.) Most of all, perhaps, it takes something that is genuinely good, but imperfect, and elevates it into an object of veneration. There are plenty of examples in pop culture, from Doctor Who to Infinite Jest, and perhaps it isn’t a coincidence that the Instant Pot has a vaguely futuristic feel to it. A science fiction or fantasy franchise can turn off a lot of potential fans because of its history and complicated externals, even if most of them are peripheral to the actual experience. Using the Instant Pot for the first time is probably easier than trying to get into Doctor Who, or so I assume—I’ve steered clear of that franchise for many of the same reasons, reasonable or otherwise. There’s nothing wrong with being part of a group drawn together by the shared object of your affection. But once you’re on the inside, it can be hard to put yourself in the position of someone who might be afraid to try it because it has so many buttons.

Written by nevalalee

February 15, 2018 at 8:45 am

Reorganizing the peace

In his book Experiment in Autobiography, which he wrote when he was in his sixties, the novelist H.G. Wells defined what he saw as his great task ahead: “To get the primaries of life under control and to concentrate the largest possible proportion of my energy upon the particular system of effort that has established itself for me as my distinctive business in the world.” He explained:

I do not now in the least desire to live longer unless I can go on with what I consider to be my proper business…And that is where I am troubled now. I find myself less able to get on with my work than ever before. Perhaps the years have something to do with that, and it may be that a progressive broadening and deepening of my conception of what my work should be, makes it less easy than it was; but the main cause is certainly the invasion of my time and thought by matters that are either quite secondary to my real business or have no justifiable connection with it. Subordinate and everyday things, it seems to me in this present mood, surround me in an ever-growing jungle. My hours are choked with them; my thoughts are tattered by them. All my life I have been pushing aside intrusive tendrils, shirking discursive consequences, bilking unhelpful obligations, but I am more aware of them now and less hopeful about them than I have ever been. I have a sense of crisis; that the time has come to reorganize my peace, if the ten or fifteen years ahead, which at the utmost I may hope to work in now, are to be saved from being altogether overgrown.

As it turns out, Wells was exactly right, and he lived for another fourteen years. And his notion of rethinking one’s life by “reorganizing the peace” has preoccupied me for a long time, too, although it wasn’t until I read this passage that I was able to put it into words. Wells associated such problems with the lives of creative professionals, whom he compares to “early amphibians” trying to leave the water for the first time, but these days, they seem to affect just about everyone. What troubled Wells was the way in which the work of artists and writers, which is usually done in solitude, invariably involves its practitioners in entanglements that take them further away from whatever they wanted to do in the first place. To some extent, that’s true of all pursuits—success in any field means that you spend less time on the fundamentals than you did when you started—but it’s especially hard on the creative side, since its rewards and punishments are so unpredictable. Money, if it comes at all, arrives at unreliable intervals, and much of your energy is devoted to dealing with problems that can’t be anticipated in advance. It’s exhausting and faintly comic, as Wells beautifully phrased it:

Imperfection and incompleteness are the certain lot of all creative workers. We all compromise. We all fall short. The life story to be told of any creative worker is therefore by its very nature, by its diversions of purpose and its qualified success, by its grotesque transitions from sublimation to base necessity and its pervasive stress towards flight, a comedy.

But the artists were just ahead of the curve, and these “grotesque transitions” are now part of all our lives. Instead of depending on the simple sources of gratification—family, work, religion—that served human beings for so long, we’re tied up in complicated networks that offer more problematic forms of support. A few days ago, the social psychologist Jane Adams published an article in the New York Times about the epidemic of “perfectionism” in college students, as young people make unreasonable demands on themselves to satisfy the expectations of social media:

As college students are returning to school after their winter breaks, many parents are concerned about the state of their mental health. The parents worry about the pressure their kids are putting on themselves. Thinking that others in their social network expect a lot of them is even more important to young adults than the expectations of parents and professors…Parents in my practice say they’re noticing how often their kids come away from Facebook and Instagram feeling depressed, ashamed and anxious, and how vulnerable they are to criticism and judgment, even from strangers, on their social media feeds.

And this simply places more of us in the predicament that Wells identified in artists, whose happiness was tied up with the fickle responses of tastemakers, gatekeepers, and the anonymous public. The need to deal with such factors, which were impossible to anticipate from one day to the next, was the source of many of the “entanglements” that he saw as interfering with his work. And the only difference these days is that everyone’s a critic, and we’re all selling ourselves.

But the solution remains the same. Wells spoke fondly of his vision of what he called the Great Good Place, borrowing a phrase from his friend Henry James:

I require a pleasant well-lit writing room in good air and a comfortable bedroom to sleep in—and, if the mood takes me, to write in—both free from distracting noises and indeed all unexpected disturbances. There should be a secretary or at least a typist within call and out of earshot, and, within reach, an abundant library and the rest of the world all hung accessibly on to that secretary’s telephone. (But it would have to be a one-way telephone, so that when we wanted news we could ask for it, and when we were not in a state to receive and digest news, we should not have it forced upon us.)

This desire for a “one-way telephone” makes me wonder how Wells would view our online lives, which over the last decade have evolved to a point where the flow of energy seems to pass both ways. Wells, of course, was most famous for his science fiction, in which he foresaw such future innovations as space travel and nuclear weapons, but this might be his most prescient observation of all: “We are therefore, now and for the next few hundred years at least, strangers and invaders of the life of every day. We are all essentially lonely. In our nerves, in our bones. We are too preoccupied and too experimental to give ourselves freely and honestly to other people, and in the end other people fail to give themselves fully to us.”

Written by nevalalee

January 23, 2018 at 8:45 am

American Stories #9: 808s & Heartbreak

Note: As we enter what Joe Scarborough justifiably expects to be “the most consequential political year of our lives,” I’m looking back at ten works of art—books, film, television, and music—that deserve to be reexamined in light of where America stands today. You can find the earlier installments here.

If there’s a common thread that connects many of the works of art that I’ve been discussing here, it’s the way in which our private selves can be invaded by our lives as members of a larger nation, until the two become neurotically fused into one. This is probably true of all countries, but its deeper connection with the notion of personal reinvention feels especially American, and no celebrity embodies it as much as Kanye West. It might seem impossible to make sense of the political evolution of a man who once told us that President Bush didn’t care about black people and then ended up—despite the efforts of a concerned time traveler—taking a very public meeting with Donald Trump. Yet if one of our most ambitious, talented, and inventive artists can be frequently dismissed by critics as “oblivious,” it may only be because he’s living two years ahead of the rest of us, and he’s unusually committed to working out his confusions in public. We should all feel bewildered these days, and West doesn’t have the luxury of keeping it to himself. It might seem strange to single out 808s & Heartbreak, which looks at first glance like his least political work, but if this is the most important album of the last ten years, and it is, it’s largely because it reminded us of how unbearable emotion can be expressed through what might seem to casual listeners like cold detachment. It’s an insight that has crucial implications for those of us who just want to get through the next few years, and while West wasn’t the first to make it, he was remarkably candid about acknowledging his sources to the New York Times:

I think the fact that I can’t sing that well is what makes 808s so special…808s was the first album of that kind, you know? It was the first, like, black new wave album. I didn’t realize I was new wave until this project. Thus my connection with Peter Saville, with Raf Simons, with high-end fashion, with minor chords. I hadn’t heard new wave! But I am a black new wave artist.

This is exactly right, and it gets at why this album, which once came off as a perverse dead end, feels so much now like the only way forward. When I think of its precursors, my mind naturally turns to the Pet Shop Boys, particularly on Actually, which was first released in 1987. A song like “Shopping” anticipates 808s in its vocal processing, its dry drum machine, its icy synthesizers, and above all in how it was widely misconstrued as a reflection of the Thatcherite consumerism that it was criticizing. That’s the risk that you run as an ironist, and West has been punished for it more often than anybody else. And while these two worlds could hardly seem further apart, the underlying impulses are weirdly similar. New wave is notoriously hard to define, but I like to think of it as a movement occupied by those who aren’t comfortable in rock or punk. Maybe you’re just a huge nerd, or painfully shy, or not straight or white, or part of a group that has traditionally been penalized for expressing vulnerability or dissent. One solution is to remove as much of yourself from the work as possible, falling back on irony, parody, or Auto-Tune. You make a virtue of reticence and understatement, trusting that your intentions will be understood by those who feel the same way. This underlies the obsessive pastiches of Stephin Merritt and the Magnetic Fields, whose 69 Love Songs is the other great album of my adult life, as well as West’s transformation of himself into a robot programmed to feel pain, like an extended version of the death of HAL in 2001: A Space Odyssey. West has taken it further in the years since—“Blood on the Leaves” may be his most scandalous mingling of the political and the personal—but it was 808s that introduced it to his successors, for whom it serves both as a formula for making hits and as an essential means of survival. Sometimes the only way to make it through the coldest winter is to turn it into the coldest story ever told.

The Hedgehog, the Fox, and the Fatted Ram, Part 1

Over the long weekend, both the New York Times and the Washington Post published lead articles on the diminishing public profile of Jared Kushner. The timing may have been a coincidence, but the pieces had striking similarities. Both made the argument that Kushner’s portfolio, once so vast, has been dramatically reduced by the arrival on the scene of White House chief of staff John F. Kelly; both ran under a headline that included some version of the word “shrinking”; and both led off with memorable quotes from their subject. In the Times, it was Kushner’s response when asked by Reince Priebus what his Office of American Innovation would really do: “What do you care?” (The newspaper of record, proper as ever, added: “He emphasized his point with an expletive.”) Meanwhile, the Post, which actually scored an interview, came away with something even stranger. Here’s what Kushner said of himself:

During the campaign, I was more like a fox than a hedgehog. I was more of a generalist having to learn about and master a lot of skills quickly. When I got to D.C., I came with an understanding that the problems here are so complex—and if they were easy problems, they would have been fixed before—and so I became more like the hedgehog, where it was more taking issues you care deeply about, going deep and devoting the time, energy and resources to trying to drive change.

The Post merely noted that this is Kushner’s “version of the fable of the fox, who knows many things, and the hedgehog, who knows one important thing,” but as the Washington Examiner pointed out, the real source is Isaiah Berlin’s classic book The Hedgehog and the Fox, which draws its famous contrast between foxes and hedgehogs as a prelude to a consideration of Leo Tolstoy’s theory of history.

Berlin’s book, which is one of my favorites, is so unlike what I’d expect Jared Kushner to be reading that I can’t resist trying to figure out what this reference to it means. If I were conspiratorially minded, I’d observe that if Kushner had wanted to put together a reading list to quickly bring himself up to speed on the history and culture of Russia—I can’t imagine why—then The Hedgehog and the Fox, which can be absorbed in a couple of hours, would be near the top. But the truth, unfortunately, is probably more prosaic. If there’s a single book from the last decade that Kushner, who was briefly touted as the prodigy behind Trump’s data operation, can be assumed to have read, or at least skimmed, it’s Nate Silver’s The Signal and the Noise. And Silver talks at length about the supposed contrast between foxes and hedgehogs, courtesy of a professor of psychology and political science named Philip E. Tetlock, who conducted a study of predictions by experts in various fields:

Tetlock was able to classify his experts along a spectrum between what he called hedgehogs and foxes. The reference to hedgehogs and foxes comes from the title of an Isaiah Berlin essay on the Russian novelist Leo Tolstoy—The Hedgehog and the Fox…Foxes, Tetlock found, are considerably better at forecasting than hedgehogs. They had come closer to the mark on the Soviet Union, for instance. Rather than seeing the USSR in highly ideological terms—as an intrinsically “evil empire,” or as a relatively successful (and perhaps even admirable) example of a Marxist economic system—they instead saw it for what it was: an increasingly dysfunctional nation that was in danger of coming apart at the seams. Whereas the hedgehogs’ forecasts were barely any better than random chance, the foxes’ demonstrated predictive skill.

As intriguing as we might find this reference to Russia, which Kushner presumably read, it also means that in all likelihood, he never even opened Berlin’s book. (Silver annoyingly writes: “Unless you are a fan of Tolstoy—or of flowery prose—you’ll have no particular reason to read Berlin’s essay.”) But it doesn’t really matter where he encountered these classifications. As much as I love the whole notion of the hedgehog and the fox, it has one big problem—as soon as you read it, you’re immediately tempted to apply it to yourself, as Kushner does, when in fact its explanatory power applies only to geniuses. Like John Keats’s celebrated concept of negative capability, which is often used to excuse sloppy, inconsistent thinking, Berlin’s essay encourages us to think of ourselves as foxes or hedgehogs, when we’re really just dilettantes or suffering from tunnel vision. And this categorization has its limits even when applied to unquestionably exceptional personalities. Here’s how Berlin lays it out on the very first page of his book:

There exists a great chasm between those, on one side, who relate everything to a single central vision, one system less or more coherent or articulate, in terms of which they understand, think and feel—a single, universal, organizing principle in terms of which alone all that they are and say has significance—and, on the other side, those who pursue many ends, often unrelated and even contradictory, connected, if at all, only in some de facto way, for some psychological or physiological cause, related by no moral or aesthetic principle; these last lead lives, perform acts, and entertain ideas that are centrifugal rather than centripetal, their thought is scattered or diffused, moving on many levels…without, consciously or unconsciously, seeking to fit [experiences and objects] into, or exclude them from, any one unchanging, all-embracing, sometimes self-contradictory and incomplete, at times fanatical, unitary inner vision.

The contrast that Berlin draws here could hardly seem more stark, but it falls apart as soon as we apply it to, say, Kushner’s father-in-law. On the one hand, Trump has succeeded beyond his wildest dreams by harping monotonously on a handful of reliable themes, notably white nationalism, xenophobia, and resentment of liberal elites. Nothing could seem more like the hedgehog. On the other hand, from one tweet to the next, he’s nothing if not “centrifugal rather than centripetal,” driven by his impulses, embracing contradictory positions, undermining his own surrogates, and resisting all attempts to pin him down to a conventional ideology. It’s all very foxlike. The most generous reading would be to argue that Trump, as Berlin contends of Tolstoy, is “by nature a fox, but [believes] in being a hedgehog,” a comparison that seems ridiculous even as I type it. It’s far more plausible that Trump lacks the intellectual rigor, or even the basic desire, to assemble anything like a coherent politics out of his instinctive drives for power and revenge. Like most of us, he’s a mediocre thinker, and his confusions, which reflect those of his base, have gone a long way toward enabling his rise. Trump bears much the same relationship to his fans that Emerson saw in the man who obsessed Tolstoy so deeply:

Among the eminent persons of the nineteenth century, Bonaparte is far the best known and the most powerful; and owes his predominance to the fidelity with which he expresses the tone of thought and belief, the aims of the masses…If Napoleon is France, if Napoleon is Europe, it is because the people whom he sways are little Napoleons.

Faced with a Trump, little or big, Berlin’s categories lose all meaning—not out of any conceptual weakness, but because that isn’t what they were designed to do. But that doesn’t mean that Berlin doesn’t deserve our attention. In fact, The Hedgehog and the Fox has more to say about our current predicament than any other book I know, and if Kushner ever bothered to read it, it might give him reason to worry. I’ll have more to say about this tomorrow.

Of texts and textiles

If you spend as much time as I do browsing random news articles online, your eye might have been caught yesterday by a story with the headline “‘Allah’ is Found on Viking Funeral Clothes.” Similar pieces ran in multiple publications, but I’ll stick with the one in the New York Times, which I think is where I saw it first. Here’s how it begins:

The discovery of Arabic characters that spell “Allah” and “Ali” on Viking funeral costumes in boat graves in Sweden has raised questions about the influence of Islam in Scandinavia. The grave where the costumes were found belonged to a woman dressed in silk burial clothes and was excavated from a field in Gamla Uppsala, north of Stockholm, in the 1970s, but its contents were not cataloged until a few years ago, Annika Larsson, a textile archaeologist at Uppsala University, said on Friday.

Larsson says that she was examining the patterns when she “remembered seeing them in similar Moorish designs in silk ribbons from Spain. I understood it had to be a kind of Arabic character, not Nordic.” The article continues: “Upon closer examination of the band from all angles, she said, she realized she was looking at Kufic script. The words Allah and Ali appeared in the silk found in Boat Grave 36 and in many other graves—and, most intriguing, the word Allah could be seen when reflected in a mirror.” It’s “most intriguing” indeed, particularly because it’s consistent with the hypothesis, which is widely credited, that “the Viking settlements in the Malar Valley of Sweden were, in fact, a western outpost of the Silk Road that stretched through Russia to silk-producing centers east of the Caspian Sea.”

Unfortunately, this particular piece of evidence began to fall apart almost at once. I’d like to say that I felt a flicker of doubt even as I read the article, particularly the part about the pattern being “reflected in a mirror,” but I can’t be entirely sure—like a lot of other readers, I glanced over it briefly and moved on. A few hours later, I saw another story headlined “That Viking Textile Probably Didn’t Actually Have ‘Allah’ On It.” It linked to a very persuasive blog post by Carolyn Priest-Dorman, a textile historian and Viking reenactor who seems perfectly positioned to identify the flaws in Larsson’s argument. As the Times article neglects to mention, Larsson’s reconstruction doesn’t just depend on reflecting the design, but on extending it conjecturally on either side, on the assumption that portions of the original are missing. Priest-Dorman points out that this is unwarranted on the evidence:

This unexplained extrapolation practically doubles the width of the band, and here’s why that’s a problem…If you consult…a photo of Band 6, you can clearly see the continuous metallic weft of the band turning at each selvedge to enter back in the other direction. If Larsson were correct that Band 6 was originally significantly wider, you would not see those turning loops; you’d see a series of discontinuous single passes of brocading weft with cut or broken ends at each edge.

In other words, if the pattern were incomplete, we’d see the breaks, but we don’t. And even if this point were up for debate, you clearly increase the risk of subjective readings when you duplicate, reflect, and otherwise distort the raw “text.”

No one has accused Larsson of intentional fraud, but it appears that the right combination of elements—a source of ambiguous patterns, some erudition, and a certain amount of wishful thinking—resulted in a “solution” to a problem that wasn’t there. If this sounds familiar, it might be because I’ve discussed similar cases on this blog before. One is The Great Cryptogram by Ignatius L. Donnelly, who argued that Francis Bacon was the true author of the works of Shakespeare and left clues to his identity in a code in the plays. An even better parallel is the scholar William Romaine Newbold, who died believing that he had cracked the mysterious Voynich Manuscript. As David Kahn recounts in his masterpiece The Codebreakers, Newbold fell victim to much the same kind of error that Larsson did, except at far greater length and complexity:

Newbold saw microscopic shorthand symbols in the macroscopic characters of the manuscript text and began his decipherment by transliterating them into Roman letters. A secondary text of seventeen different letters resulted. He doubled all but the first and last letters of each section…The resultant quaternary text was then “translated”: Newbold replaced the pairs of letters with a single letter, presumably according to a key, which, however, he never made clear…Finally, Newbold anagrammed the letters of this senary text to produce the alleged plaintext in Latin.

The result, of course, was highly suspect. Anagramming chunks of over a hundred characters at a time, as Newbold did, could result in almost any text you wanted, and the “microscopic shorthand symbols” were nothing but “the breaking up of the thick ink on the rough surface of the vellum into shreds and filaments that Newbold had imagined were individual signs.”

Donnelly and Newbold were working before an era of instantaneous news coverage, but I don’t doubt that they would have received plenty of sympathetic, or at least credulous, attention if they had published their results today—and, in fact, hardly a month goes by without reports of a new “breakthrough” in the Voynich Manuscript. (I’m reminded of the Beale cipher, a similar enigma encoding an alleged hidden treasure that inspired an entire society, the Beale Cypher Association, devoted to solving it. In his book Biggest Secrets, the author William Poundstone examined a copy of the society’s quarterly newsletter, which is available online. It contained no fewer than three proposed solutions.)

In the aftermath of the Larsson debacle, a number of observers, including Stephennie Mulder of the University of Texas, raised concerns about how the theory was reported: “It should go without saying that a single scholar’s un-peer-reviewed claim does not truth make.” She’s right. But I think there’s a more specific lesson here. Both Larsson and Newbold started with a vast source of raw material, selected a tiny piece of it, and subjected it to a series of analogous permutations. Larsson doubled the pattern and reflected it in a mirror; Newbold doubled the illusory characters and then anagrammed the result. The first step increased the amount of text that could be “studied,” while the second rearranged it arbitrarily to facilitate additional readings. Each transformation moved further away from the original, which should have been a red flag for any skeptical reader. But when you summarize the process by providing only the first and the last steps, while omitting the intermediate stages, the conclusion looks a lot more impressive. This is exactly what happened with Larsson, and when we turn to Newbold, who announced his findings in 1921, we see how little anything has changed. As Kahn writes in The Codebreakers: “The public at large was fascinated. Sunday supplements had a field day.”

Quote of the Day


I have a theatrical temperament. I’m not interested in the middle road—maybe because everyone’s on it. Rationality, reasonableness bewilder me. I think it comes out of being a “daughter of the Golden West.” A lot of the stories I was brought up on had to do with extreme actions—leaving everything behind, crossing the trackless wastes, and in those stories the people who stayed behind and had their settled ways—those people were not the people who got the prize. The prize was California.

Joan Didion, in an interview with Michiko Kakutani in the New York Times

Written by nevalalee

October 6, 2017 at 7:30 am
