Posts Tagged ‘Daniel Handler’
A lover’s lies
Over the last month, I’ve been listening endlessly to 50 Song Memoir, the sprawling autobiographical album by Stephin Merritt of the Magnetic Fields. Like much of his work, it’s both technically exquisite and cheerfully uneven, with throwaway novelty tracks alternating with songs that I don’t think I’ll ever forget, but it’s clearly a landmark in the career of one of our indispensable artists. One of its best features is a thick accompanying booklet, in which Merritt walks his good friend Daniel Handler through the stories behind all five discs. It’s totally fascinating, both for its insights into craft and for its uncharacteristic moments of introspection. But it also includes an anecdote, which Merritt shares only in passing, that has been on my mind a lot, particularly in light of what I’ve been discussing over the last few days:
[The song] “Lovers’ Lies” is a boyfriend who, it later turned out, was a pathological liar. Dale Peck has a whole chapter about him. Apparently he went out with Dale Peck before he went out with me, which I didn’t know at the time…So he allowed everyone to believe that he was HIV-positive, because he was an AIDS activist, and it just seemed simpler. But he was not in fact HIV-positive, and eventually that got out, and he became a pariah, persona non grata, and had to leave the area.
Handler doesn’t ask for further details, and the conversation quickly moves on, leaving the story to stand enigmatically by itself. And the song doesn’t add much to the picture.
Merritt doesn’t mention any names, but he provides more than enough information to identify the individual under discussion, who is also thanked in the liner notes to one of his side projects. (I don’t particularly feel like naming him here, either, so I’ve quietly edited some of the passages that follow.) Dale Peck—a literary critic whom I previously knew best for calling Rick Moody “the worst writer of his generation”—tells the story in his book Visions and Revisions, a long excerpt of which appeared a few years ago in Out. In his memoir, Peck recalls that the activist “was the first person I slept with who told me he was HIV-positive,” and that he also claimed to have been the son of a Holocaust refugee, a survivor of English boarding schools and mental institutions, and the victim of a beating in Boston. But after cataloging his friend’s remarkable background, Peck concludes:
Everything I’ve just told you is a lie. The Judaism—the Holocaust—the move to England and the nervous breakdown, the time spent in a mental institution and hustling on the streets, and above all the HIV infection: Every last detail—save, perhaps, his name—was a fabrication, invented for who knows what reason and perpetuated with some major or minor variations not just with me but with all of ACT UP…I don’t believe it was empirically necessary for [him] to adopt the identity of an HIV-positive person in order to become the kind of AIDS activist he became. But he did, and he immersed himself in his role to such a degree that he put himself at risk.
And he was no ordinary fabulist. Peck tells us that he was “also one of the nine or ten most important AIDS activists in the United States,” and in David France’s How to Survive a Plague, we hear more about the scale of this ongoing act of impersonation:
It was thought that he was the sickest member of the HIVIP support group—he had testified as an AIDS patient under oath before Congress and issued a famous dictum to fellow activists, “HIV Negatives Get Out of Our Way”—but in fact he had never been infected at all. David Barr, the support group founder, pieced together the deception through inconsistencies in his stories, the vagueness about his doctor visits, his secrecy about lab results. I was incredulous when confronted with these facts…For almost a decade I watched him partake in some of the most instrumental skirmishes that revolutionized science and medicine. I watched his work save lives. He could have accomplished as much as an openly HIV-negative man.
And France’s thoughts on the reasoning behind this deception are particularly significant: “What drove him, I guessed, was a peculiar kind of thrill-seeking behavior. There was no more immediate battle in this epic war than the one to survive. For young men it was an almost romantic race against time. I can imagine, but not fully understand, a compulsion to feel those stakes very personally.”
Peck makes a similar point in his memoir: “From the beginning of the epidemic part of the fascination with AIDS was the desire to have it. To live with it? To die from it? I suspect it’s probably neither, which is to say, I suspect the HIV these men wanted was the phantasmic kind that brings ‘meaning’ to life rather than sickness or pain or death.” And it makes for a striking contrast with an argument advanced by Susan Sontag in AIDS and Its Metaphors, which was published toward the end of the eighties. After noting that such diseases as tuberculosis and syphilis have been romanticized for their associations with emotionality or creativity, to the point of creating “syphilis envy,” Sontag writes: “But with AIDS—though dementia is also a common, late symptom—no compensatory mythology has arisen, or seems likely to arise. AIDS, like cancer, does not allow romanticizing or sentimentalizing, perhaps because its association with death is too powerful.” Sontag was clearly wrong about this, and it’s even possible to recognize this impulse in more recent cases of activists who altered or embellished aspects of their identities—some blatantly, others less so—in what Eve Fairbanks of BuzzFeed calls a mindset “that makes having endured harrowing circumstances seem almost necessary to speak with any moral authority.” But it may have been something even more fundamental. As Peck writes of one pivotal moment:
[The activist] said he had something to tell me and even as I guessed from his tone what it was he said: “I’m positive.” I use quotation marks here because I know these were his actual words: I recorded them on a piece of yellow paper ripped from a legal pad that I later tucked into a new journal. I was a sporadic journaler at best, usually starting one when I felt that something momentous had happened, and I knew that something momentous had happened here. Not that I had slept with an HIV-positive person, but that I had met someone great. Someone about whom I need manufacture none of my usual illusions to love.
The minor key
“What keeps science fiction a minor genre, for all the brilliance of its authors and apparent pertinence of its concerns?” The critic who asked this question was none other than John Updike, in his New Yorker review of David G. Hartwell’s anthology The World Treasury of Science Fiction, which was published at the end of the eighties. Updike immediately responded to his own question with his usual assurance:
The short answer is that each science-fiction story is so busy inventing its environment that little energy is left to be invested in the human subtleties. Ordinarily, “mainstream” fiction snatches what it needs from the contemporary environment and concentrates upon surprising us with details of behavior; science fiction tends to reverse the priorities…It rarely penetrates and involves us the way the best realistic fiction can…“The writer,” Edmund Wilson wrote, “must always find expressions for something which has never yet been exposed, must master a new set of phenomena which has never yet been mastered.” Those rhapsodies, for instance, which Proust delivered upon the then-fresh inventions of the telephone, the automobile, and the airplane point up the larger relativities and magical connections of his great novel, as well as show the new century breaking upon a fin-de-siècle sensibility. The modest increments of fictional “news,” of phenomena whose presentation is unprecedented, have the cumulative weight of true science—a nudging, inching fidelity to human change ultimately far more impressive and momentous than the great glittering leaps of science fiction.
I’ll concede that Updike’s underlying point here is basically correct, and that a lot of science fiction has to spend so much time establishing the premise and the background that it has to shortchange or underplay other important qualities along the way. (At its highest level, this is less a reflection of the author’s limitations than a courtesy to the reader. It’s hard to innovate along every parameter at once, so complex works of speculative fiction as different as Gravity’s Rainbow and Inception need to strategically simplify wherever they can.) But there’s also a hidden fallacy in Updike’s description of science fiction as “a minor genre.” What, exactly, would a “major” genre look like? It’s hard to come up with a definitive list, but if we’re going to limit ourselves to a conception of genre that encompasses science fiction and not, say, modernist realism, we’d probably include fantasy, horror, western, romance, erotica, adventure, mystery, suspense, and historical fiction. When we ask ourselves whether Updike would be likely to consider any of these genres “major,” it’s pretty clear that the answer is no. Every genre, by definition, is minor, at least to many literary critics, which not only renders the distinction meaningless, but raises a host of other questions. If we honestly ask what keeps all genres—although not individual authors—in the minor category, there seem to be three possibilities. Either genre fiction fails to attract or keep major talent; it suffers from various systemic problems of the kind that Updike identified for science fiction; or there’s some other quirk in the way we think about fiction that relegates these genres to a secondary status, regardless of the quality of specific works or writers.
And while all three of these factors may play a role, it’s the third one that seems most plausible. (After all, when you average out the quality of all “literary fiction,” from Updike, Bellow, and Roth down to the work put out by the small presses and magazines, it seems fairly clear that Sturgeon’s Law applies here as much as anywhere else, and ninety percent of everything is crud. And modernist realism, like every category coherent enough to earn its own label, has plenty of clichés of its own.) In particular, if a genre writer is deemed good enough, his or her reward is to be elevated out of it entirely. You clearly see this with such authors as Jorge Luis Borges, perhaps the greatest writer of speculative fiction of the twentieth century, who was plucked out of that category to compete more effectively with Proust, Joyce, and Kafka—the last of whom was arguably also a genre writer who was forcibly promoted to the next level. It means that the genre as a whole can never win. Its best writers are promptly confiscated, freeing up critics to speculate about why it remains “minor.” As Daniel Handler noted in an interview several years ago:
I believe that children’s literature is a genre. I resisted the idea that children’s literature is just anything that children are reading. And I certainly resisted the idea that certain books should get promoted out of children’s literature just because adults are reading them. That idea is enraging too. That’s what happens to any genre, right? First you say, “Margaret Atwood isn’t really a science fiction writer.” Then you say, “There really aren’t any good science fiction writers.” That’s because you promoted them all!
And this pattern isn’t a new one. It’s revealing that Updike quoted Edmund Wilson, who in his essays “Why Do People Read Detective Stories?” and “Who Cares Who Killed Roger Ackroyd?” dismissed the entire mystery genre as minor or worse. Yet when it came to defending his fondness for one author in particular, he fell back on a familiar trick:
I will now confess, in my turn, that, since my first looking into this subject last fall, I have myself become addicted, in spells, to reading myself to sleep with Sherlock Holmes, which I had gone back to, not having looked at it since childhood, in order to see how it compared with Conan Doyle’s latest imitators. I propose, however, to justify my pleasure in rereading Sherlock Holmes on grounds entirely different from those on which the consumers of the current product ordinarily defend their taste. My contention is that Sherlock Holmes is literature on a humble but not ignoble level, whereas the mystery writers most in vogue now are not. The old stories are literature, not because of the conjuring tricks and the puzzles, not because of the lively melodrama, which they have in common with many other detective stories, but by virtue of imagination and style. These are fairy-tales, as Conan Doyle intimated in his preface to his last collection, and they are among the most amusing of fairy-tales and not among the least distinguished.
Strip away the specifics, and the outlines of the argument are clear. Sherlock Holmes is good, and mysteries are bad, so Sherlock Holmes must be something other than mystery fiction. It’s maddening, but from the point of view of a working critic, it makes perfect sense. You get to hold onto the works that you like, while keeping the rest of the genre safely minor—and then you can read yourself happily to sleep.
Lemony Snicket’s unfortunate event
We go through a lot of picture books in my house these days, but one of my daughter’s current favorites is 13 Words by Lemony Snicket. I picked it up at the library on a whim, and although I wasn’t sure what her reaction would be, she loves it—we’ve probably read it two dozen times over the last few weeks. It’s a clever, slightly subversive deconstruction of vocabulary books for kids, with words ranging from bird (“The bird sits on the table”) to despondent (“The bird is despondent”), all the way through convertible, haberdashery, and mezzo-soprano. The result might have been insufferably arch or smug, but it cuts a neat line between being cute enough for kids and knowing enough for their parents. And it ends up as an engaging hybrid between a sweet picture book and a commentary on the arbitrariness, or absurdity, of children’s books in general, with characters and details introduced without explanation to be assimilated into the treasure heap of a child’s imagination. And while it all ends happily, it closes on a pleasantly melancholy note: “Although the bird, to tell you the truth, is still a little despondent.”
Unfortunately, it’s hard to read it now without being reminded of recent events involving Lemony Snicket himself, aka author Daniel Handler. While emceeing the National Book Awards last week, Handler made the following remarks about author Jacqueline Woodson, who had just accepted an award for her memoir Brown Girl Dreaming:
I said that if she won I would tell all of you something I learned about her this summer. Jackie Woodson is allergic to watermelon. Just let that sink in your minds. I said, “You have to put that in a book.” And she said, “You put that in a book.” And I said, “I’m only writing a book about a black girl who’s allergic to watermelon if you, Cornel West, Toni Morrison and Barack Obama say, ‘This guy’s OK.’”
To Handler’s credit, he responded to the subsequent outrage with an admirably heartfelt apology. Yet for those of us who admire Handler and his work, it still feels inexplicable—probably more so than any other incident since Lars von Trier made his own unfortunate statements three years ago at Cannes.
But as with Lars von Trier, a professional provocateur who can edge into a caricature of himself, Handler’s incredible cluelessness here can’t be separated from the very qualities that have made him so successful. If there’s a defining quality to Handler’s work, as with that of his friend Stephin Merritt, it’s an eye for the darkly absurd, and for such a smart, verbal, ironic personality, something like the watermelon stereotype can seem less like a hurtful image than like a self-contained illustration of the absurdity of racism itself. Taken out of context, it feels transparently ridiculous, as if the racists were unconsciously parodying themselves. For a sensibility like Handler’s, saying that black people like watermelon feels like the equivalent of saying, as he does in 13 Words, that a dog and a goat took a convertible to a haberdashery owned by a baby: a statement that points up the underlying incoherence of the whole business of racial stereotyping. Invoking it feels like a nudge to similarly attuned listeners, a wink that implies: “This image is so nonsensical that I don’t even need to make a joke in order to mock it—the bigots, not me, have done that for themselves. So we can all safely laugh at it.”
Except, emphatically, we can’t. The watermelon stereotype might seem inane on its own, but it’s only a single component of the much more troubling history of racial imagery that inevitably trails along behind it. When we view it in isolation, it’s easy to dismiss it, or even think of it as harmless, as online commenters do when they protest, sincerely: “But watermelon is delicious!” (This may be why a cartoon with similar overtones escaped the notice of the editorial staff at the Boston Herald last month, although it doesn’t excuse it.) Which isn’t to say that it can’t be mocked; none other than Cornel West himself speaks of the tragicomic view of life, in which we laugh in the midst of hate and hypocrisy so as not to fall into despair. But it’s a mistake to forget that what strikes us as absurd—especially when we see it from the outside—can retain all its old power to wound. Irony and knowingness are essential tools, but they can also be a trap, if they fool us into thinking that we can stand above or apart from a legacy that others experience on a daily basis. Handler knows this now, and his sincere contrition has gone a long way toward restoring some of the respect that he lost from his readers. Although, to tell you the truth, I’m still a little despondent.