Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘The New Yorker’

The temple of doom


Steven Spielberg on the set of Indiana Jones and the Temple of Doom

Note: I’m taking some time off for the holidays, so I’m republishing a few pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on January 27, 2017.

I think America is going through a paroxysm of rage…But I think there’s going to be a happy ending in November.

Steven Spielberg, to Sky News, July 17, 2016

Last week, in an interview with the New York Times about the twenty-fifth anniversary of Schindler’s List and the expansion of the mission of The Shoah Foundation, Steven Spielberg said of this historical moment:

I think there’s a measurable uptick in anti-Semitism, and certainly an uptick in xenophobia. The racial divide is bigger than I would ever imagine it could be in this modern era. People are voicing hate more now because there’s so many more outlets that give voice to reasonable and unreasonable opinions and demands. People in the highest places are allowing others who would never express their hatred to publicly express it. And that’s been a big change.

Spielberg, it’s fair to say, remains the most quintessentially American of all directors, despite a filmography that ranges freely between cultures and seems equally comfortable in the past and in the future. He’s often called a mythmaker, and if there’s a place where his glossy period pieces, suburban landscapes, and visionary adventures meet, it’s somewhere in the nation’s collective unconscious: its secret reveries of what it used to be, what it is, and what it might be again. Spielberg country, as Stranger Things was determined to remind us, is one of small towns and kids on bikes, but it also still vividly remembers how it beat the Nazis, and it can’t resist turning John Hammond from a calculating billionaire into a grandfatherly, harmless dreamer. No other artist of the last half century has done so much to shape how we all feel about ourselves. He took over where Walt Disney left off. But what has he really done?

To put it in the harshest possible terms, it’s worth asking whether Spielberg—whose personal politics are impeccably liberal—is responsible in part for our current predicament. He taught the New Hollywood how to make movies that force audiences to feel without asking them to think, to encourage an illusion of empathy instead of the real thing, and to create happy endings that confirm viewers in their complacency. You can’t appeal to all four quadrants, as Spielberg did to a greater extent than anyone who has ever lived, without consistently telling people exactly what they want to hear. I’ve spoken elsewhere of how film serves as an exercise ground for the emotions, bringing us closer on a regular basis to the terror, wonder, and despair that many of us would otherwise experience only rarely. It reminds the middle class of what it means to feel pain or awe. But I worry that when we discharge these feelings at the movies, it reduces our capacity to experience them in real life, or, even more insidiously, makes us think that we’re more empathetic and compassionate than we actually are. Few movies have made viewers cry as much as E.T., and few have presented a dilemma further removed from anything a real person is likely to face. (Turn E.T. into an illegal alien being sheltered from a government agency, maybe, and you’d be onto something.) Nearly every film from the first half of Spielberg’s career can be taken as a metaphor for something else. But great popular entertainment has a way of referring to nothing but itself, in a cognitive bridge to nowhere, and his images are so overwhelming that it can seem superfluous to give them any larger meaning.

Steven Spielberg on the set of Jaws

If Spielberg had been content to be nothing but a propagandist, he would have been the greatest one who ever lived. (Hence, perhaps, his queasy fascination with the films of Leni Riefenstahl, who has affinities with Spielberg that make nonsense out of political or religious labels.) Instead, he grew into something that is much harder to define. Jaws, his second film, became the most successful movie ever made, and when he followed it up with Close Encounters, it became obvious that he was in a position with few parallels in the history of art—he occupied a central place in the culture and was also one of its most advanced craftsmen, at a younger age than Damien Chazelle is now. If you’re talented enough to assume that role and smart enough to stay there, your work will inevitably be put to uses that you never could have anticipated. It’s possible to pull clips from Spielberg’s films that make him seem like the most repellent reactionary imaginable, of the sort that once prompted Tony Kushner to say:

Steven Spielberg is apparently a Democrat. He just gave a big party for Bill Clinton. I guess that means he’s probably idiotic…Jurassic Park is sublimely good, hideously reactionary art. E.T. and Close Encounters of the Third Kind are the flagship aesthetic statements of Reaganism. They’re fascinating for that reason, because Spielberg is somebody who has just an astonishing ear for the rumblings of reaction, and he just goes right for it and he knows exactly what to do with it.

Kushner, of course, later became Spielberg’s most devoted screenwriter. And the total transformation of the leading playwright of his generation is the greatest testament imaginable to this director’s uncanny power and importance.

In reality, Spielberg has always been more interesting than he had any right to be, and if his movies have been used to shake people up in the dark while numbing them in other ways, or to confirm the received notions of those who are nostalgic for an America that never existed, it’s hard to conceive of a director of his stature for whom this wouldn’t have been the case. To his credit, Spielberg clearly grasps the uniqueness of his position, and he has done what he could with it, in ways that can seem overly studied. For the last two decades, he has worked hard to challenge some of our assumptions, and at least one of his efforts, Munich, is a masterpiece. But if I’m honest, the film that I find myself thinking about the most is Indiana Jones and the Temple of Doom. It isn’t my favorite Indiana Jones movie—I’d rank it a distant third. For long stretches, it isn’t even all that good. It also trades in the kind of casual racial stereotyping that would be unthinkable today, and it isn’t any more excusable because it deliberately harks back to the conventions of an earlier era. (The fact that it’s even watchable now only indicates how much ground East and South Asians have yet to cover.) But its best scenes are so exciting, so wonderful, and so conducive to dreams that I’ve never gotten over it. Spielberg himself was never particularly pleased with the result, and if asked, he might express discomfort with some of the decisions he made. But there’s no greater tribute to his artistry, which executed that misguided project with such unthinking skill that he exhilarated us almost against his better judgment. It tells us how dangerous he might have been if he hadn’t been so deeply humane. And we should count ourselves lucky that he turned out to be as good a man as he did, because we’d never have known if he hadn’t.

Updike’s ladder


Note: I’m taking the day off, so I’m republishing a post that originally appeared, in a slightly different form, on September 13, 2017.

Last year, the author Anjali Enjeti published an article in The Atlantic titled “Why I’m Still Trying to Get a Book Deal After Ten Years.” If just reading those words makes your palms sweat and puts your heart through a few sympathy palpitations, congratulations—you’re a writer. No matter where you might be in your career, or what length of time you mentally insert into that headline, you can probably relate to what Enjeti writes:

Ten years ago, while sitting at my computer in my sparsely furnished office, I sent my first email to a literary agent. The message included a query letter—a brief synopsis describing the personal-essay collection I’d been working on for the past six years, as well as a short bio about myself. As my third child kicked from inside my pregnant belly, I fantasized about what would come next: a request from the agent to see my book proposal, followed by a dream phone call offering me representation. If all went well, I’d be on my way to becoming a published author by the time my oldest child started first grade.

“Things didn’t go as planned,” Enjeti says dryly, noting that after landing and leaving two agents, she’s been left with six unpublished manuscripts and little else to show for it. She goes on to share the stories of other writers in the same situation, including Michael Bourne of Poets & Writers, who accurately calls the submission process “a slow mauling of my psyche.” And Enjeti wonders: “So after sixteen years of writing books and ten years of failing to find a publisher, why do I keep trying? I ask myself this every day.”

It’s a good question. As it happens, I first encountered her article while reading the authoritative biography Updike by Adam Begley, which chronicles a literary career that amounts to the exact opposite of the ones described above. Begley’s account of John Updike’s first acceptance from The New Yorker—just months after his graduation from Harvard—is like lifestyle porn for writers:

He never forgot the moment when he retrieved the envelope from the mailbox at the end of the drive, the same mailbox that had yielded so many rejection slips, both his and his mother’s: “I felt, standing and reading the good news in the midsummer pink dusk of the stony road beside a field of waving weeds, born as a professional writer.” To extend the metaphor…the actual labor was brief and painless: he passed from unpublished college student to valued contributor in less than two months.

If you’re a writer of any kind, you’re probably biting your hand right now. And I haven’t even gotten to what happened to Updike shortly afterward:

A letter from Katharine White [of The New Yorker] dated September 15, 1954 and addressed to “John H. Updike, General Delivery, Oxford,” proposed that he sign a “first-reading agreement,” a scheme devised for the “most valued and most constant contributors.” Up to this point, he had only one story accepted, along with some light verse. White acknowledged that it was “rather unusual” for the magazine to make this kind of offer to a contributor “of such short standing,” but she and Maxwell and Shawn took into consideration the volume of his submissions…and their overall quality and suitability, and decided that this clever, hard-working young man showed exceptional promise.

Updike was twenty-two years old. Even now, more than half a century later and with his early promise more than fulfilled, it’s hard to read this account without hating him a little. Norman Mailer—whose debut novel, The Naked and the Dead, appeared when he was twenty-five—didn’t pull any punches in “Some Children of the Goddess,” an essay on his contemporaries that was published in Esquire in 1963: “[Updike’s] reputation has traveled in convoy up the Avenue of the Establishment, The New York Times Book Review, blowing sirens like a motorcycle caravan, the professional muse of The New Yorker sitting in the Cadillac, membership cards to the right Fellowships in his pocket.” Even Begley, his biographer, acknowledges the singular nature of his subject’s rise:

It’s worth pausing here to marvel at the unrelieved smoothness of his professional path…Among the other twentieth-century American writers who made a splash before their thirtieth birthday…none piled up accomplishments in as orderly a fashion as Updike, or with as little fuss…This frictionless success has sometimes been held against him. His vast oeuvre materialized with suspiciously little visible effort. Where there’s no struggle, can there be real art? The Romantic notion of the tortured poet has left us with a mild prejudice against the idea of art produced in a calm, rational, workmanlike manner (as he put it, “on a healthy basis of regularity and avoidance of strain”), but that’s precisely how Updike got his start.

Begley doesn’t mention that the phrase “regularity and avoidance of strain” is actually meant to evoke the act of defecation, but even this provides us with an odd picture of writerly contentment. As Dick Hallorann says in The Shining, the best movie about writing ever made: “You got to keep regular, if you want to be happy.”

If there’s a larger theme here, it’s that the sheer productivity and variety of Updike’s career—with its reliable production of uniform hardcover editions over the course of five decades—are inseparable from the “orderly” circumstances of his rise. Updike never lacked a prestigious venue for his talents, which allowed him to focus on being prolific. Writers whose publication history remains volatile and unpredictable, even after they’ve seen print, don’t always have the luxury of being so unruffled, and it can affect their work in ways that are almost subliminal. (A writer can’t survive ten years of chasing after a book deal without spending the entire time convinced that he or she is on the verge of a breakthrough, anticipating an ending that never comes, which may partially account for the prevalence in literary fiction of frustration and unresolved narratives. It also explains why it helps to be privileged enough to fail for years.) The short answer to Begley’s question is that struggle is good for a writer, but so is success, and you take what you can get, even as you’re transformed by it. I think on a monthly basis of what Nicholson Baker writes of Updike in his tribute U and I:

I compared my awkward public self-promotion too with a documentary about Updike that I saw in 1983, I believe, on public TV, in which, in one scene, as the camera follows his climb up a ladder at his mother’s house to put up or take down some storm windows, in the midst of this tricky physical act, he tosses down to us some startlingly lucid little felicity, something about “These small yearly duties which blah blah blah,” and I was stunned to recognize that in Updike we were dealing with a man so naturally verbal that he could write his fucking memoirs on a ladder!

We’re all on that ladder, including Enjeti, who I’m pleased to note finally scored her book deal—she has an essay collection in the works from the University of Georgia Press. Some are on their way up, some are headed down, and some are stuck for years on the same rung. But you never get anywhere if you don’t try to climb.

The unfinished lives


Yesterday, the New York Times published a long profile of Donald Knuth, the legendary author of The Art of Computer Programming. Knuth is eighty now, and the article by Siobhan Roberts offers an evocative look at an intellectual giant in twilight:

Dr. Knuth usually dresses like the youthful geek he was when he embarked on this odyssey: long-sleeved T-shirt under a short-sleeved T-shirt, with jeans, at least at this time of year…Dr. Knuth lives in Stanford, and allowed for a Sunday visitor. That he spared an entire day was exceptional—usually his availability is “modulo nap time,” a sacred daily ritual from 1 p.m. to 4 p.m. He started early, at Palo Alto’s First Lutheran Church, where he delivered a Sunday school lesson to a standing-room-only crowd.

This year marks the fiftieth anniversary of the publication of the first volume of Knuth’s most famous work, which is still incomplete. Knuth is busy writing the fourth installment, one fascicle at a time, although its most recent piece has been delayed “because he keeps finding more and more irresistible problems that he wants to present.” As Roberts writes: “Dr. Knuth’s exacting standards, literary and otherwise, may explain why his life’s work is nowhere near done. He has a wager with Sergey Brin, the co-founder of Google and a former student…over whether Mr. Brin will finish his Ph.D. before Dr. Knuth concludes his opus…He figures it will take another twenty-five years to finish The Art of Computer Programming, although that time frame has been a constant since about 1980.”

Knuth is a prominent example, although far from the most famous, of a literary and actuarial phenomenon that has grown increasingly familiar—an older author with a projected work of multiple volumes, published one book at a time, that seems increasingly unlikely to ever see completion. On the fiction side, the most noteworthy case has to be that of George R.R. Martin, who has been fielding anxious inquiries from fans for most of the last decade. (In an article that appeared seven long years ago in The New Yorker, Laura Miller quotes Martin, who was only sixty-three at the time: “I’m still getting e-mail from assholes who call me lazy for not finishing the book sooner. They say, ‘You better not pull a Jordan.’”) Robert A. Caro is still laboring over what he hopes will be the final volume of his biography of Lyndon Johnson, and mortality has become an issue not just for him, but for his longtime editor, as we read in Charles McGrath’s classic profile in the Times:

Robert Gottlieb, who signed up Caro to do The Years of Lyndon Johnson when he was editor in chief of Knopf, has continued to edit all of Caro’s books, even after officially leaving the company. Not long ago he said he told Caro: “Let’s look at this situation actuarially. I’m now eighty, and you are seventy-five. The actuarial odds are that if you take however many more years you’re going to take, I’m not going to be here.”

That was six years ago, and both men are still working hard. But sometimes a writer has no choice but to face the inevitable. When asked about the concluding fifth volume of his life of Picasso, with the fourth one still on the way, the biographer John Richardson said candidly: “Listen, I’m ninety-one—I don’t think I have time for that.”

I don’t have the numbers to back this up, but such cases—or at least the public attention that they inspire—seem to be growing more common these days, on account of some combination of lengthening lifespans, increased media coverage of writers at work, and a greater willingness from publishers to agree to multiple volumes in the first place. The subjects of such extended commitments tend to be monumental in themselves, in order to justify the total investment of the writer’s own lifetime, and expanding ambitions are often to blame for blown deadlines. Martin, Caro, and Knuth all increased the prospective number of volumes after their projects were already underway, or as Roberts puts it: “When Dr. Knuth started out, he intended to write a single work. Soon after, computer science underwent its Big Bang, so he reimagined and recast the project in seven volumes.” And this “recasting” seems particularly common in the world of biographies, as the author discovers more material that he can’t bear to cut. The first few volumes may have been produced with relative ease, but as the years pass and anticipation rises, the length of time it takes to write the next installment grows, until it becomes theoretically infinite. Such a radical change of plans, which can involve extending the writing process for decades, or even beyond the author’s natural lifespan, requires an indulgent publisher, university, or other benefactor. (John Richardson’s book has been underwritten by nothing less than the John Richardson Fund for Picasso Research, which reminds me of what Homer Simpson said after being informed that he suffered from Homer Simpson syndrome: “Oh, why me?”) And it may not be an accident that many of the examples that first come to mind are white men, who have the cultural position and privilege to take their time.

It isn’t hard to understand a writer’s reluctance to let go of a subject, the pressures on a book being written in plain sight, or the tempting prospect of working on the same project forever. And the image of such authors confronting their mortality in the face of an unfinished book is often deeply moving. One of the most touching examples is that of Joseph Needham, whose Science and Civilization in China may have undergone the most dramatic expansion of them all, from an intended single volume to twenty-seven and counting. As Kenneth Girdwood Robinson writes in a concluding posthumous volume:

The Duke of Edinburgh, Chancellor of the University of Cambridge, visited The Needham Research Institute, and interested himself in the progress of the project. “And how long will it take to finish it?” he enquired. On being given a rather conservative answer, “At least ten years,” he exclaimed, “Good God, man, Joseph will be dead before you’ve finished,” a very true appreciation of the situation…In his closing years, though his mind remained lucid and his memory astonishing, Needham had great difficulty even in moving from one chair to another, and even more difficulty in speaking and in making himself understood, due to the effect of the medicines he took to control Parkinsonism. But a secretary, working closely with him day by day, could often understand what he had said, and could read what he had written, when others were baffled.

Needham’s decline eventually became impossible to ignore by those who knew him best, as his biographer Simon Winchester writes in The Man Who Loved China: “It was suggested that, for the first time in memory, he take the day off. It was a Friday, after all: he could make it a long weekend. He could recharge his batteries for the week ahead. ‘All right,’ he said. ‘I’ll stay at home.’” He died later that day, with his book still unfinished. But it had been a good life.

Quote of the Day


Writers are usually embarrassed when other writers start to “sing”—their profession’s prestige is at stake and the blabbermouths are likely to have the whole wretched truth beat out of them, that they are an ignorant, hysterically egotistical, shamelessly toadying, envious lot who would do almost anything in the world—even write a novel—to avoid an honest day’s work or escape a human responsibility. Any writer tempted to open his trap in public lets the news out.

Dawn Powell, in The New Yorker

Written by nevalalee

December 3, 2018 at 7:30 am

The private eyes of culture


Yesterday, in my post on the late magician Ricky Jay, I neglected to mention one of the most fascinating aspects of his long career. Toward the end of his classic profile in The New Yorker, Mark Singer drops an offhand reference to an intriguing project:

Most afternoons, Jay spends a couple of hours in his office, on Sunset Boulevard, in a building owned by Andrew Solt, a television producer…He decided now to drop by the office, where he had to attend to some business involving a new venture that he has begun with Michael Weber—a consulting company called Deceptive Practices, Ltd., and offering “Arcane Knowledge on a Need to Know Basis.” They are currently working on the new Mike Nichols film, Wolf, starring Jack Nicholson.

When the article was written, Deceptive Practices was just getting off the ground, but it went on to compile an enviable list of projects, including The Illusionist, The Prestige, and most famously Forrest Gump, for which Jay and Weber designed the wheelchair that hid Gary Sinise’s legs. It isn’t clear how lucrative the business ever was, but it made for great publicity, and best of all, it allowed Jay to monetize the service that he had offered for free to the likes of David Mamet—a source of “arcane knowledge,” much of it presumably gleaned from his vast reading in the field, that wasn’t available in any other way.

As I reflected on this, I was reminded of another provider of arcane knowledge who figures prominently in one of my favorite novels. In Umberto Eco’s Foucault’s Pendulum, the narrator, Casaubon, comes home to Milan after a long sojourn abroad feeling like a man without a country. He recalls:

I decided to invent a job for myself. I knew a lot of things, unconnected things, but I wanted to be able to connect them after a few hours at a library. I once thought it was necessary to have a theory, and that my problem was that I didn’t. But nowadays all you needed was information; everybody was greedy for information, especially if it was out of date. I dropped in at the university, to see if I could fit in somewhere. The lecture halls were quiet; the students glided along the corridors like ghosts, lending one another badly made bibliographies. I knew how to make a good bibliography.

In practice, Casaubon finds that he knows a lot of things—like the identities of such obscure figures as Lord Chandos and Anselm of Canterbury—that can’t be found easily in reference books, prompting a student to marvel at him: “In your day you knew everything.” This leads Casaubon to a sudden inspiration: “I had a trade after all. I would set up a cultural investigation agency, be a kind of private eye of learning. Instead of sticking my nose into all-night dives and cathouses, I would skulk around bookshops, libraries, corridors of university departments…I was lucky enough to find two rooms and a little kitchen in an old building in the suburbs…In a pair of bookcases I arranged the atlases, encyclopedias, catalogs I acquired bit by bit.”

This feels a little like the fond daydream of a scholar like Umberto Eco himself, who spent decades acquiring arcane knowledge—not all of it required by his academic work—before becoming a famous novelist. And I suspect that many graduate students, professors, and miscellaneous bibliophiles cherish the hope that the scraps of disconnected information that they’ve accumulated over time will turn out to be useful one day, in the face of all evidence to the contrary. (Casaubon is evidently named after the character from Middlemarch who labors for years over a book titled The Key to All Mythologies, which is already completely out of date.) To illustrate what he does for a living, Casaubon offers the example of a translator who calls him one day out of the blue, desperate to know the meaning of the word “Mutakallimūn.” Casaubon asks him for two days, and then he gets to work:

I go to the library, flip through some card catalogs, give the man in the reference office a cigarette, and pick up a clue. That evening I invite an instructor in Islamic studies out for a drink. I buy him a couple of beers and he drops his guard, gives me the lowdown for nothing. I call the client back. “All right, the Mutakallimūn were radical Moslem theologians at the time of Avicenna. They said the world was a sort of dust cloud of accidents that formed particular shapes only by an instantaneous and temporary act of the divine will. If God was distracted for even a moment, the universe would fall to pieces, into a meaningless anarchy of atoms. That enough for you? The job took me three days. Pay what you think is fair.”

Eco could have picked nearly anything to serve as a case study, of course, but the story that he chooses serves as a metaphor for one of the central themes of the book. If the world of information is a “meaningless anarchy of atoms,” it takes the private eyes of culture to give it shape and meaning.

All the while, however, Eco is busy undermining the pretensions of his protagonists, who pay a terrible price for treating information so lightly. And it might not seem that such brokers of arcane knowledge are even necessary these days, now that an online search generates pages of results for the Mutakallimūn. Yet there’s still a place for this kind of scholarship, which might end up being the last form of brainwork not to be made obsolete by technology. As Ricky Jay knew, by specializing deeply in one particular field, you might be able to make yourself indispensable, especially in areas where the knowledge hasn’t been written down or digitized. (In the course of researching Astounding, I was repeatedly struck by how much of the story wasn’t available in any readily accessible form. It was buried in letters, manuscripts, and other primary sources, and while this happens to be the one area where I’ve actually done some of the legwork, I have a feeling that it’s equally true of every other topic imaginable.) As both Jay and Casaubon realized, it’s a role that rests on arcane knowledge of the kind that can only be acquired by reading the books that nobody else has bothered to read in a long time, even if it doesn’t pay off right away. Casaubon tells us: “In the beginning, I had to turn a deaf ear to my conscience and write theses for desperate students. It wasn’t hard; I just went and copied some from the previous decade. But then my friends in publishing began sending me manuscripts and foreign books to read—naturally, the least appealing and for little money.” But he perseveres, and the rule that he sets for himself might still be enough, if you’re lucky, to fuel an entire career:

Still, I was accumulating experience and information, and I never threw anything away…I had a strict rule, which I think secret services follow, too: No piece of information is superior to any other. Power lies in having them all on file and then finding the connections.

Written by nevalalee

November 27, 2018 at 8:41 am

Ghosts and diversions


Over the weekend, after I heard that the magician Ricky Jay had died, I went back to revisit the great profile, “Secrets of the Magus,” that Mark Singer wrote over a quarter of a century ago for The New Yorker. Along with Daniel Zalewski’s classic piece on Werner Herzog, it’s one of the articles in that magazine that I’ve thought about and reread the most, but what caught my attention this time around was a tribute from David Mamet:

I’ll call Ricky on the phone. I’ll ask him—say, for something I’m writing—“A guy’s wandering through upstate New York in 1802 and he comes to a tavern and there’s some sort of mountebank. What would the mountebank be doing?” And Ricky goes to his library and then sends me an entire description of what the mountebank would be doing. Or I’ll tell him I’m having a Fourth of July party and I want to do some sort of disappearance in the middle of the woods. He says, “That’s the most bizarre request I’ve ever heard. You want to do a disappearing effect in the woods? There’s nothing like that in the literature. I mean, there’s this one 1760 pamphlet—Jokes, Tricks, Ghosts, and Diversions by Woodland, Stream and Campfire. But, other than that, I can’t think of a thing.” He’s unbelievably generous. Ricky’s one of the world’s great people. He’s my hero. I’ve never seen anybody better at what he does.

Coming from Mamet, this is high praise indeed, and it gets at most of the reasons why Ricky Jay was one of my heroes, too. Elsewhere in the article, Mamet says admiringly: “I regard Ricky as an example of the ‘superior man,’ according to the I Ching definition. He’s the paradigm of what a philosopher should be: someone who’s devoted his life to both the study and the practice of his chosen field.”

And what struck me on reading these lines again was how deeply Jay’s life and work were tied up in books. A bookseller quoted in Singer’s article estimates that Jay spent more of his disposable income on rare books than anyone else he knew, and his professional legacy might turn out to be even greater as a writer, archivist, and historian than it was for sleight of hand. (“Though Jay abhors the notion of buying books as investments, his own collection, while it is not for sale and is therefore technically priceless, more or less represents his net worth,” Singer writes. And I imagine that a lot of his fellow collectors are very curious about what will happen to his library now.) His most famous book as an author, Learned Pigs & Fireproof Women, includes a chapter on Arthur Lloyd, “The Human Card Index,” a vaudevillian renowned for his ability to produce anything printed on paper—a marriage license, ringside seats to a boxing match, menus, photos of royalty, membership cards for every club imaginable—from his pockets on demand. This feels now like a metaphor for the mystique of Jay himself, who fascinated me for many of the same reasons. Like most great magicians, he exuded an aura of arcane wisdom, but in his case, this impression appears to have been nothing less than the truth. Singer quotes the magician Michael Weber:

Magic is not about someone else sharing the newest secret. Magic is about working hard to discover a secret and making something out of it. You start with some small principle and you build a theatrical presentation out of it. You do something that’s technically artistic that creates a small drama. There are two ways you can expand your knowledge—through books and by gaining the confidence of fellow magicians who will explain these things. Ricky to a large degree gets his information from books—old books—and then when he performs for magicians they want to know, “Where did that come from?” And he’s appalled that they haven’t read this stuff.

As a result, Jay had the paradoxical image of a man who was immersed in the lore of magic while also keeping much of that world at arm’s length. “Clearly, Jay has been more interested in the craft of magic than in the practical exigencies of promoting himself as a performer,” Singer writes, and Jay was perfectly fine with that reputation. In Learned Pigs, Jay writes admiringly of the conjurer Max Malini:

Yet far more than Malini’s contemporaries, the famous conjurers Herrmann, Kellar, Thurston, and Houdini, Malini was the embodiment of what a magician should be—not a performer who requires a fully equipped stage, elaborate apparatus, elephants, or handcuffs to accomplish his mysteries, but one who can stand a few inches from you and with a borrowed coin, a lemon, a knife, a tumbler, or a pack of cards convince you he performs miracles.

This was obviously how Jay liked to see himself, as he says with equal affection of the magician Dai Vernon: “Making money was only a means of allowing him to sit in a hotel room and think about his art, about cups and balls and coins and cards.” Yet the reality must have been more complicated. You don’t become as famous or beloved as Ricky Jay without an inhuman degree of ambition, however carefully hidden, and he cultivated attention in ways that allowed him to maintain his air of remove. Apart from Vernon, his other essential mentor was Charlie Miller, who seems to have played the same role in the lives of other magicians that Joe Ancis, “the funniest man in New York City,” did for Lenny Bruce. Both were geniuses who hated to perform, so they practiced their art for a small handful of confidants and fellow obsessives. And the fact that Jay, by contrast, lived the kind of life that would lead him to be widely mourned by the public indicates that there was rather more to him than the reticent persona that he projected.

Jay did perform for paying audiences, of course, and Singer’s article closes with his preparations for a show, Ricky Jay and His 52 Assistants, that promises to relieve him from the “tenuous circumstances” that result from his devotion to art. (A decade later, my brother and I went to see his second Broadway production, On the Stem, which is still one of my favorite memories from a lifetime of theatergoing.) But he evidently had mixed feelings about the whole enterprise, which left him even more detached from the performers with whom he was frequently surrounded. As Weber notes: “Ricky won’t perform for magicians at magic shows, because they’re interested in things. They don’t get it. They won’t watch him and be inspired to make magic of their own. They’ll be inspired to do that trick that belongs to Ricky…There’s this large body of magic lumpen who really don’t understand Ricky’s legacy—his contribution to the art, his place in the art, his technical proficiency and creativity. They think he’s an élitist and a snob.” Or as the writer and mentalist T.A. Walters tells Singer:

Some magicians, once they learn how to do a trick without dropping the prop on their foot, go ahead and perform in public. Ricky will work on a routine a couple of years before even showing anyone. One of the things that I love about Ricky is his continued amazement at how little magicians seem to care about the art. Intellectually, Ricky seems to understand this, but emotionally he can’t accept it. He gets as upset about this problem today as he did twenty years ago.

If the remarkable life that he lived is any indication, Jay never did get over it. According to Singer, Jay once asked Dai Vernon how he dealt with the intellectual indifference of other magicians to their craft. Vernon responded: “I forced myself not to care.” And after his friend’s death, Jay said wryly: “Maybe that’s how he lived to be ninety-eight years old.”

The authoritarian personality

Note: I’m taking a few days off for Thanksgiving. This post originally appeared, in a slightly different form, on August 29, 2017.

In 1950, a group of four scholars working at UC Berkeley published a massive book titled The Authoritarian Personality. Three of its authors, including the philosopher and polymath Theodor W. Adorno, were Jewish, and the study was expressly designed to shed light on the rise of fascism and Nazism, which it conceived in large part as the manifestation of an abnormal personality syndrome magnified by mass communication. The work was immediately controversial, and some of the concerns that have been raised about its methodology—which emphasized individual pathology over social factors—appear to be legitimate. (One of its critics, the psychologist Thomas Pettigrew, conducted a study of American towns in the North and South that cast doubt on whether such traits as racism could truly be seen as mental illnesses: “You almost had to be mentally ill to be tolerant in the South. The authoritarian personality was a good explanation at the individual level, but not at the societal level.” The italics are mine.) Yet the book remains hugely compelling, and we seem to be living in a moment in which its ideas are moving back toward the center of the conversation, with attention from both ends of the political spectrum. Richard Spencer, of all people, wrote his master’s thesis on Adorno and Richard Wagner, while a bizarre conspiracy theory has emerged on the right that Adorno was the secret composer and lyricist for the Beatles. More reasonably, the New Yorker music critic Alex Ross wrote shortly after the last presidential election:

The combination of economic inequality and pop-cultural frivolity is precisely the scenario Adorno and others had in mind: mass distraction masking elite domination. Two years ago, in an essay on the persistence of the Frankfurt School, I wrote, “If Adorno were to look upon the cultural landscape of the twenty-first century, he might take grim satisfaction in seeing his fondest fears realized.” I spoke too soon. His moment of vindication is arriving now.

And when you leaf today through The Authoritarian Personality, which is available in its entirety online, you’re constantly rocked by flashes of recognition. In the chapter “Politics and Economics in the Interview Material,” before delving into the political beliefs expressed by the study’s participants, Adorno writes:

The evaluation of the political statements contained in our interview material has to be considered in relation to the widespread ignorance and confusion of our subjects in political matters, a phenomenon which might well surpass what even a skeptical observer should have anticipated. If people do not know what they are talking about, the concept of “opinion,” which is basic to any approach to ideology, loses its meaning.

Ignorance and confusion are bad enough, but they become particularly dangerous when combined with the social pressure to have an opinion about everything, which encourages people to fake their way through it. As Adorno observes: “Those who do not know but feel somehow obliged to have political opinions, because of some vague idea about the requirements of democracy, help themselves with scurrilous ways of thinking and sometimes with forthright bluff.” And he describes this bluffing and bluster in terms that should strike us as uncomfortably familiar:

The individual has to cope with problems which he actually does not understand, and he has to develop certain techniques of orientation, however crude and fallacious they may be, which help him to find his way through the dark…On the one hand, they provide the individual with a kind of knowledge, or with substitutes for knowledge, which makes it possible for him to take a stand where it is expected of him, whilst he is actually not equipped to do so. On the other hand, by themselves they alleviate psychologically the feeling of anxiety and uncertainty and provide the individual with the illusion of some kind of intellectual security, of something he can stick to even if he feels, underneath, the inadequacy of his opinions.

So what do we do when we’re expected to have opinions on subjects that we can’t be bothered to actually understand? Adorno argues that we tend to fall back on the complementary strategies of stereotyping and personification. Of the former, he writes:

Rigid dichotomies, such as that between “good and bad,” “we and the others,” “I and the world” date back to our earliest developmental phases…They point back to the “chaotic” nature of reality, and its clash with the omnipotence fantasies of earliest infancy. Our stereotypes are both tools and scars: the “bad man” is the stereotype par excellence…Modern mass communications, molded after industrial production, spread a whole system of stereotypes which, while still being fundamentally “ununderstandable” to the individual, allow him at any moment to appear as being up to date and “knowing all about it.” Thus, stereotyped thinking in political matters is almost inescapable.

Adorno was writing nearly seventy years ago, and the pressure to “know all about” politics—as well as the volume of stereotyped information being fed to consumers—has increased exponentially. But stereotypes, while initially satisfying, exist on the level of abstraction, which leads to the need for personalization as well:

[Personalization is] the tendency to describe objective social and economic processes, political programs, internal and external tensions in terms of some person identified with the case in question rather than taking the trouble to perform the impersonal intellectual operations required by the abstractness of the social processes themselves…To know something about a person helps one to seem “informed” without actually going into the matter: it is easier to talk about names than about issues, while at the same time the names are recognized identification marks for all current topics.

Adorno concludes that “spurious personalization is an ideal behavior pattern for the semi-erudite, a device somewhere in the middle between complete ignorance and that kind of ‘knowledge’ which is being promoted by mass communication and industrialized culture.” This is a tendency, needless to say, that we find on both the left and the right, and it becomes particularly prevalent in periods of maximum confusion:

The opaqueness of the present political and economic situation for the average person provides an ideal opportunity for retrogression to the infantile level of stereotypy and personalization…Stereotypy helps to organize what appears to the ignorant as chaotic: the less he is able to enter into a really cognitive process, the more stubbornly he clings to certain patterns, belief in which saves him the trouble of really going into the matter.

This seems to describe our predicament uncannily well, and I could keep listing the parallels forever. (Adorno has an entire subchapter titled “No Pity for the Poor.”) Whatever else you might think of his methods, there’s no question that he captures our current situation with frightening clarity: “As less and less actually depends on individual spontaneity in our political and social organization, the more people are likely to cling to the idea that the man is everything and to seek a substitute for their own social impotence in the supposed omnipotence of great personalities.” Most prophetically of all, Adorno draws a distinction between genuine conservatives and “pseudoconservatives,” describing the former as “supporting not only capitalism in its liberal, individualistic form but also those tenets of traditional Americanism which are definitely antirepressive and sincerely democratic, as indicated by an unqualified rejection of antiminority prejudices.” And he adds chillingly: “The pseudoconservative is a man who, in the name of upholding traditional American values and institutions and defending them against more or less fictitious dangers, consciously or unconsciously aims at their abolition.”

Written by nevalalee

November 23, 2018 at 9:00 am