Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

The Bad Pennies, Part 1

For the last couple of months, I’ve been trying to pull together the tangled ends of a story that seems so complicated—and which encompasses so many unlikely personalities—that I doubt that I’ll ever get to the bottom of it, at least not without devoting more time to the project than I currently have to spare. (It really requires a good biography, and maybe two, and in the meantime, I’m just going to throw out a few leads in the hopes that somebody else will follow up.) It centers on a man named William Herbert Sheldon, who was born in 1898 and died in 1977. Sheldon was a psychologist and numismatist who received his doctorate from the University of Chicago and studied under Carl Jung. He’s best known today for his theory of somatotypes, which classified all human beings into degrees of endomorph, mesomorph, or ectomorph, based on their physical proportions. Sheldon argued that an individual’s build was an indication of character and ability, as Ron Rosenbaum wrote over twenty years ago in a fascinating investigative piece for the New York Times:

[Sheldon] believed that every individual harbored within him different degrees of each of the three character components. By using body measurements and ratios derived from nude photographs, Sheldon believed he could assign every individual a three-digit number representing the three components, components that Sheldon believed were inborn—genetic—and remained unwavering determinants of character regardless of transitory weight change. In other words, physique equals destiny.

Sheldon’s work carried obvious overtones of eugenics, even racism, which must have been evident to many observers even at the time. (In the early twenties, Sheldon wrote a paper titled “The Intelligence of Mexican Children,” in which he asserted that “Negro intelligence” comes to a “standstill at about the tenth year.”) And these themes became even more explicit in the writings of his closest collaborator, the anthropologist Earnest A. Hooton. In the fifties, Hooton’s “research” on the physical attributes that were allegedly associated with criminality was treated with guarded respect even by the likes of Martin Gardner, who wrote in his book Fads and Fallacies:

The theory that criminals have characteristic “stigmata”—facial and bodily features which distinguish them from other men—was…revived by Professor Earnest A. Hooton, of the Harvard anthropology faculty. In a study made in the thirties, Dr. Hooton found all kinds of body correlations with certain types of criminality. For example, robbers tend to have heavy beards, diffused pigment in the iris, attached ear lobes, and six other body traits. Hooton must not be regarded as a crank, however—his work is too carefully done to fall into that category—but his conclusions have not been accepted by most of his colleagues, who think his research lacked adequate controls.

Gardner should have known better. Hooton, like Sheldon, was obsessed with dividing up human beings on the basis of their morphological characteristics, as he wrote in the Times in 1936: “Our real purpose should be to segregate and eliminate the unfit, worthless, degenerate and anti-social portion of each racial and ethnic strain in our population, so that we may utilize the substantial merits of the sound majority, and the special and diversified gifts of its superior members.”

Sheldon and Hooton’s work reached its culmination, or nadir, in one of the strangest episodes in the history of anthropology, which Rosenbaum’s article memorably calls “The Great Ivy League Nude Posture Photo Scandal.” For decades, such institutions as Harvard College had photographed incoming freshmen in the nude, supposedly to look for signs of scoliosis and rickets. Sheldon and Hooton took advantage of these existing programs to take nude photos of students, both male and female, at colleges including Harvard, Radcliffe, Princeton, Yale, Wellesley, and Vassar, ostensibly to study posture, but really to gather raw data for their work on somatotypes. The project went on for decades, and Rosenbaum points out that the number of famous alumni who had their pictures taken staggers the imagination: “George Bush, George Pataki, Brandon Tartikoff and Bob Woodward were required to do it at Yale. At Vassar, Meryl Streep; at Mount Holyoke, Wendy Wasserstein; at Wellesley, Hillary Rodham and Diane Sawyer.” After some diligent sleuthing, Rosenbaum determined that most of these photographs were later destroyed, but a collection of negatives survived at the National Anthropological Archives in the Smithsonian, where he was ultimately allowed to view some of them. He writes of the experience:

As I thumbed rapidly through box after box to confirm that the entries described in the Finder’s Aid were actually there, I tried to glance at only the faces. It was a decision that paid off, because it was in them that a crucial difference between the men and the women revealed itself. For the most part, the men looked diffident, oblivious. That’s not surprising considering that men of that era were accustomed to undressing for draft physicals and athletic-squad weigh-ins. But the faces of the women were another story. I was surprised at how many looked deeply unhappy, as if pained at being subjected to this procedure. On the faces of quite a few I saw what looked like grimaces, reflecting pronounced discomfort, perhaps even anger.

And it’s clearly the women who bore the greatest degree of lingering humiliation and fear. Rumors circulated for years that the pictures had been stolen and sold, and such notable figures as Nora Ephron, Sally Quinn, and Judith Martin speak candidly to Rosenbaum of how they were haunted by these memories. (Quinn tells him: “You always thought when you did it that one day they’d come back to haunt you. That twenty-five years later, when your husband was running for president, they’d show up in Penthouse.” For the record, according to Rosenbaum, when the future Hillary Clinton attended Wellesley, undergraduates were allowed to take the pictures “only partly nude.”) Rosenbaum captures the unsavory nature of the entire program in terms that might have been published yesterday:

Suddenly the subjects of Sheldon’s photography leaped into the foreground: the shy girl, the fat girl, the religiously conservative, the victim of inappropriate parental attention…In a culture that already encourages women to scrutinize their bodies critically, the first thing that happens to these women when they arrive at college is an intrusive, uncomfortable, public examination of their nude bodies.

If William Herbert Sheldon’s story had ended there, it would be strange enough, but there’s a lot more to be told. I haven’t even mentioned his work as a numismatist, which led to a stolen penny scandal that rocked the world of coin collecting as deeply as the nude photos did the Ivy League. But the real reason I wanted to talk about him involves one of his protégés, whom Sheldon met while teaching in the fifties at Columbia. His name was Walter H. Breen, who later married the fantasy author Marion Zimmer Bradley—which leads us in turn to one of the darkest episodes in the entire history of science fiction. I’ll be talking more about this tomorrow.

Written by nevalalee

December 3, 2018 at 8:35 am

The Men Who Saw Tomorrow, Part 1

If there’s a single theme that runs throughout my book Astounding, it’s the two sides of the editor John W. Campbell. These days, Campbell tends to be associated with highly technical “hard” science fiction with an emphasis on physics and engineering, but he had an equally dominant mystical side, and from the beginning, you often see the same basic impulses deployed in both directions. (After the memory of his career had faded, much of this history was quietly revised, as Algis Budrys notes in Benchmarks Revisited: “The strong mystical bent displayed among even the coarsest cigar-chewing technists is conveniently overlooked, and Campbell’s subsequent preoccupation with psionics is seen as an inexplicable deviation from a life of hitherto unswerving straight devotion to what we all agree is reasonability.”) As an undergraduate at M.I.T. and Duke, Campbell was drawn successively to Norbert Wiener, the founder of cybernetics, and Joseph Rhine, the psychologist best known for his statistical studies of telepathy. Both professors fed into his fascination with a possible science of the mind, but along strikingly different lines, and he later pursued both dianetics, which he originally saw as a kind of practical cybernetics, and explorations of psychic powers. Much the same holds true of his other great obsession—the problem of foreseeing the future. As I discuss today in an essay in the New York Times, its most famous manifestation was the notion of psychohistory, the fictional science of prediction in Asimov’s Foundation series. But at a time of global uncertainty, it wasn’t the method of forecasting that counted, but the accuracy of the results, and even as Campbell was collaborating with Asimov, his interest in prophecy was taking him to even stranger places.

The vehicle for the editor’s more mystical explorations was Unknown, the landmark fantasy pulp that briefly channeled these inclinations away from the pages of Astounding. (In my book, I argue that the simultaneous existence of these two titles purified science fiction at a crucial moment, and that the entire genre might have evolved in altogether different ways if Campbell had been forced to express all sides of his personality in a single magazine.) As I noted here the other day, in an attempt to attract a wider audience, Campbell removed the cover paintings from Unknown, hoping to make it look like a more mainstream publication. The first issue with the revised design was dated July 1940, and in his editor’s note, Campbell explicitly addressed the “new discoverers” who were reading the magazine for the first time. He grandly asserted that fantasy represented “a completely untrammeled literary medium,” and as an illustration of the kinds of subjects that he intended to explore in his stories, he offered a revealing example:

Until somebody satisfactorily explains away the unquestionable masses of evidence showing that people do have visions of things yet to come, or of things occurring at far-distant points—until someone explains how Nostradamus, the prophet, predicted things centuries before they happened with such minute detail (as to names of people not to be born for half a dozen generations or so!) that no vague “Oh, vague generalities—things are always happening that can be twisted to fit!” can possibly explain them away—until the time those are docketed and labeled and neatly filed—they belong to The Unknown.

It was Campbell’s first mention in print of Nostradamus, the sixteenth-century French prophet, but it wouldn’t be the last. A few months later, Campbell alluded in another editorial to the Moberly-Jourdain incident, in which two women claimed to have traveled over a century back in time on a visit to the Palace of Versailles. The editor continued: “If it happens one way—how about the other? How about someone slipping from the past to the future? It is known—and don’t condemn till you’ve read a fair analysis of the old man’s works—that Nostradamus, the famous French prophet, did not guess at what might happen; he recorded what did happen—before it happened. His accuracy of prophecy runs considerably better, actually, than the United States government crop forecasts, in percentage, and the latter are certainly used as a basis for business.” Campbell then drew a revealing connection between Nostradamus and the war in Europe:

Incidentally, to avoid disappointment, Nostradamus did not go into much detail about this period. He was writing several hundred years ago, for people of that time—and principally for Parisians. He predicted in some detail the French Revolution, predicted several destructions of Paris—which have come off on schedule, to date—and did not predict destruction of Paris for 1940. He did, however, for 1999—by a “rain of fire from the East.” Presumably he didn’t have any adequate terms for airplane bombs, so that may mean thermite incendiaries. But the present period, too many centuries from his own times, would be of minor interest to him, and details are sketchy. The prophecy goes up to about the thirty-fifth century.

And the timing was highly significant. Earlier that year, Campbell had published the nonfiction piece “The Science of Whithering” by L. Sprague de Camp in Astounding, shortly after German troops marched into Paris. De Camp’s article, which discussed the work of such cyclical historians as Spengler and Toynbee, represented the academic or scientific approach the problem of forecasting, and it would soon find its fictional expression in such stories as Jack Williamson’s “Breakdown” and Asimov’s “Foundation.” As usual, however, Campbell was playing both sides, and he was about to pursue a parallel train of thought in Unknown that has largely been forgotten. Instead of attempting to explain Nostradamus in rational terms, Campbell ventured a theory that was even more fantastic than the idea of clairvoyance:

Occasionally a man—vanishes…And somehow, he falls into another time. Sometimes future—sometimes past. And sometimes he comes back, sometimes he doesn’t. If he does come back, there’d be a tendency, and a smart one, to shut up; it’s mighty hard to prove. Of course, if he’s a scholarly gentleman, he might spend his unintentional sojourn in the future reading histories of his beloved native land. Then, of course, he ought to be pretty accurate at predicting revolutions and destruction of cities. Even be able to name inconsequential details, as Nostradamus did.

To some extent, this might have been just a game that he was playing for his readers—but not completely. Campbell’s interest in Nostradamus was very real, and just as he had used Williamson and Asimov to explore psychohistory, he deployed another immensely talented surrogate to look into the problem of prophecy. His name was Anthony Boucher. I’ll be exploring this in greater detail tomorrow.

Note: Please join me today at 12:00pm ET for a Twitter AMA to celebrate the release of the fantastic new horror anthology Terror at the Crossroads, which includes my short story “Cryptids.”

Fire and Fury

I’ve been thinking a lot recently about Brian De Palma’s horror movie The Fury, which celebrated its fortieth anniversary earlier this year. More specifically, I’ve been thinking about Pauline Kael’s review, which is one of the pieces included in her enormous collection For Keeps. I’ve read that book endlessly for two decades now, and as a result, The Fury is one of those films from the late seventies—like Philip Kaufman’s Invasion of the Body Snatchers—that endure in my memory mostly as a few paragraphs of Kael’s prose. In particular, I often find myself remembering these lines:

De Palma is the reverse side of the coin from Spielberg. Close Encounters gives us the comedy of hope. The Fury is the comedy of cruelly dashed hope. With Spielberg, what happens is so much better than you dared hope that you have to laugh; with De Palma, it’s so much worse than you feared that you have to laugh.

That sums up how I feel about a lot of things these days, when everything is consistently worse than I could have imagined, although laughter usually feels very far away. (Another line from Kael inadvertently points to the danger of identifying ourselves with our political heroes: “De Palma builds up our identification with the very characters who will be destroyed, or become destroyers, and some people identified so strongly with Carrie that they couldn’t laugh—they felt hurt and betrayed.”) And her description of one pivotal scene, which appears in her review of Dressed to Kill, gets closer than just about anything else to my memories of the last presidential election: “There’s nothing here to match the floating, poetic horror of the slowed-down sequence in which Amy Irving and Carrie Snodgress are running to freedom: it’s as if each of them and each of the other people on the street were in a different time frame, and Carrie Snodgress’s face is full of happiness just as she’s flung over the hood of a car.”

The Fury seems to have been largely forgotten by mainstream audiences, but references to it pop up in works ranging from Looper to Stranger Things, and I suspect that it might be due for a reappraisal. It’s about two teenagers, a boy and a girl, who have never met, but who share a psychic connection. As Kael notes, they’re “superior beings” who might have been prophets or healers in an earlier age, but now they’ve been targeted by our “corrupt government…which seeks to use them for espionage, as secret weapons.” Reading this now, I’m slightly reminded of our current administration’s unapologetic willingness to use vulnerable families and children as political pawns, but that isn’t really the point. What interests me more is how De Palma’s love of violent imagery undercuts the whole moral arc of the movie. I might call this a problem, except that it isn’t—it’s a recurrent feature of his work that resonated uneasily with viewers who were struggling to integrate the specter of institutionalized violence into their everyday lives. (In a later essay, Kael wrote of acquaintances who resisted such movies because of their association with the “guilty mess” of the recently concluded war: “There’s a righteousness in their tone when they say they don’t like violence; I get the feeling that I’m being told that my urging them to see The Fury means that I’ll be responsible if there’s another Vietnam.”) And it’s especially striking in this movie, which for much of its length is supposedly about an attempt to escape this cycle of vengeance. Of the two psychic teens, Robyn, played by Andrew Stevens, eventually succumbs to it, while Gillian, played by Amy Irving, fights it for as long as she can. As Kael explains: “Both Gillian and Robyn have the power to zap people with their minds. Gillian is trying to cling to her sanity—she doesn’t want to hurt anyone. And, knowing that her power is out of her conscious control, she’s terrified of her own secret rages.”

And it’s hard for me to read this passage now without connecting it to the ongoing discussion over women’s anger, in which the word “fury” occurs with surprising frequency. Here’s the journalist Rebecca Traister writing in the New York Times, in an essay adapted from her bestselling book Good and Mad:

Fury was a tool to be marshaled by men like Judge Kavanaugh and Senator Graham, in defense of their own claims to political, legal, public power. Fury was a weapon that had not been made available to the woman who had reason to question those claims…Most of the time, female anger is discouraged, repressed, ignored, swallowed. Or transformed into something more palatable, and less recognizable as fury—something like tears. When women are truly livid, they often weep…This political moment has provoked a period in which more and more women have been in no mood to dress their fury up as anything other than raw and burning rage.

Traister’s article was headlined: “Fury Is a Political Weapon. And Women Need to Wield It.” And if you were so inclined, you could take The Fury as an extended metaphor for the issue that Casey Cep raises in her recent roundup of books on the subject in The New Yorker: “A major problem with anger is that some people are allowed to express it while others are not.” In the film, Gillian spends most of the movie resisting her violent urges, while her male psychic twin gives in to them, and the climax—which is the only scene that most viewers remember—hinges on her embrace of the rage that Robyn passed to her at the moment of his death.

This brings us to Childress, the villain played by John Cassavetes, whose demise Kael hyperbolically describes as “the greatest finish for any villain ever.” A few paragraphs earlier, Kael writes of this scene:

This is where De Palma shows his evil grin, because we are implicated in this murderousness: we want it, just as we wanted to see the bitchy Chris get hers in Carrie. Cassavetes is an ideal villain (as he was in Rosemary’s Baby)—sullenly indifferent to anything but his own interests. He’s so right for Childress that one regrets that there wasn’t a real writer around to match his gloomy, viscous nastiness.

“Gloomy, viscous nastiness” might ring a bell today, and Childress’s death—Gillian literally blows him up with her mind—feels like the embodiment of our impulses for punishment, revenge, and retribution. It’s stunning how quickly the movie discards Gillian’s entire character arc for the sake of this moment, but what makes the ending truly memorable is what happens next, which is nothing. Childress explodes, and the film just ends, because it has nothing left to show us. That works well enough in a movie, but in real life, we have to face the problem of what Brittney Cooper, whose new book explicitly calls rage a superpower, sums up as “what kind of world we want to see, not just what kind of things we want to get rid of.” In her article in The New Yorker, Cep refers to the philosopher and classicist Martha Nussbaum’s treatment of the Furies themselves, who are transformed at the end of the Oresteia into the Eumenides, “beautiful creatures that serve justice rather than pursue cruelty.” It isn’t clear how this transformation takes place, and De Palma, typically, sidesteps it entirely. But if we can’t imagine anything beyond cathartic vengeance, we’re left with an ending closer to what Kael writes of Dressed to Kill: “The spell isn’t broken and [De Palma] doesn’t fully resolve our fear. He’s saying that even after the horror has been explained, it stays with you—the nightmare never ends.”

Written by nevalalee

October 30, 2018 at 9:24 am

The chosen ones

In his recent New Yorker profile of Mark Zuckerberg, Evan Osnos quotes one of the Facebook founder’s close friends: “I think Mark has always seen himself as a man of history, someone who is destined to be great, and I mean that in the broadest sense of the term.” Zuckerberg has “a teleological frame of feeling almost chosen,” and in his case, it happened to be correct. Yet this tells us almost nothing about Zuckerberg himself, because I can safely say that most other undergraduates at Harvard feel the same way. A writer for The Simpsons once claimed that the show had so many presidential jokes—like the one about Grover Cleveland spanking Grandpa “on two non-consecutive occasions”—because most of the writers secretly once thought that they would be president themselves, and he had a point. It’s very hard to do anything interesting in life without the certainty that you’re somehow one of the chosen ones, even if your estimation of yourself turns out to be wildly off the mark. (When I was in my twenties, my favorite point of comparison was Napoleon, while Zuckerberg seems to be more fond of Augustus: “You have all these good and bad and complex figures. I think Augustus is one of the most fascinating. Basically, through a really harsh approach, he established two hundred years of world peace.”) This kind of conviction is necessary for success, although hardly sufficient. The first human beings to walk on Mars may have already been born. Deep down, they know it, and this knowledge will determine their decisions for the rest of their lives. Of course, thousands of others “know” it, too. And just a few of them will turn out to be right.

One of my persistent themes on this blog is how we tend to confuse talent with luck, or, more generally, to underestimate the role that chance plays in success or failure. I never tire of quoting the psychologist Daniel Kahneman, who in Thinking, Fast and Slow shares what he calls his favorite equation:

Success = Talent + Luck
Great Success = A little more talent + A lot of luck
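
The point can be made concrete with a toy simulation of my own devising, which isn’t drawn from Kahneman’s book: even if we assume that success is weighted overwhelmingly toward talent, the people at the very top of the resulting distribution still turn out to have been unusually lucky. The weights, the population size, and the cutoff below are arbitrary assumptions for the sketch.

```python
# An illustrative toy model (mine, not Kahneman's): rank a large population
# by a success score that is almost entirely talent, then check how lucky
# the top sliver turned out to be. All numbers here are arbitrary assumptions.
import random

def average_luck_at_the_top(n=100_000, talent_weight=0.95, luck_weight=0.05, seed=1):
    random.seed(seed)
    # Each person gets a talent score and a luck score between 0 and 100.
    people = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(n)]
    # Success = mostly talent, a little luck.
    ranked = sorted(people, key=lambda p: talent_weight * p[0] + luck_weight * p[1], reverse=True)
    top = ranked[: n // 1000]  # the top 0.1 percent: "great success"
    return sum(luck for _, luck in top) / len(top)

if __name__ == "__main__":
    print(f"average luck among the top 0.1%: {average_luck_at_the_top():.1f} / 100")
```

With these numbers, the average luck score among the top performers lands well above the population mean of fifty, which is all the equation really claims: the very successful were talented and lucky.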

The truth of this statement seems incontestable. Yet we’re all reluctant to acknowledge its power in our own lives, and this tendency only increases as the roles played by luck and privilege assume a greater importance. This week has been bracketed by news stories about two men who embody this attitude at its most extreme. On the one hand, you have Brett Kavanaugh, a Yale legacy student who seems unable to recognize that his drinking and his professional success weren’t mutually exclusive, but closer to the opposite. He occupied a cultural and social stratum that gave him the chance to screw up repeatedly without lasting consequences, and we’re about to learn how far that privilege truly extends. On the other hand, you have yesterday’s New York Times exposé of Donald Trump, who took hundreds of millions of dollars from his father’s real estate empire—often in the form of bailouts for his own failed investments—while constantly describing himself as a self-made billionaire. This is hardly surprising, but it’s still striking to see the extent to which Fred Trump played along with his son’s story. He understood the value of that myth.

This gets at an important point about privilege, no matter which form it takes. We have a way of visualizing these matters in spatial terms—”upper class,” “lower class,” “class pyramid,” “rising,” “falling,” or “stratum” in the sense that I used it above. But true privilege isn’t spatial, but temporal. It unfolds over time, by giving its beneficiaries more opportunities to fail and recover, when those living at the edge might not be able to come back from the slightest misstep. We like to say that a privileged person is someone who was born on third base and thinks he hit a triple, but it’s more like being granted unlimited turns at bat. Kavanaugh provides a vivid reminder, in case we needed one, that a man who fits a certain profile has the freedom to make all kinds of mistakes, the smallest of which would be fatal for someone who didn’t look like he did. And this doesn’t just apply to drunken misbehavior, criminal or otherwise, but even to the legitimate failures that are necessary for the vast majority of us to achieve real success. When you come from the right background, it’s easier to survive for long enough to benefit from the effects of luck, which influences the way that we talk about failure itself. Silicon Valley speaks of “failing faster,” which only makes sense when the price of failure is humiliation or the loss of investment capital, not falling permanently out of the middle class. And as I’ve noted before, Pixar’s creative philosophy, which Andrew Stanton described as a process in which “the films still suck for three out of the four years it takes to make them,” is only practicable for filmmakers who look and sound like their counterparts at the top, which grants them the necessary creative freedom to fail repeatedly—a luxury that women are rarely granted.

This may all come across as unbelievably depressing, but there’s a silver lining, and it took me years to figure it out. The odds of succeeding in any creative field—which includes nearly everything in which the standard career path isn’t clearly marked—are minuscule. Few who try will ever make it, even if they have “a teleological frame of feeling almost chosen.” This isn’t due to a lack of drive or talent, but of time and second chances. When you combine the absence of any straightforward instructions with the crucial role played by luck, you get a process in which repeated failure over a long period is almost inevitable. Those who drop out don’t suffer from weak nerves, but from the fact that they’ve used up all of their extra lives. Privilege allows you to stay in the game for long enough for the odds to turn in your favor, and if you’ve got it, you may as well use it. (An Ivy League education doesn’t guarantee success, but it drastically increases your ability to stick around in the middle class in the meantime.) In its absence, you can find strategies for minimizing risk in small ways while increasing it on the highest levels, which is just another word for becoming a bohemian. And the big takeaway here is that since the probability of success is already so low, you may as well do exactly what you want. It can be tempting to tailor your work to the market, reasoning that it will increase your chances ever so slightly, but in reality, the difference is infinitesimal. An objective observer would conclude that you’re not going to make it either way, and even if you do, it will take about the same amount of time to succeed by selling out as it would by staying true to yourself. You should still do everything that you can to make the odds more favorable, but if you’re probably going to fail anyway, you might as well do it on your own terms. And that’s the only choice that matters.

Written by nevalalee

October 3, 2018 at 8:59 am

The Order of St. John’s

When I think back on my personal experience with the great books, as I did here the other day, I have to start with the six weeks that I spent as a high school junior at St. John’s College in Annapolis, Maryland. As I’ve discussed in greater detail before, I had applied to the Telluride Association Summer Program on the advice of my guidance counselor. It was an impulsive decision, but I was accepted, and I don’t think it’s an exaggeration to call it one of the three or four most significant turning points in my entire life. I was more than primed for a program like this—I had just bought my own set of the Great Books of the Western World at a church book sale—and I left with my head full of the values embodied by the college, which still structures its curriculum around a similar notion of the Western Canon. Throughout the summer, I attended seminars with seventeen other bright teenagers, and as we worked our way from Plato’s Cratylus through Wittgenstein’s Philosophical Investigations, it all seemed somehow normal. I more or less assumed that this was how college would be, which wasn’t entirely true, although I did my best to replicate the experience. Looking back, in fact, I suspect that my time at St. John’s was more responsible than any other factor for allowing me to attend the college of my choice, and it certainly played a role in my decision to major in classics. But it’s only now that I can fully appreciate how much privilege went into each stage in that process. It came down to a series of choices, which I was able to make freely, and while I don’t think I always acted correctly, I’m amazed at how lucky I was, and how the elements of a liberal education itself managed to obscure that crucial point.

I’ve been thinking about this recently because of an article in the New York Times by Frank Bruni, who paid a visit to the sister campus of St. John’s College in Santa Fe. He opens with a description that certainly would have appealed to my adolescent self, although probably not to most other teenagers:

Have I got a college for you. For your first two years, your regimen includes ancient Greek. And I do mean Greek, the language, not Greece, the civilization, though you’ll also hang with Aristotle, Aeschylus, Thucydides and the rest of the gang. There’s no choice in the matter. There’s little choice, period…You have no major, only “the program,” an exploration of the Western canon that was implemented in 1937 and has barely changed…It’s an increasingly exotic and important holdout against so many developments in higher education—the stress on vocational training, the treatment of students as fickle consumers, the elevation of individualism over a shared heritage—that have gone too far. It’s a necessary tug back in the other direction.

More than twenty years after I spent the summer there, the basic pitch for the college doesn’t seem to have changed. Its fans still draw a pointed comparison between the curriculum at St. John’s and the supposedly more “consumerist” approach of most undergraduate programs, and it tends to define itself in sharp contrast to the touchy-feely world around it. “Let your collegiate peers elsewhere design their own majors and frolic with Kerouac,” Bruni writes. “For you it’s Kant.”

Yet it isn’t hard to turn this argument on its head, or to recognize that there’s a real sense in which St. John’s might be one of the most individualistic and consumerist colleges in the entire country. (The article itself is headlined “The Most Contrarian College in America,” while Bruni writes that he was drawn to it “out of respect for its orneriness.” And a school for ornery contrarians sounds pretty individualistic to me.) We can start with the obvious point that “the stress on vocational training” at other colleges is the result of economic anxiety at a time of rising tuitions and crippling student loans. There’s tremendous pressure to turn students away from the humanities, and it isn’t completely unjustified. The ability to major in classics or philosophy reflects a kind of privilege in itself, at least in the form of the absence of some of those pressures, and it isn’t always about money. For better or worse, reading the great books is just about the most individualistic gesture imaginable, and its supposed benefits—what the dean of the Santa Fe campus characterizes as the creation of “a more thoughtful, reflective, self-possessed and authentic citizen, lover, partner, parent and member of the global economy”—are obsessively focused on the self. The students at St. John’s may not have the chance to shop around for classes once they get there, but they made a vastly more important choice as a consumer long before they even arrived. A choice of college amounts to a lot of things, but it’s certainly an act with financial consequences. In many cases, it’s the largest purchase that any of us will ever make. The option of spending one’s college years reading Hobbes and Spinoza at considerable cost doesn’t even factor into the practical or economic universe of most families, and it would be ridiculous to claim otherwise.

In other words, every student at St. John’s exercised his or her power in the academic marketplace when it mattered most. By comparison, the ability to tailor one’s class schedule seems like a fairly minor form of consumerism—which doesn’t detract from the quality of the product, which is excellent, as it should be at such prices. (Bruni notes approvingly that the college recently cut its annual tuition from $52,000 to $35,000, which I applaud, although it doesn’t change my underlying point.) But it’s difficult to separate the value of such an education from the existing qualities required for a high schooler to choose it in the first place. It’s hard for me to imagine a freshman at St. John’s who wasn’t intelligent, motivated, and individualistic, none of which would suffer from four years of immersion in the classics. They’re already lucky, which is a lesson that the great books won’t teach on their own. The Great Conversation tends to take place within a circle of authors who have been chosen for their resemblance to one another, or for how well they fit into a cultural narrative imposed on them after the fact, as Robert Maynard Hutchins writes in the introduction to Great Books of the Western World: “The set is almost self-selected, in the sense that one book leads to another, amplifying, modifying, or contradicting it.” And that’s fine. But it means that you rarely see these authors marveling over their own special status, which they take for granted. For a canon that consists entirely of books written by white men, there’s remarkably little discussion of privilege, because they live in it like fish in water—which is as good an argument for diversity as any I can imagine. The students at St. John’s may ask these hard questions about themselves, but if they do, it’s despite what they read, not because of it. Believe me, I should know.

Written by nevalalee

September 20, 2018 at 9:02 am

The end of flexibility

A few days ago, I picked up my old paperback copy of Steps to an Ecology of Mind, which collects the major papers of the anthropologist and cyberneticist Gregory Bateson. I’ve been browsing through this dense little volume since I was in my teens, but I’ve never managed to work through it all from beginning to end, and I turned to it recently out of a vague instinct that it was somehow what I needed. (Among other things, I’m hoping to put together a collection of my short stories, and I’m starting to see that many of Bateson’s ideas are relevant to the themes that I’ve explored as a science fiction writer.) I owe my introduction to his work, as with so many other authors, to Stewart Brand of The Whole Earth Catalog, who advised in one edition:

[Bateson] wandered thornily in and out of various disciplines—biology, ethnology, linguistics, epistemology, psychotherapy—and left each of them altered with his passage. Steps to an Ecology of Mind chronicles that journey…In recommending the book I’ve learned to suggest that it be read backwards. Read the broad analyses of mind and ecology at the end of the book and then work back to see where the premises come from.

This always seemed reasonable to me, so when I returned to it last week, I flipped immediately to the final paper, “Ecology and Flexibility in Urban Civilization,” which was first presented in 1970. I must have read it at some point—I’ve quoted from it several times on this blog before—but as I looked over it again, I found that it suddenly seemed remarkably urgent. As I had suspected, it was exactly what I needed to read right now. And its message is far from reassuring.

Bateson’s central point, which seems hard to deny, revolves around the concept of flexibility, or “uncommitted potentiality for change,” which he identifies as a fundamental quality of any healthy civilization. In order to survive, a society has to be able to evolve in response to changing conditions, to the point of rethinking even its most basic values and assumptions. Bateson proposes that any kind of planning for the future include a budget for flexibility itself, which is what enables the system to change in response to pressures that can’t be anticipated in advance. He uses the analogy of an acrobat who moves his arms between different positions of temporary instability in order to remain on the wire, and he notes that a viable civilization organizes itself in ways that allow it to draw on such reserves of flexibility when needed. (One of his prescriptions, incidentally, serves as a powerful argument for diversity as a positive good in its own right: “There shall be diversity in the civilization, not only to accommodate the genetic and experiential diversity of persons, but also to provide the flexibility and ‘preadaptation’ necessary for unpredictable change.”) The trouble is that a system tends to eat up its own flexibility whenever a single variable becomes inflexible, or “uptight,” compared to the rest:

Because the variables are interlinked, to be uptight in respect to one variable commonly means that other variables cannot be changed without pushing the uptight variable. The loss of flexibility spreads throughout the system. In extreme cases, the system will only accept those changes which change the tolerance limits for the uptight variable. For example, an overpopulated society looks for those changes (increased food, new roads, more houses, etc.) which will make the pathological and pathogenic conditions of overpopulation more comfortable. But these ad hoc changes are precisely those which in longer time can lead to more fundamental ecological pathology.

When I consider these lines now, it’s hard for me not to feel deeply unsettled. Writing in the early seventies, Bateson saw overpopulation as the most dangerous source of stress in the global system, and these days, we’re more likely to speak of global warming, resource depletion, and income inequality. Change a few phrases here and there, however, and the situation seems largely the same: “The pathologies of our time may broadly be said to be the accumulated results of this process—the eating up of flexibility in response to stresses of one sort or another…and the refusal to bear with those byproducts of stress…which are the age-old correctives.” Bateson observes, crucially, that the inflexible variables don’t need to be fundamental in themselves—they just need to resist change long enough to become a habit. Once we find it impossible to imagine life without fossil fuels, for example, we become willing to condone all kinds of other disruptions to keep that one hard-programmed variable in place. A civilization naturally tends to expand into any available pocket of flexibility, blowing through the budget that it should have been holding in reserve. The result is a society structured along lines that are manifestly rigid, irrational, indefensible, and seemingly unchangeable. As Bateson puts it grimly:

Civilizations have risen and fallen. A new technology for the exploitation of nature or a new technique for the exploitation of other men permits the rise of a civilization. But each civilization, as it reaches the limits of what can be exploited in that particular way, must eventually fall. The new invention gives elbow room or flexibility, but the using up of that flexibility is death.

And it’s difficult for me to read this today without thinking of all the aspects of our present predicament—political, environmental, social, and economic. Since Bateson sounded his warning half a century ago, we’ve consumed our entire budget of flexibility, largely in response to a single hard-programmed variable that undermined all the other factors that it was meant to sustain. At its best, the free market can be the best imaginable mechanism for ensuring flexibility, by allocating resources more efficiently than any system of central planning ever could. (As one prominent politician recently said to The Atlantic: “I love competition. I want to see every start-up business, everybody who’s got a good idea, have a chance to get in the market and try…Really what excites me about markets is competition. I want to make sure we’ve got a set of rules that lets everybody who’s got a good, competitive idea get in the game.” It was Elizabeth Warren.) When capital is concentrated beyond reason, however, and solely for its own sake, it becomes a weapon that can be used to freeze other cultural variables into place, no matter how much pain it causes. As the anonymous opinion writer indicated in the New York Times last week, it will tolerate a president who demeans the very idea of democracy itself, as long as it gets “effective deregulation, historic tax reform, a more robust military and more,” because it no longer sees any other alternative. And this is where it gets us. For most of my life, I was ready to defend capitalism as the best system available, as long as its worst excesses were kept in check by measures that Bateson dismissively describes as “legally slapping the wrists of encroaching authority.” I know now that these norms were far more fragile than I wanted to acknowledge, and it may be too late to recover. Bateson writes: “Either man is too clever, in which case we are doomed, or he was not clever enough to limit his greed to courses which would not destroy the ongoing total system. I prefer the second hypothesis.” And I do, too. But I no longer really believe it.

The paper of record

One of my favorite conventions in suspense fiction is the trope known as Authentication by Newspaper. It’s the moment in a movie, novel, or television show—and sometimes even in reality—when the kidnapper sends a picture of the victim holding a copy of a recent paper, with the date and headline clearly visible, as a form of proof of life. (You can also use it with piles of illicit cash, to prove that you’re ready to send payment.) The idea frequently pops up in such movies as Midnight Run and Mission: Impossible 2, and it also inspired a classic headline from The Onion: “Report: Majority Of Newspapers Now Purchased By Kidnappers To Prove Date.” It all depends on the fact that a newspaper is a datable object that is widely available and impossible to fake in advance, which means that it can be used to definitively establish the earliest possible day in which an event could have taken place. And you can also use the paper to verify a past date in subtler ways. A few weeks ago, Motherboard had a fascinating article on a time-stamping service called Surety, which provides the equivalent of a dated seal for digital documents. To make it impossible to change the date on one of these files, every week, for more than twenty years, Surety has generated a public hash value from its internal client database and published it in the classified ad section of the New York Times. As the company notes: “This makes it impossible for anyone—including Surety—to backdate timestamps or validate electronic records that were not exact copies of the original.”
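
Out of curiosity, here’s a minimal sketch of how that kind of hash-based time-stamping can work in principle. It isn’t Surety’s actual system, and the function names and the choice of SHA-256 are my own assumptions: each record is hashed, the hashes are folded into a single weekly digest, and that digest is published somewhere public and widely archived, like a newspaper, so that nobody can later pretend a record existed before it did.

```python
# A minimal sketch of hash-based time-stamping (not Surety's actual code):
# hash each record, fold the hashes into one weekly digest, and publish that
# digest in a widely witnessed medium such as a newspaper's classified ads.
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of some bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

def weekly_digest(record_hashes: list) -> str:
    """Chain a week's worth of record hashes into a single publishable value."""
    digest = sha256_hex(b"")
    for h in sorted(record_hashes):
        digest = sha256_hex((digest + h).encode("utf-8"))
    return digest

def was_included(record: bytes, record_hashes: list, published: str) -> bool:
    """Verify that a record was part of a digest published on a known date."""
    return sha256_hex(record) in record_hashes and weekly_digest(record_hashes) == published

if __name__ == "__main__":
    records = [b"contract draft v3", b"lab notebook, week 12", b"source snapshot"]
    hashes = [sha256_hex(r) for r in records]
    published = weekly_digest(hashes)  # imagine this string printed in the paper
    print(was_included(b"lab notebook, week 12", hashes, published))  # True
    print(was_included(b"backdated memo", hashes, published))         # False
```

Anyone who saved that week’s paper can later recompute the digest and confirm that a given record, and not some backdated substitute, was in the batch.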

I was reminded of all this yesterday, after the Times posted an anonymous opinion piece titled “I Am Part of the Resistance Inside the Trump Administration.” The essay, which the paper credits to “a senior official,” describes what amounts to a shadow government within the White House devoted to saving the president—and the rest of the country—from his worst impulses. And while the author may prefer to remain nameless, he certainly doesn’t suffer from a lack of humility:

Many of the senior officials in [Trump’s] own administration are working diligently from within to frustrate parts of his agenda and his worst inclinations. I would know. I am one of them…It may be cold comfort in this chaotic era, but Americans should know that there are adults in the room. We fully recognize what is happening. And we are trying to do what’s right even when Donald Trump won’t.

The result, he claims, is “a two-track presidency,” with a group of principled advisors doing their best to counteract Trump’s admiration for autocrats and contempt for international relations: “This isn’t the work of the so-called deep state. It’s the work of the steady state.” He even reveals that there was early discussion among cabinet members of using the Twenty-Fifth Amendment to remove Trump from office, although it was scuttled by concern about precipitating a crisis somehow worse than the one in which we’ve found ourselves.

Not surprisingly, the piece has generated a firestorm of speculation about the author’s identity, both online and in the White House itself, which I won’t bother covering here. What interests me are the writer’s reasons for publishing it in the first place. Over the short term, it can only destabilize an already volatile situation, and everyone involved will suffer for it. This implies that the author has a long game in mind, and it had better be pretty compelling. On Twitter, Nate Silver proposed one popular theory: “It seems like the person’s goal is to get outed and secure a very generous advance on a book deal.” He may be right—although if that’s the case, the plan has quickly gone sideways. Reaction on both sides has been far more critical than positive, with Erik Wemple of the Washington Post perhaps putting it best:

Like most anonymous quotes and tracts, this one is a PR stunt. Mr. Senior Administration Official gets to use the distributive power of the New York Times to recast an entire class of federal appointees. No longer are they enablers of a foolish and capricious president. They are now the country’s most precious and valued patriots. In an appearance on Wednesday afternoon, the president pronounced it all a “gutless” exercise. No argument here.

Or as the political blogger Charles P. Pierce says even more savagely in his response on Esquire: “Just shut up and quit.”

But Wemple’s offhand reference to “the distributive power” of the Times makes me think that the real motive is staring us right in the face. It’s a form of Authentication by Newspaper. Let’s say that you’re a senior official in the Trump administration who knows that time is running out. You’re afraid to openly defy the president, but you also want to benefit—or at least to survive—after the ship goes down. In the aftermath, everyone will be scrambling to position themselves for some kind of future career, even though the events of the last few years have left most of them irrevocably tainted. By the time it falls apart, it will be too late to claim that you were gravely concerned. But the solution is a stroke of genius. You plant an anonymous piece in the Times, like the founders of Surety publishing its hash value in the classified ads, except that your platform is vastly more prominent. And you place it there precisely so that you can point to it in the future. After Trump is no longer a threat, you can reveal yourself, with full corroboration from the paper of record, to show that you had the best interests of the country in mind all along. You were one of the good ones. The datestamp is right there. That’s your endgame, no matter how much pain it causes in the meantime. It’s brilliant. But it may not work. As nearly everyone has realized by now, the fact that a “steady state” of conservatives is working to minimize the damage of a Trump presidency to achieve “effective deregulation, historic tax reform, a more robust military and more” is a scandal in itself. This isn’t proof of life. It’s the opposite.

Written by nevalalee

September 6, 2018 at 8:59 am
