Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Frank Bruni’

The Order of St. John’s


When I think back on my personal experience with the great books, as I did here the other day, I have to start with the six weeks that I spent as a high school junior at St. John’s College in Annapolis, Maryland. As I’ve discussed in greater detail before, I had applied to the Telluride Association Summer Program on the advice of my guidance counselor. It was an impulsive decision, but I was accepted, and I don’t think it’s an exaggeration to call it one of the three or four most significant turning points in my entire life. I was more than primed for a program like this—I had just bought my own set of the Great Books of the Western World at a church book sale—and I left with my head full of the values embodied by the college, which still structures its curriculum around a similar notion of the Western Canon. Throughout the summer, I attended seminars with seventeen other bright teenagers, and as we worked our way from Plato’s Cratylus through Wittgenstein’s Philosophical Investigations, it all seemed somehow normal. I more or less assumed that this was how college would be, which wasn’t entirely true, although I did my best to replicate the experience. Looking back, in fact, I suspect that my time at St. John’s was more responsible than any other factor for allowing me to attend the college of my choice, and it certainly played a role in my decision to major in classics. But it’s only now that I can fully appreciate how much privilege went into each stage in that process. It came down to a series of choices, which I was able to make freely, and while I don’t think I always acted correctly, I’m amazed at how lucky I was, and how the elements of a liberal education themselves managed to obscure that crucial point.

I’ve been thinking about this recently because of an article in the New York Times by Frank Bruni, who paid a visit to the sister campus of St. John’s College in Santa Fe. He opens with a description that certainly would have appealed to my adolescent self, although probably not to most other teenagers:

Have I got a college for you. For your first two years, your regimen includes ancient Greek. And I do mean Greek, the language, not Greece, the civilization, though you’ll also hang with Aristotle, Aeschylus, Thucydides and the rest of the gang. There’s no choice in the matter. There’s little choice, period…You have no major, only “the program,” an exploration of the Western canon that was implemented in 1937 and has barely changed…It’s an increasingly exotic and important holdout against so many developments in higher education—the stress on vocational training, the treatment of students as fickle consumers, the elevation of individualism over a shared heritage—that have gone too far. It’s a necessary tug back in the other direction.

More than twenty years after I spent the summer there, the basic pitch for the college doesn’t seem to have changed. Its fans still draw a pointed comparison between the curriculum at St. John’s and the supposedly more “consumerist” approach of most undergraduate programs, and the school tends to define itself in sharp contrast to the touchy-feely world around it. “Let your collegiate peers elsewhere design their own majors and frolic with Kerouac,” Bruni writes. “For you it’s Kant.”

Yet it isn’t hard to turn this argument on its head, or to recognize that there’s a real sense in which St. John’s might be one of the most individualistic and consumerist colleges in the entire country. (The article itself is headlined “The Most Contrarian College in America,” while Bruni writes that he was drawn to it “out of respect for its orneriness.” And a school for ornery contrarians sounds pretty individualistic to me.) We can start with the obvious point that “the stress on vocational training” at other colleges is the result of economic anxiety at a time of rising tuitions and crippling student loans. There’s tremendous pressure to turn students away from the humanities, and it isn’t completely unjustified. The ability to major in classics or philosophy reflects a kind of privilege in itself, at least in the absence of some of those pressures, and it isn’t always about money. For better or worse, reading the great books is just about the most individualistic gesture imaginable, and its supposed benefits—what the dean of the Santa Fe campus characterizes as the creation of “a more thoughtful, reflective, self-possessed and authentic citizen, lover, partner, parent and member of the global economy”—are obsessively focused on the self. The students at St. John’s may not have the chance to shop around for classes once they get there, but they made a vastly more important choice as consumers long before they even arrived. A choice of college amounts to a lot of things, but it’s certainly an act with financial consequences. In many cases, it’s the largest purchase that any of us will ever make. The option of spending one’s college years reading Hobbes and Spinoza at considerable cost doesn’t even factor into the practical or economic universe of most families, and it would be ridiculous to claim otherwise.

In other words, every student at St. John’s exercised his or her power in the academic marketplace when it mattered most. By comparison, the ability to tailor one’s class schedule seems like a fairly minor form of consumerism—which doesn’t detract from the quality of the product, which is excellent, as it should be at such prices. (Bruni notes approvingly that the college recently cut its annual tuition from $52,000 to $35,000, which I applaud, although it doesn’t change my underlying point.) But it’s difficult to separate the value of such an education from the existing qualities required for a high schooler to choose it in the first place. It’s hard for me to imagine a freshman at St. John’s who wasn’t intelligent, motivated, and individualistic, none of which would suffer from four years of immersion in the classics. They’re already lucky, which is a lesson that the great books won’t teach on their own. The Great Conversation tends to take place within a circle of authors who have been chosen for their resemblance to one another, or for how well they fit into a cultural narrative imposed on them after the fact, as Robert Maynard Hutchins writes in the introduction to Great Books of the Western World: “The set is almost self-selected, in the sense that one book leads to another, amplifying, modifying, or contradicting it.” And that’s fine. But it means that you rarely see these authors marveling over their own special status, which they take for granted. For a canon that consists entirely of books written by white men, there’s remarkably little discussion of privilege, because they live in it like fish in water—which is as good an argument for diversity as any I can imagine. The students at St. John’s may ask these hard questions about themselves, but if they do, it’s despite what they read, not because of it. Believe me, I should know.

Written by nevalalee

September 20, 2018 at 9:02 am

The A/B Test


In this week’s issue of The New York Times Magazine, there’s a profile of Mark Zuckerberg by Farhad Manjoo, who describes how the founder of Facebook is coming to terms with his role in the world in the aftermath of last year’s election. I find myself thinking about Zuckerberg a lot these days, arguably even more than I use Facebook itself. We just missed overlapping in college, and with one possible exception, which I’ll mention later, he’s the most influential figure to emerge from those ranks in the last two decades. Manjoo depicts him as an intensely private man obliged to walk a fine line in public, leading him to be absurdly cautious about what he says: “When I asked if he had chatted with Obama about the former president’s critique of Facebook, Zuckerberg paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Zuckerberg is trying to figure out what he believes—and how to act—under conditions of enormous scrutiny, but he also has more resources at his disposal than just about anyone else in history. Here’s the passage in the article that stuck with me the most:

The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition, or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth…This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?”

Reading this, I began to reflect on how rarely we actually test our intuitions. I’ve spoken a lot on this blog about the role of intuitive thinking in the arts and sciences, mostly because it doesn’t get the emphasis it deserves, but there’s also no guarantee that intuition will steer us in the right direction. The psychologist Daniel Kahneman has devoted his career to showing how we tend to overvalue our gut reactions, particularly if we’ve been fortunate enough to be right in the past, and the study of human irrationality has become a rich avenue of research in the social sciences, which are often undermined by poor hunches of their own. It may not even be a matter of right or wrong. An intuitive choice may be better or worse than the alternative, but for the most part, we’ll never know. One of the quirks of Silicon Valley culture is that it claims to base everything on raw data, but it’s often in the service of notions that are outlandish, untested, and easy to misrepresent. Facebook comes closer than any company in existence to the ideal of an endless A/B test, in which the user base is randomly divided into two or more groups to see which approaches are the most effective. It’s the best lab ever developed for testing our hunches about human behavior. (Most controversially, Facebook modified the news feeds of hundreds of thousands of users to adjust the number of positive or negative posts, in order to gauge the emotional impact, and it has conducted similar tests on voter turnout.) And it shouldn’t surprise us if many of our intuitions turn out to be mistaken. If anything, we should expect them to be right about half the time—and if we can nudge that percentage just a little bit upward, in theory, it should give us a significant competitive advantage.
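
To make the mechanism concrete, here is a minimal sketch of the kind of randomized split described above. Everything in it is invented for illustration: the helper names (ab_test, variant_a, variant_b), the group sizes, and the action rates are all hypothetical, and a production system like Facebook’s would randomize live traffic rather than simulate it.

```python
import math
import random

def ab_test(users, variant_a, variant_b, seed=42):
    """Randomly split users into two groups, expose each group to one
    variant, and compare how often the desired action occurs."""
    rng = random.Random(seed)
    groups = {"A": [], "B": []}
    for user in users:
        groups[rng.choice("AB")].append(user)
    a_n, a_hits = len(groups["A"]), sum(map(variant_a, groups["A"]))
    b_n, b_hits = len(groups["B"]), sum(map(variant_b, groups["B"]))
    p_a, p_b = a_hits / a_n, b_hits / b_n
    # Pooled two-proportion z-score: is the observed gap larger than
    # random assignment alone would be likely to produce?
    p = (a_hits + b_hits) / (a_n + b_n)
    se = math.sqrt(p * (1 - p) * (1 / a_n + 1 / b_n))
    return p_a, p_b, (p_b - p_a) / se

# Hypothetical behavior: variant B nudges a 10% action rate up to 11%.
sim = random.Random(0)
variant_a = lambda user: sim.random() < 0.10
variant_b = lambda user: sim.random() < 0.11
p_a, p_b, z = ab_test(range(100_000), variant_a, variant_b)
print(f"A: {p_a:.3f}  B: {p_b:.3f}  z: {z:.1f}")
```

With a hundred thousand users, even a single percentage point of difference yields a z-score around five, far beyond any conventional significance threshold, which is the sense in which nudging a percentage “just a little bit upward” becomes both detectable and valuable at Facebook’s scale.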

So what good is intuition, anyway? I like to start with William Goldman’s story about the Broadway producer George Abbott, who once passed a choreographer holding his head in his hands while the dancers stood around doing nothing. When Abbott asked what was wrong, the choreographer said that he couldn’t figure out what to do next. Abbott shot back: “Well, have them do something! That way we’ll have something to change.” Intuition, as I’ve argued before, is mostly about taking you from zero ideas to one idea, which you can then start to refine. John W. Campbell makes much the same argument in what might be his single best editorial, “The Value of Panic,” which begins with a maxim from the Harvard professor Wayne Batteau: “In total ignorance, try anything. Then you won’t be so ignorant.” Campbell argues that this provides an evolutionary rationale for panic, in which an animal acts “in a manner entirely different from the normal behavior patterns of the organism.” He continues:

Given: An organism with N characteristic behavior modes available. Given: An environmental situation which cannot be solved by any of the N available behavior modes, but which must be solved immediately if the organism is to survive. Logical conclusion: The organism will inevitably die. But…if we introduce Panic, allowing the organism to generate a purely random behavior mode not a member of the N modes characteristically available?

Campbell concludes: “When the probability of survival is zero on the basis of all known factors—it’s time to throw in an unknown.” In extreme situations, the result is panic; under less intense circumstances, it’s a blind hunch. You can even see them as points on a spectrum, the purpose of which is to provide us with a random action or idea that can then be revised into something better, assuming that we survive for long enough. But sometimes the animal just gets eaten.
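
Campbell’s “Given… Given… Conclusion” schema reads almost like pseudocode, and it maps loosely onto what a programmer might call a randomized fallback. Here is a minimal sketch under invented assumptions: situations and behaviors are just numbers, a mode “solves” a situation when it lands close enough, and the function names and thresholds are hypothetical.

```python
import random

def act(situation, known_modes, panic_tries=100, rng=random):
    """Try each of the N characteristic behavior modes in turn; if none
    solves the situation, 'panic' by generating random behaviors."""
    for mode in known_modes:  # the N available behavior modes
        if mode(situation):
            return "survived (known mode)"
    # Survival probability is zero on the basis of all known factors,
    # so throw in an unknown: behavior from outside the repertoire.
    for _ in range(panic_tries):
        random_behavior = rng.uniform(0.0, 1.0)
        if abs(random_behavior - situation) < 0.05:  # happens to work
            return "survived (panic)"
    return "eaten"

# Toy world: the situation demands a response near 0.9, but every
# known mode clusters around the organism's usual range.
known = [lambda s, c=c: abs(s - c) < 0.05 for c in (0.1, 0.2, 0.3)]
print(act(0.9, known))  # usually "survived (panic)"; sometimes eaten
```

The value of the random draw isn’t that any single try is likely to work; it’s that it produces an action where the known repertoire produced none, which is Abbott’s “that way we’ll have something to change” in computational form.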

The idea of refinement, revision, or testing is inseparable from intuition, and Zuckerberg has been granted the most powerful tool imaginable for asking hard questions and getting quantifiable answers. What he does with it is another matter entirely. But it’s also worth looking at his only peer from college who could conceivably challenge him in terms of global influence. On paper, Mark Zuckerberg and Jared Kushner have remarkable similarities. Both are young Jewish men—although Kushner is more observant—who were born less than four years and sixty miles apart. Kushner, whose acceptance to Harvard was so manifestly the result of his family’s wealth that it became a case study in a book on the subject, was a member of the final clubs that Zuckerberg badly wanted to join, or so Aaron Sorkin would have us believe. Both ended up as unlikely media magnates of a very different kind: Kushner, like Charles Foster Kane, took over a New York newspaper from a man named Carter. Yet their approaches to their newfound positions couldn’t be more different. Kushner has been called “a shadow secretary of state” whose portfolio includes Mexico, China, the Middle East, and the reorganization of the federal government, but it feels like one long improvisation, on the apparent assumption that he can wing it and succeed where so many others have failed. As Bruce Bartlett writes in the New York Times, without a staff, Kushner “is just a dilettante meddling in matters he lacks the depth or the resources to grasp,” and we may not have a chance to recover if his intuitions are wrong. In other words, he resembles his father-in-law, as Frank Bruni notes:

I’m told by insiders that when Trump’s long-shot campaign led to victory, he and Kushner became convinced not only that they’d tapped into something that everybody was missing about America, but that they’d tapped into something that everybody was missing about the two of them.

Zuckerberg and Kushner’s lives ran roughly in parallel for a long time, but now they’re diverging at a point at which they almost seem to be offering us two alternate versions of the future, like an A/B test with only one possible outcome. Neither is wholly positive, but that doesn’t make the choice any less stark. And if you think this sounds farfetched, bookmark this post, and read it again in about six years.

Awake in the Dark



A movie, or any work of art, isn’t complete until someone sees it. Even the most modest studio film these days represents about two hundred years of collective work from the cast and crew, and when the result of their labor is projected on a screen in a darkened room, where it can shape and channel the emotions of a theater full of strangers, surprising things can happen. In Behind the Seen, Walter Murch compares this phenomenon to an old-fashioned radio tube, which takes a powerful but unorganized electrical current and combines it with a weak but coherent signal, transforming the raw power into, say, Beethoven’s Ninth Symphony. A similar thing happens to an audience in a theater:

The power—the energy—isn’t coming from the film. It’s coming from the collective lives and emotional world of the audience. Say it’s a big theater—you have a thousand people there, and the average age of that audience is 25. You have 25,000 years, three times recorded history, sitting in the audience. That’s a tremendously powerful but unorganized force that is looking for coherence.

And the mark of a great movie is that it takes on an unexpected life, for better or worse, once it meets the undirected power of a large popular audience.

I’ve been thinking about this ever since finally seeing Zero Dark Thirty, which I think is unquestionably the movie of the year. (If I were to repost my list of the year’s best films, it would occupy the top slot, just ahead of The Dark Knight Rises and Life of Pi.) It’s an incredible work, focused, complex but always clear, and directed with remarkable assurance by Kathryn Bigelow, who tells an often convoluted story, but never allows the eye to wander. Yet it’s a film that seems likely to be defined by the controversy over its depiction of torture. This isn’t the place to respond to such concerns in detail, except to note that Bigelow and writer Mark Boal have already argued their own case better than anyone else. But it seems to me that many of the commentators who see the movie as an implicit endorsement of torture—”No waterboarding, no Bin Laden,” as Frank Bruni writes—are reading something into it that ignores the subtleties of the film’s own structure, which begins with enhanced interrogation and then moves beyond it.


But it’s a testament to the skill and intelligence of Bigelow, Boal, and their collaborators that they’ve given us a movie that serves as a blank slate, on which viewers can project their own fears and concerns. Zero Dark Thirty doesn’t tell us what to think, and although some, like Andrew Sullivan, have taken this as an abdication of artistic responsibility, it’s really an example of the art of film at its height. It’s a movie for adults. So, in very different ways, are Lincoln and Django Unchained, which is why I’m not surprised by the slew of opinion pieces about the lack of “agency” in the black characters in Lincoln, or whether Django is really a story about a slave being saved by a white man. Such responses tell us more about the viewers than the movies themselves, and that’s fine—but we also need to recognize that movies that can evoke and sustain such questions are ultimately more interesting than films like Argo or Les Misérables, which reassure us at every turn about what we’re supposed to be feeling.

Needless to say, the Oscars have rarely rewarded this kind of ambiguity, which may be why Zero Dark Thirty had to content itself with a shared award for Best Sound Editing. And both Argo and Les Misérables are very good movies. But it takes remarkable skill and commitment to tell stories like this—and in particular, to give us all the satisfactions we crave from more conventional entertainment while also pushing forward into something darker. (That’s why many of our greatest, most problematic works of fiction tend to come from artists who have proven equally adept at constructing beautiful toys: Bigelow could never have made Zero Dark Thirty if she hadn’t already made Point Break.) When we’re sitting in the dark, looking for coherence, we’re at our most vulnerable, and when we’re faced with a movie that pushes our buttons while leaving us unsettled by its larger implications, it’s tempting to reduce it to something we can easily grasp. But in a medium that depends so much on the resonance between a work and its viewers, such films demand courage not just from the artist, but from the audience as well.

Written by nevalalee

February 25, 2013 at 9:50 am
