Posts Tagged ‘Chicago Humanities Festival’
The Silver Standard
It’s never easy to predict the future, and on this particular blog, I try to avoid such prognostications, but just this once, I’m going to go out on a limb: I think there’s a good chance that Nate Silver will be Time’s Person of the Year. Silver wasn’t the only talented analyst tracking the statistical side of the presidential race, but he’s by far the most visible, and he serves as the public face for the single most important story of this election: the triumph of information. Ultimately, the polls, at least in the aggregate, were right. Silver predicted the results correctly in all fifty states, and although he admits that his call in Florida could have gone either way, it’s still impressive—especially when you consider that his forecasts of the vote in individual swing states were startlingly accurate, differing overall from the final tally by less than 1.5%. At the moment, Silver is in an enviable position: he’s a public intellectual whose word, at least for now, carries immense weight among countless informed readers, regardless of the subject. And the real question is what he intends to do with this power.
I’ve been reading Silver for years, but after seeing him deliver a talk last week at the Chicago Humanities Festival, I emerged feeling even more encouraged by his newfound public stature. Silver isn’t a great public speaker: his presentation consisted mostly of slides drawn from his new book, The Signal and the Noise, and he sometimes comes across as a guy who spent the last six months alone in a darkened room, only to be thrust suddenly, blinking, into the light. Yet there’s something oddly reassuring about his nerdy, somewhat awkward presence. This isn’t someone like Jonah Lehrer, whose polished presentations at TED tend to obscure the fact that he doesn’t have many original ideas of his own, as was recently made distressingly clear. Silver is the real thing, a creature of statistics and spreadsheets who claims, convincingly, that if Excel were an Olympic sport, he’d be competing on the U.S. team. In person, he’s more candid and profane than in his lucid, often technical blog posts, but the impression one gets is of a man who has far more ideas in his head than he’s able to express in a short talk.
And his example is an instructive one, even to those of us who pay attention to politics only every couple of years, and who don’t have much of an interest in poker or baseball, his two other great obsessions. Silver is a heroic figure in an age of information. In his talk, he pointed out that ninety percent of the information in the world was created over the last two years, which makes it all the more important to find ways of navigating it effectively. With all the data at our disposal, it’s easy to find evidence for any argument we want to make: as the presidential debates made clear, there’s always a poll or study to cite in our favor. Silver may have found his niche in politics, but he’s really an exemplar of how to intelligently read any body of publicly available information. We all have access to the same numbers: the question is how to interpret them, and, even more crucially, how to deal with information that doesn’t support our own beliefs. (Silver admits that he’s generally left of center in his own politics, but I almost wish that he were a closet conservative who was simply reporting the numbers as objectively as he could.)
But the most important thing about Silver is that he isn’t a witch. He predicted the election results better than almost anyone else, but he wasn’t alone: all of the major poll aggregators called the presidential race correctly, often using nothing more complicated than a simple average of polls. In other words, what Silver did was relatively simple, once you’ve made the decision to follow the data wherever it goes. And unlike most pundits, Silver has an enormous incentive to be painstaking in his methods. He knows that his reputation is based entirely on his accuracy, which made the conservative accusation that he was skewing the results seem ludicrous even at the time: he had much more to lose, over the long term, by being wrong. And it makes me very curious about his next move. At his talk, Silver pointed out that politics, unlike finance, was an easy target for statistical rigor: “You can look smart just by being pretty good.” Whether he can move beyond the polls into other fields remains to be seen, but I suspect that he’ll be both smart and cautious. And I can’t wait to see what he does next.
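To make that concrete, here’s a minimal sketch of the kind of simple poll averaging the aggregators relied on. The function names and the placeholder margins below are mine, purely for illustration, and this is emphatically not Silver’s actual model, which does a great deal more work weighting and adjusting individual polls; it’s just the bare idea of averaging the numbers and following them wherever they point.

```python
# A bare-bones poll average: for each state, average the reported margins
# (candidate A minus candidate B, in points) and call the state for
# whoever leads. The numbers below are placeholders, not real 2012 polls.

def average_margin(margins):
    """Mean of a list of poll margins for one state."""
    return sum(margins) / len(margins)

def call_state(margins):
    """Project a winner for one state from its raw poll margins."""
    avg = average_margin(margins)
    if avg > 0:
        return "Candidate A", avg
    if avg < 0:
        return "Candidate B", avg
    return "Toss-up", avg

if __name__ == "__main__":
    # Hypothetical swing-state margins, purely for illustration.
    polls = {
        "Swing State 1": [2.0, 1.5, 3.0, -1.0],
        "Swing State 2": [-0.5, 0.5, 1.0],
    }
    for state, margins in polls.items():
        winner, avg = call_state(margins)
        print(f"{state}: average margin {avg:+.1f}, projected winner: {winner}")
```

Even something this crude, fed the full set of state polls, gets you surprisingly far, which is more or less what Silver means by looking smart just by being pretty good.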
Walter Murch and the analog/digital divide
Walter Murch is the smartest person in America.
If anything, this understates the case. Yesterday I attended a talk at the Chicago Humanities Festival given by Murch, the legendary editor and sound designer of such films as The English Patient and Apocalypse Now. Regular readers of this blog know how much Murch means to me: he’s a longtime friend and colleague of such directors as George Lucas and Francis Ford Coppola, and while he never quite ascended to their levels of wealth and power, he’s their equal, or better, when it comes to intelligence, artistry, and innovation. Murch is a polymath whose work expresses both a universal curiosity and a meticulous level of craft, as amply chronicled in his own book, In the Blink of an Eye, and such fascinating portraits as Michael Ondaatje’s The Conversations and Charles Koppelman’s Behind the Seen. And while he may not be as famous as some of his collaborators, he’s an esteemed figure within the world of film, as demonstrated by the crowd of groupies who pressed in afterward for an autograph. Confession: I was one of them.
In addition to everything else, Murch is a wonderful public speaker, an inexhaustible source of anecdote and insight delivered in perfectly formed paragraphs. (Not least, he’s the first speaker I’ve ever seen who actually knew how to use his own laptop to give a presentation—not surprising, since Murch is one of the great users of Apple products.) Inevitably, our hour with Murch flew by much too quickly; his interlocutor, critic Lawrence Weschler, recalled an earlier occasion on which a moderator was booed for ushering Murch off the stage after he’d held the audience enthralled for six hours. But the conversation we did get ranged from Chinese calligraphy to the recent financial crisis, and from THX 1138 to The Conversation to The Clone Wars, an episode of which Murch recently directed. I wish I could quote it all here. Instead, I’ll just touch on a subject central to the talk: the transition from analog to digital.
For most of the history of cinema, editing film was both intellectually difficult and physically taxing. The sheer bulk of the materials involved was daunting enough: Murch points out that for combined sound and picture in 35 mm, one minute of film equals a pound of celluloid. For a movie like Apocalypse Now, this comes out to something like seven tons of raw footage (the arithmetic is sketched after the excerpt below). And when an editor working on film is seeking a particular frame, weighing only a few thousandths of an ounce, he needs to keep good records—and, Murch adds, to have “a strong back and arms.” Today, of course, the situation has changed dramatically: with an editing platform like Final Cut Pro, which Murch famously used to edit Cold Mountain, instead of digging through a bin for the right piece of film, you can call up the necessary frame at once. This makes the process much more efficient, but it also leads to certain losses. Here’s Charles Koppelman in Behind the Seen:
As Murch often points out, the simple act of having to rewind film on a flatbed editing machine gave him the chance to see footage in other contexts (high-speed, reverse) that could reveal a look, a gesture, or a completely forgotten shot. Likewise, the few moments he had to spend waiting for a reel to rewind injected a blank space into the process during which he could simply let his mind wander into subconscious areas. With random-access, computer-based editing, a mouse click instantly takes the editor right to a desired frame; there is no waiting, no downtime—and fewer happy accidents.
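As for Murch’s pound-a-minute figure, the back-of-the-envelope arithmetic is easy enough to run; the only inputs are the one-pound-per-minute rule of thumb and the seven-ton figure quoted above, and the conversion below is my own rough sketch, not a number from Murch himself.

```python
# Back-of-the-envelope check of Murch's figure: at roughly one pound of
# 35 mm film (picture plus sound) per minute of running time, seven tons
# of raw footage comes to about 14,000 minutes of material.
POUNDS_PER_TON = 2000       # U.S. short ton
POUNDS_PER_MINUTE = 1       # Murch's rule of thumb for 35 mm picture + sound

tons_of_footage = 7
minutes = tons_of_footage * POUNDS_PER_TON / POUNDS_PER_MINUTE
print(f"{minutes:,.0f} minutes, or roughly {minutes / 60:,.0f} hours of raw footage")
```

That comes to a couple of hundred hours of material to be handled reel by reel, which gives some sense of why a strong back mattered.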
I’ve spoken before, in my post on Blinn’s Law, about the paradoxes of increased efficiency and how to compensate for them. And one of the most fascinating aspects of Murch’s work is his effort to deliberately introduce randomness and chance into the creative process. One of my few regrets about yesterday’s talk was that he was unable to discuss this in detail, since it’s one of the most valuable lessons he has to share. Murch sometimes reminds me of Steve Jobs, with whom he corresponded at times, both in his fondness for black turtlenecks and in his efforts to bridge the worlds of the humanities and the sciences—and, even more crucially, the worlds of analog and digital creativity. As we pass ever further into the random-access age, it’s all the more important to listen to Murch, who tirelessly explores the future even as he unsentimentally points out the usefulness of the past. A computer, he notes, always gives you what you want; an older system, with its inherent unpredictability, often gives you what you need. Tomorrow, I’ll be talking more about how artists of all kinds can deal with this dilemma.