Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Ian Parker’

Broyles’s Law and the Ken Burns effect


For most of my life as a moviegoer, I’ve followed a rule that has served me pretty well. Whenever the director of a documentary narrates the story in the first person, or, worse, appears on camera, I start to get suspicious. I’m not talking about movies like Roger & Me or even the loathsome Catfish, in which the filmmakers, for better or worse, are inherently part of the action, but about films in which the director inserts himself into the frame for no particular reason. Occasionally, I can forgive this, as I did with the brilliant The Cove, but usually, I feel a moment of doubt whenever the director’s voiceover begins. (In its worst form, it opens the movie with a redundant narration: “I first came across the story that you’re about to hear in the summer of 1990…”) But while I still think that this is a danger sign, I’ve recently concluded that I was wrong about why. I had always assumed that it was a sign of ego—that these directors were imposing themselves on a story that was really about other people, because they thought that it was all about them. In reality, it seems more likely that it’s a solution to a technical problem. What happens, I think, is that the director sits down to review his footage and discovers that it can’t be cut together as a coherent narrative. Perhaps there are crucial scenes or beats missing, but the events that the movie depicts are long over, or there’s no budget to go back and shoot more. An interview might bridge the gaps, but maybe this isn’t logistically feasible. In the end, the director is left with just one person who is available to say all the right things on the soundtrack to provide the necessary transitions and clarifications. It’s himself. In a perfect world, if he had gotten the material that he needed, he wouldn’t have to be in his own movie at all, but he doesn’t have a choice. It isn’t a failure of character, but of technique, and the result ends up being much the same.

I got to thinking about this after reading Ian Parker’s recent New Yorker profile of the documentarian Ken Burns, whose upcoming series on the Vietnam War is poised to become a major cultural event. The article takes an irreverent tone toward Burns, whose cultural status encourages speechification even in private: “His default conversational setting is Commencement Address, involving quotation from nineteenth-century heroes and from his own previous commentary, and moments of almost rhapsodic self-appreciation. He is readier than most people to regard his creative decisions as courageous.” But Parker also shares a fascinating anecdote about which I wish I knew more:

In the mid-eighties, Burns was working on a deft, entertaining documentary about Huey Long, the populist Louisiana politician. He asked two historians, William Leuchtenburg and Alan Brinkley, about a photograph he hoped to use, as a part of the account of Long’s assassination; it showed him protected by a phalanx of state troopers. Brinkley told him that the image might mislead; Long usually had plainclothes bodyguards. Burns felt thwarted. Then Leuchtenburg spoke. He’d just watched a football game in which Frank Broyles, the former University of Arkansas coach, was a commentator. When the game paused to allow a hurt player to be examined, Broyles explained that coaches tend to gauge the seriousness of an injury by asking a player his name or the time of day; if he can’t answer correctly, it’s serious. As Burns recalled it, Broyles went on, “But, of course, if the player is important to the game, we tell him what his name is, we tell him what time it is, and we send him back in.”

Hence Broyles’s Law: “If it’s super-important, if it’s working, you tell him what his name is, and you send him back into the game.” Burns decided to leave the photo in the movie. Parker continues:

Was this, perhaps, a terrible law? Burns laughed. “It’s a terrible law!” But, he went on, it didn’t let him off the hook, ethically. “This would be Werner Herzog’s ‘ecstatic truth’—‘I can do anything I want. I’ll pay the town drunk to crawl across the ice in the Russian village.’” He was referring to scenes in Herzog’s Bells from the Deep, which Herzog has been happy to describe, and defend, as stage-managed. “If he chooses to do that, that’s okay. And then there are other people who’d rather do reenactments than have a photograph that’s vague.” Instead, Burns said, “We do enough research that we can pretty much convince ourselves—in the best sense of the word—that we’ve done the honorable job.”

The reasoning in this paragraph is a little muddled, but Burns seems to be saying that he isn’t relying on “the ecstatic truth” of Herzog, who blurs the line between fiction and reality, or the reenactments favored by Errol Morris, who sometimes seems to be making a feature film interspersed with footage of talking heads. Instead, Burns is assembling a narrative solely out of primary sources, and if an image furthers the viewer’s intellectual understanding or emotional engagement, it can be included, even if it isn’t strictly accurate. These are the compromises that you make when you’re determined to use nothing but the visuals that you have available, and you trust in your understanding of the material to tell whether or not you’ve made the “honorable” choice.

On some level, this is basically what every author of nonfiction has to consider when assembling sources, which involves countless judgment calls about emphasis, order, and selection, as I’ve discussed here before. But I’m more interested in the fact that this dilemma emerges from a technical issue inherent in the form of the documentary itself, in which the viewer always has to be looking at something. When the perfect image isn’t available, you have a few different options. You can ignore the problem; you can cut to an interview subject who tells the viewers about what they’re not seeing; or you can shoot a reenactment. (Recent documentaries seem to lean heavily on animation, presumably because it’s cheaper and easier to control in the studio.) Or, like Burns, you can make do with what you have, because that’s how you’ve defined the task for yourself. Burns wants to use nothing but interviews, narration, and archival materials, and the technical tricks that we’ve come to associate with his style—like the camera pan across photos that Apple actually calls the Ken Burns effect—arise directly out of those constraints. The result is often brilliant, in large part because Burns has no choice but to think hard about how to use the materials that he has. Broyles’s Law may be “terrible,” but it’s better than most of the alternatives. Burns has the luxury of big budgets, a huge staff, and a lot of time, which allows him to be fastidious about his solutions to such problems. But a desperate documentary filmmaker, faced with no money and a hole in the story to fill, may have no other recourse than to grab a microphone, sit down in the editing bay, and start to speak: “I first came across the story that you’re about to hear in the summer of 1990…”

Written by nevalalee

September 11, 2017 at 9:12 am

Food for thought



Earlier this week, The A.V. Club, which is still the pop culture website at which I spend the vast majority of my online life, announced a new food section called “Supper Club.” It’s helmed by the James Beard Award-winning food critic and journalist Kevin Pang, a talented writer and documentarian whose work I’ve admired for years. On Wednesday, alongside the site’s usual television and movie coverage, seemingly half the homepage was devoted to features like “America’s ten tastiest fast foods,” followed a day later by “All of Dairy Queen’s Blizzards, ranked.” And the reaction from the community was—not good. Pang’s introductory post quickly drew over a thousand comments, with the most upvoted response reading:

I’ll save you about six months of pissed-away cash. Please reallocate the money that will be wasted on this venture to add more shows to the TV Club review section.

Most of the other food features received the same treatment, with commenters ignoring the content of the articles themselves and complaining about the new section on principle. Internet commenters, it must be said, are notoriously resistant to change, and the most vocal segment of the community represents a tiny fraction of the overall readership of The A.V. Club. But I think it’s fair to say that the site’s editors can’t be entirely happy with how the launch has gone.

Yet the readers aren’t altogether wrong, either, and in retrospect, you could make a good case that the rollout should have been handled differently. The A.V. Club has gone through a rough couple of years, with many of its most recognizable writers leaving to start the movie site The Dissolve—which recently folded—even as its signature television coverage has been scaled back. Those detailed reviews of individual episodes might be popular with commenters, but they evidently don’t generate enough page views to justify the same degree of investment, and the site is looking at ways to stabilize its revenue at a challenging time for the entire industry. The community is obviously worried about this, and Supper Club happened to appear at a moment when the commenters were likely to be skeptical about any new move, as if it were all a zero-sum game, which it isn’t. But the launch itself didn’t help matters. It makes sense to start an enterprise like this with a lot of articles on its first day, but taking over half the site with minimal advance warning cost it a lot of goodwill. Pang could also have been introduced more gradually: he’s a celebrity in foodie circles, but to most A.V. Club readers, he’s just a name. (It was also probably a miscalculation to have Pang write the introductory post himself, which placed him in the awkward position of having to drum up interest in his own work for an audience that didn’t know who he was.) And while I’ve enjoyed some of the content so far, and I understand the desire to keep the features lightweight and accessible, I don’t think the site has done itself any favors by leading with articles like “Do we eat soup or do we drink soup?”


This might seem like a lot of analysis for a kerfuffle that will be forgotten within a few weeks, no matter how Supper Club does in the meantime. But The A.V. Club has been a landmark site for pop culture coverage for the last decade, and its efforts to reinvent itself should concern anyone who cares about whether such venues can survive. I found myself thinking about this shortly after reading the excellent New Yorker profile of Pete Wells, the restaurant critic of the New York Times. Its author, Ian Parker, notes that modern food writing has become a subset of cultural criticism:

“A lot of reviews now tend to be food features,” [former Times restaurant critic Mimi Sheraton] said. She recalled a reference to Martin Amis in a Wells review of a Spanish restaurant in Brooklyn; she said she would have mentioned Amis only “if he came in and sat down and ordered chopped liver.”

Craig Claiborne, in a review from 1966, observed, “The lobster tart was palatable but bland and the skewered lamb on the dry side. The mussels marinière were creditable.” Thanks, in part, to the informal and diverting columns of Gael Greene, at New York, and Ruth Reichl, the Times’ critic during the nineties, restaurant reviewing in American papers has since become as much a vehicle for cultural criticism and literary entertainment—or, as Sheraton put it, “gossip”—as a guide to eating out.

If this is true, and I think it is, it means that food criticism, for better or worse, falls squarely within the mandate of The A.V. Club, whether its commenters like it or not.

But that doesn’t mean that we shouldn’t hold The A.V. Club to unreasonably high standards. In fact, we should be harder on it than we would on most sites, for reasons that Parker neatly outlines in his profile of Wells:

As Wells has come to see it, a disastrous restaurant is newsworthy only if it has a pedigree or commercial might. The mom-and-pop catastrophe can be overlooked. “I shouldn’t be having to explain to people what the place is,” he said. This reasoning seems civil, though, as Wells acknowledged, it means that his pans focus disproportionately on restaurants that have corporate siblings. Indeed, hype is often his direct or indirect subject. Of the fifteen no-star evaluations in his first four years, only two went to restaurants that weren’t part of a group of restaurants.

Parker continues: “There are restaurants that exist to have four Times stars. With fewer, they become a kind of paradox.” And when it comes to pop culture, The A.V. Club is the equivalent of a four-star restaurant. It was writing deeply felt, outrageously long essays on film and television before the longread was even a thing—in part, I suspect, because of its historical connection to The Onion: because it was often mistaken for a parody site, it always felt the need to prove its fundamental seriousness, which it did, over and over again. If Supper Club had launched with one of the ambitious, richly reported pieces that Pang has written elsewhere, the response might have been very different. Listicles might make more economic sense, and they can be fun if done right, but The A.V. Club has defined itself as a place where obsessively detailed and personal pop culture writing has a home. That’s what Supper Club should be. And until it is, we shouldn’t be surprised if readers have trouble swallowing it.

Written by nevalalee

September 9, 2016 at 9:08 am

The AutoContent Wizard, Part 1



A few days ago, while helping my wife prepare a presentation for a class she’s teaching as an adjunct lecturer at the Medill School of Journalism, I was reminded of how much I hate PowerPoint. This isn’t a knock at my wife, who is taking her role there very seriously, or even at the slideshow format itself, which can be useful, as she intends to employ it, in presenting exhibits or examples for a classroom discussion. It’s more a feeling of creeping terror at the slideshow state of mind, the one that links both the stodgiest of corporations and the startups that like to think they have nothing in common with the old guard. Every tech pundit or entrepreneur is expected to have a deck, a set of slides—presumably stored on a handy flash drive that can be kept on one’s person at all times—with which he or she can distill his or her life’s work into a few images, ready to be shown at an impromptu TED talk. And it horrifies me, not just as a former office worker who has spent countless hours suffering through interminable slide presentations, but as someone who cares about the future of ideas. I’m not the first person to say this, of course, and at this point, it would probably be more interesting to mount a legitimate defense of the slideshow mentality than to make fun of it yet again. But the more I look around at our media landscape, the more it seems like a PowerPoint world, crafted and marketed to us by people who don’t want to think in terms that can’t be boiled down to a slick presentation.

The most useful critic on the subject, not surprisingly, is the legendary statistician and information designer Edward Tufte, whose book Beautiful Evidence includes a savage takedown of PowerPoint and the sloppy thinking it encourages. Tufte includes the usual samples of incomprehensible slides, but he also gets at a crucial point that explains how we got here:

Yet PowerPoint is entirely presenter-oriented, and not content-oriented, not audience-oriented. The claims of [PowerPoint] marketing are addressed to speakers: “A cure for the presentation jitters.” “Get yourself organized.” “Use the AutoContent Wizard to figure out what you want to say.” And fans of PowerPoint are presenters, rarely audience members.

This, I think, is the crux of the matter. Any form of communication that is designed more for the convenience of the creator than for its audience is inherently corrupt, and it tends to corrupt serious thought—or even simple clarity—the more frequently it gets used. We’re often told that people fear public speaking more than death, but the solution turns into a kind of living death for its listeners. And its most obvious manifestation is the dreaded bullet point, or, more accurately, the nested list, a visual cliché intended to create the impression of clear, logical, sequential thinking where none actually exists.


Bullet points aren’t anything new, as Richard Feynman learned back when he was wading through government documents as part of his investigation of the Challenger explosion: “Then we learned about ‘bullets’—little black circles in front of phrases that were supposed to summarize things. There was one after another of these little goddamn bullets in our briefing books and on slides.” But the way PowerPoint not only encourages but essentially forces users to structure their presentations as nested bullet points, no matter how incoherent the underlying argument, points to something truly insidious: the assumption that presentation counts for more than content, and that it’s fine to settle for shoddy, disorganized thinking as long as it follows the same set of stereotyped beats. As Ian Parker wrote in an excellent New Yorker piece on the subject from more than a decade ago:

But PowerPoint also has a private, interior influence. It edits ideas. It is, almost surreptitiously, a business manual as well as a business suit, with an opinion—an oddly pedantic, prescriptive opinion—about the way we should think. It helps you make a case, but it also makes its own case about how to organize information, how much information to organize, how to look at the world.

Parker cites the “Motivating a Team” template, which invites the anxious presenter to fill in a series of blanks—”Generate possible solutions with green light, nonjudgmental thinking”—and cheerfully advises: “Have an inspirational close.” A tool supposedly made for presentations, in short, shades imperceptibly into a formula for thought, based on the implicit premise that ideas themselves are interchangeable.

The templates that Parker is mocking here used to appear in a feature known as the AutoContent Wizard, which sounds a little like what a prolific blogger might want to be called in bed. In fact, it was a set of templates—including my favorite, “Communicating Bad News”—that amounted to a gesture of contempt toward everyone who creates or is subjected to such presentations, as Parker’s account of its creation makes clear:

AutoContent was added in the mid-nineties, when Microsoft learned that some would-be presenters were uncomfortable with a blank PowerPoint page—it was hard to get started. “We said, ‘What we need is some automatic content!'” a former Microsoft developer recalls, laughing. “‘Punch the button and you’ll have a presentation.'” The idea, he thought, was “crazy.” And the name was meant as a joke. But Microsoft took the idea and kept the name—a rare example of a product named in outright mockery of its target customers.

The AutoContent Wizard was phased out a few years ago, but even if we’re lucky enough not to work at a company at which we’re subjected to its successors, the attitudes behind it have expanded to cover much of the content that we consume on a daily basis. I’m speaking of the slideshow and the listicle, which have long since become the models for how online content can be profitably packaged and delivered. Tomorrow, I’ll be talking more about how PowerPoint gave birth to these secret children, how the assumptions they reflect come from the very top, and what it means to live in a world in which AutoContent, not content, is king.

Written by nevalalee

September 21, 2015 at 9:47 am

The Ive Mind



Like many readers, I spent much of yesterday working my way through Ian Parker’s massive New Yorker profile of Apple designer Jonathan Ive. Over the years, we’ve seen plenty of extended feature pieces on Ive, who somehow manages to preserve his reputation as an intensely private man, but it feels like Parker set out to write the one to end them all: it’s well over fifteen thousand words long, and there were times, as I watched my progress creeping slowly by in the scroll bar, when I felt like I was navigating an infinite loop of my own. (It also closes in that abrupt New Yorker way that takes apparent pride in ending articles at the most arbitrary place possible, as if the writer had suffered a stroke before finishing the last paragraph.) Still, it’s a fine piece, crammed with insights, and I expect that I’ll read it again. I’ve become slightly less enamored of Apple ever since my latest MacBook started to disintegrate a few months after I bought it—by which I mean its screws popped out one by one and its plastic casing began to bubble alarmingly outward—but there’s no doubting Ive’s vision, intelligence, and ability to articulate his ideas.

Like a lot of Apple coverage, Parker’s article builds on the company’s mythology while making occasional stabs at deflating it, with paragraphs of almost pornographic praise alternating with a skeptical sentence or two. (“I noticed that, at this moment in the history of personal technology, Cook still uses notifications in the form of a young woman appearing silently from nowhere to hold a sheet of paper in his line of sight.”) And he’s best not so much at talking about Apple’s culture as at talking about how they talk about it. Here’s my favorite part:

[Ive] linked the studio’s work to NASA’s: like the Apollo program, the creation of Apple products required “invention after invention after invention that you would never be conscious of, but that was necessary to do something that was new.” It was a tic that I came to recognize: self-promotion driven by fear that one’s self-effacement might be taken too literally. Even as Apple objects strive for effortlessness, there’s clearly a hope that the effort required—the “huge degree of care,” the years of investigations into new materials, the months spent enforcing cutting paths in Asian factories—will be acknowledged.


I love this because it neatly encapsulates the neurosis at the heart of so much creative work, from fiction to industrial design. We’re constantly told that we ought to strive for simplicity, and that the finished product, to use one of Ive’s favorite terms, should seem “inevitable.” Yet we’re also anxious that the purity of the result not be confused with the ease of its creation. Writers want readers to accept a novel as a window onto reality while simultaneously noticing the thousands of individual choices and acts of will that went into fashioning it, which is inherently impossible. And it kills us. Writing a novel is a backbreaking process that wants to look as simple as life, and that contradiction goes a long way toward explaining why authors never feel as if they’ve received enough love: the terms of the game that they’ve chosen ensure that most of their work remains invisible. Novels, even mediocre ones, consist of “invention after invention after invention,” a daunting series, as T.H. White noted, of “nouns, verbs, prepositions, adjectives, reeling across the page.” And even when a story all but begs us to admire the brilliance of its construction, we’ll never see more than a fraction of the labor it required.

So what’s a creative artist to do? Well, we can talk endlessly about process, as Ive does, and dream of having a profile in The Paris Review, complete with images of our discarded drafts. Or we can push complexity to the forefront, knowing at least that it will be acknowledged, even if it goes against what we secretly believe about the inevitability of great art. (“The artist, like the God of creation, remains within or behind or beyond or above his handiwork, invisible, refined out of existence, indifferent, paring his fingernails,” James Joyce writes, and yet few other authors have been so insistent that we recognize his choices, even within individual words.) Or, if all else fails, we can rail against critics who seem insufficiently appreciative of how much work is required to make something feel obvious, or who focus on some trivial point while ignoring the agonies that went into a story’s foundations. None of which, of course, prevents us from taking the exact same attitude toward works of art made by others. Ultimately, the only solution is to learn to live with your private store of effort, uncertainty, and compromise, never advertising it or pointing to all your hard work as an excuse when it falls short. Because in the end, the result has to stand on its own, even if it’s the apple of your eye.

Written by nevalalee

February 18, 2015 at 9:50 am
