Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘The New Yorker’

The men who sold the moonshot

When you ask Google whether we should build houses on the ocean, it gives you a bunch of results like these. If you ask Google X, the subsidiary within the company responsible for investigating “moonshot” projects like self-driving cars and space elevators, the answer that you get is rather different, as Derek Thompson reports in the cover story for this month’s issue of The Atlantic:

Like a think-tank panel with the instincts of an improv troupe, the group sprang into an interrogative frenzy. “What are the specific economic benefits of increasing housing supply?” the liquid-crystals guy asked. “Isn’t the real problem that transportation infrastructure is so expensive?” the balloon scientist said. “How sure are we that living in densely built cities makes us happier?” the extradimensional physicist wondered. Over the course of an hour, the conversation turned to the ergonomics of Tokyo’s high-speed trains and then to Americans’ cultural preference for suburbs. Members of the team discussed commonsense solutions to urban density, such as more money for transit, and eccentric ideas, such as acoustic technology to make apartments soundproof and self-driving housing units that could park on top of one another in a city center. At one point, teleportation enjoyed a brief hearing.

Thompson writes a little later: “I’d expected the team at X to sketch some floating houses on a whiteboard, or discuss ways to connect an ocean suburb to a city center, or just inform me that the idea was terrible. I was wrong. The table never once mentioned the words floating or ocean. My pitch merely inspired an inquiry into the purpose of housing and the shortfalls of U.S. infrastructure. It was my first lesson in radical creativity. Moonshots don’t begin with brainstorming clever answers. They start with the hard work of finding the right questions.”

I don’t know why Thompson decided to ask about “oceanic residences,” but I read this section of the article with particular interest, because about two years ago, I spent a month thinking about the subject intensively for my novella “The Proving Ground.” As I’ve described elsewhere, I knew early on in the process that it was going to be a story about the construction of a seastead in the Marshall Islands, which was pretty specific. There was plenty of background material available, ranging from general treatments of the idea in books like The Millennial Project by Marshall T. Savage—which had been sitting unread on my shelf for years—to detailed proposals for seasteads in the real world. The obvious source was The Seasteading Institute, a libertarian pipe dream funded by Peter Thiel that generated a lot of useful plans along the way, as long as you saw it as the legwork for a science fiction story, rather than as a project on which you were planning to actually spend fifty billion dollars. The difference between most of these proposals and the brainstorming session that Thompson describes is that they start with a floating city and then look for reasons to justify it. Seasteading is a solution in search of a problem. In other words, it’s science fiction, which often starts with a premise or setting that seems like it would lead to an exciting story and then searches for the necessary rationalizations. (The more invisible the process, the better.) And this can lead us to troubling places. As I’ve noted before, Thiel blames many of this country’s problems on “a failure of imagination,” and his nostalgia for vintage science fiction is rooted in a longing for the grand gestures that it embodied: the flying car, the seastead, the space colony. As he famously said six years ago to The New Yorker: “The anthology of the top twenty-five sci-fi stories in 1970 was, like, ‘Me and my friend the robot went for a walk on the moon,’ and in 2008 it was, like, ‘The galaxy is run by a fundamentalist Islamic confederacy, and there are people who are hunting planets and killing them for fun.'”

Google X isn’t immune to this tendency—Google Glass was, if anything, a solution in search of a problem—and some degree of science-fictional thinking is probably inherent to any such enterprise. In his article, Thompson doesn’t mention science fiction by name, but the whole division is clearly reminiscent of and inspired by the genre, down to the term “moonshot” and that mysterious letter at the end of its name. (Company lore claims that the “X” was chosen as “a purposeful placeholder,” but it’s hard not to think that it was motivated by the same impulse that gave us Dimension X, X Minus 1, Rocketship X-M, and even The X-Files.) In fact, an earlier article for The Atlantic looked at this connection in depth, and its conclusions weren’t altogether positive. Three years ago, in the same publication, Robinson Meyer quoted a passage from an article in Fast Company about the kinds of projects favored by Google X, but he drew a more ambivalent conclusion:

A lot of people might read that [description] and think: Wow, cool, Google is trying to make the future! But “science fiction” provides but a tiny porthole onto the vast strangeness of the future. When we imagine a “science fiction”-like future, I think we tend to picture completed worlds, flying cars, the shiny, floating towers of midcentury dreams. We tend, in other words, to imagine future technological systems as readymade, holistic products that people will choose to adopt, rather than as the assembled work of countless different actors, which they’ve always really been. The futurist Scott Smith calls these “flat-pack futures,” and they infect “science fictional” thinking.

He added: “I fear—especially when we talk about ‘science fiction’—that we miss the layeredness of the world, that many people worked to build it…Flying through space is awesome, but if technological advocates want not only to make their advances but to hold onto them, we had better learn the virtues of incrementalism.” (The contrast between Meyer’s skepticism and Thompson’s more positive take feels like a matter of access—it’s easier to criticize Google X’s assumptions when it’s being profiled by a rival magazine.)

But Meyer makes a good point, and science fiction’s mixed record at dealing with incrementalism is a natural consequence of its origins in popular fiction. A story demands a protagonist, which encourages writers to see scientific progress in terms of heroic figures. The early fiction of John W. Campbell returns monotonously to the same basic plot, in which a lone genius discovers atomic power and uses it to build a spaceship, drawing on the limitless resources of a wealthy and generous benefactor. As Isaac Asimov noted in his essay “Big, Big, Big”:

The thing about John Campbell is that he liked things big. He liked big men with big ideas working out big applications of their big theories. And he liked it fast. His big men built big weapons within days; weapons that were, moreover, without serious shortcomings, or at least, with no shortcomings that could not be corrected as follows: “Hmm, something’s wrong—oh, I see—of course.” Then, in two hours, something would be jerry-built to fix the jerry-built device.

This works well enough in pulp adventure, but after science fiction began to take itself seriously as prophecy, it fossilized into the notion that all problems can be approached as provinces of engineering and solved by geniuses working alone or in small groups. Elon Musk has been compared to Tony Stark, but he’s really the modern incarnation of a figure as old as The Skylark of Space, and the adulation that he still inspires shades into beliefs that are even less innocuous—like the idea that our politics should be entrusted to similarly big men. Writing of Google X’s Rapid Evaluation team, Thompson uses terms that would have made Campbell salivate: “You might say it’s Rapid Eval’s job to apply a kind of future-perfect analysis to every potential project: If this idea succeeds, what will have been the challenges? If it fails, what will have been the reasons?” Science fiction likes to believe that it’s better than average at this kind of forecasting. But it’s just as likely that it’s worse.

Written by nevalalee

October 11, 2017 at 9:02 am

The man with the plan

This month marks the twenty-fifth anniversary of the release of Reservoir Dogs, a film that I loved as much as just about every other budding cinephile who came of age in the nineties. Tom Shone has a nice writeup on its legacy in The New Yorker, and while I don’t agree with every point that he makes—he dismisses Kill Bill, which is a movie that means so much to me that I named my own daughter after Beatrix Kiddo—he has insights that can’t be ignored: “Quentin [Tarantino] became his worst reviews, rather in the manner of a boy who, falsely accused of something, decides that he might as well do the thing for which he has already been punished.” And there’s one paragraph that strikes me as wonderfully perceptive:

So many great filmmakers have made their debuts with heist films—from Woody Allen’s Take the Money and Run to Michael Mann’s Thief to Wes Anderson’s Bottle Rocket to Bryan Singer’s The Usual Suspects—that it’s tempting to see the genre almost as an allegory for the filmmaking process. The model it offers first-time filmmakers is thus as much economic as aesthetic—a reaffirmation of the tenet that Jean-Luc Godard attributed to D. W. Griffith: “All you need to make a movie is a girl and a gun.” A man assembles a gang for the implementation of a plan that is months in the rehearsal and whose execution rests on a cunning facsimile of midmorning reality going undetected. But the plan meets bumpy reality, requiring feats of improvisation and quick thinking if the gang is to make off with its loot—and the filmmaker is to avoid going to movie jail.

And while you could nitpick the details of this argument—Singer’s debut was actually Public Access, a movie that nobody, including me, has seen—it gets at something fundamental about the art of film, which lies at the intersection of an industrial process and a crime. I’ve spoken elsewhere about how Inception, my favorite movie of the last decade, maps the members of its mind heist neatly onto the crew of a motion picture: Cobb is the director, Saito the producer, Ariadne the set designer, Eames the actor, and Arthur is, I don’t know, the line producer, while Fischer, the mark, is a surrogate for the audience itself. (For what it’s worth, Christopher Nolan has stated that any such allegory was unconscious, although he seems to have embraced it after the fact.) Most of the directors whom Shone names are what we’d call auteur figures, and aside from Singer, all of them wear a writer’s hat, which can obscure the extent to which they depend on collaboration. Yet in their best work, it’s hard to imagine Singer without Christopher McQuarrie, Tarantino without editor Sally Menke, or Wes Anderson without Owen Wilson, not to mention the art directors, cinematographers, and other skilled craftsmen required to finish even the most idiosyncratic and personal movie. Just as every novel is secretly about the process of its own creation, every movie is inevitably about making movies, which is the life that its creators know most intimately. One of the most exhilarating things that a movie can do is give us a sense of the huddle between artists, which is central to the appeal of The Red Shoes, but also Mission: Impossible—Rogue Nation, in which Tom Cruise told McQuarrie that he wanted to make a film about what it was like for the two of them to make a film.

But there’s also an element of criminality, which might be even more crucial. I’m not the first person to point out that there’s something illicit in the act of watching images of other people’s lives projected onto a screen in a darkened theater—David Thomson, our greatest film critic, has built his career on variations on that one central insight. And it shouldn’t surprise us if the filmmaking process itself takes on aspects of something done in the shadows, in defiance of permits, labor regulations, and the orderly progression of traffic. (Werner Herzog famously advised aspiring directors to carry bolt cutters everywhere: “If you want to do a film, steal a camera, steal raw stock, sneak into a lab and do it!”) If your goal is to tell a story about putting together a team for a complicated project, it could be about the Ballet Lermontov or the defense of a Japanese village, and the result might be even greater. But it would lack the air of illegality on which the medium thrives, both in its dreamlife and in its practical reality. From the beginning, Tarantino seems to have sensed this. He’s become so famous for reviving the careers of neglected figures for the sake of the auras that they provide—John Travolta, Pam Grier, Robert Forster, Keith Carradine—that it’s practically become his trademark, and we often forget that he did it for the first time in Reservoir Dogs. Lawrence Tierney, the star of Dillinger and Born to Kill, had been such a menacing presence both onscreen and off that he was effectively banned from Hollywood after the forties, and he remained a terrifying presence even in old age. He terrorized the cast of Seinfeld during his guest appearance as Elaine’s father, and one of my favorite commentary tracks from The Simpsons consists of the staff reminiscing nervously about how much he scared them during the recording of “Marge Be Not Proud.”

Yet Tarantino still cast him as Joe Cabot, the man who sets up the heist, and Tierney rewarded him with a brilliant performance. Behind the scenes, it went more or less as you might expect, as Tarantino recalled much later:

Tierney was a complete lunatic by that time—he just needed to be sedated. We had decided to shoot his scenes first, so my first week of directing was talking with this fucking lunatic. He was personally challenging to every aspect of filmmaking. By the end of the week everybody on set hated Tierney—it wasn’t just me. And in the last twenty minutes of the first week we had a blowout and got into a fist fight. I fired him, and the whole crew burst into applause.

But the most revealing thing about the whole incident is that an untested director like Tarantino felt capable of taking on Tierney at all. You could argue that he already had an inkling of what he might become, but I’d prefer to think that he both needed and wanted someone like this to symbolize the last piece of the picture. Joe Cabot is the man with the plan, and he’s also the man with the money. (In the original script, Joe says into the phone: “Sid, stop, you’re embarrassing me. I don’t need to be told what I already know. When you have bad months, you do what every businessman in the world does, I don’t care if he’s Donald Trump or Irving the tailor. Ya ride it out.”) It’s tempting to associate him with the producer, but he’s more like a studio head, a position that has often drawn men whose bullying and manipulation is tolerated as long as they can make movies. When he wrote the screenplay, Tarantino had probably never met such a creature in person, but he must have had some sense of what was in store, and Reservoir Dogs was picked up for distribution by a man who fit the profile perfectly—and who never left Tarantino’s side ever again. His name was Harvey Weinstein.

Writing with scissors

Over the last few years, one of my great pleasures has been reading the articles on writing that John McPhee has been contributing on an annual basis to The New Yorker. I’ve written here about my reactions to McPhee’s advice on using the dictionary, on “greening” or cutting a piece by an arbitrary length, on structure, on frames of reference. Now his full book on the subject is here, Draft No. 4, and it’s arriving in my life at an opportune time. I’m wrapping up a draft of my own book, with two months to go before deadline, and I have a daunting set of tasks ahead of me—responding to editorial comments, preparing the notes and bibliography, wrestling the whole thing down to size. McPhee’s reasonable voice is a balm at such times, although he never minimizes the difficulty of the process itself, which he calls “masochistic, mind-fracturing self-enslaved labor,” even as he speaks of the writer’s “animal sense of being hunted.” And when you read Sam Anderson’s wonderful profile on McPhee in this week’s issue of The New York Times Magazine, it’s like listening to an old soldier who has been in combat so many times that everything that he says carries the weight of long experience. (Reading it, I was reminded a little of the film editor Walter Murch, whom McPhee resembles in certain ways—they look sort of alike, they’re both obsessed with structure, and they both seem to know everything. I was curious to see whether anyone else had made this connection, so I did a search for their names together on Google. Of the first five results, three were links from this blog.)

Anderson’s article offers us the portrait of a man who, at eighty-six, has done a better job than just about anyone else of organizing his own brain: “Each of those years seems to be filed away inside of him, loaded with information, ready to access.” I would have been equally pleased to learn that McPhee was as privately untidy as his writing is intricately patterned, but it makes sense that his interest in problems of structure—to which he returns endlessly—would manifest itself in his life and conversation. He’s interested in structure in the same way that the rest of us are interested in the lives of our own children. I never tire of hearing how writers deal with structural issues, and I find passages like the following almost pornographically fascinating:

The process is hellacious. McPhee gathers every single scrap of reporting on a given project—every interview, description, stray thought and research tidbit—and types all of it into his computer. He studies that data and comes up with organizing categories: themes, set pieces, characters and so on. Each category is assigned a code. To find the structure of a piece, McPhee makes an index card for each of his codes, sets them on a large table and arranges and rearranges the cards until the sequence seems right. Then he works back through his mass of assembled data, labeling each piece with the relevant code. On the computer, a program called “Structur” arranges these scraps into organized batches, and McPhee then works sequentially, batch by batch, converting all of it into prose. (In the old days, McPhee would manually type out his notes, photocopy them, cut up everything with scissors, and sort it all into coded envelopes. His first computer, he says, was “a five-thousand-dollar pair of scissors.”)
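(For anyone who thinks in code rather than index cards, the batching step that Anderson describes is easy to sketch. The fragment below is only a toy illustration in Python; the notes, codes, and function are invented for the example, and it is not McPhee’s actual program. But it captures the idea: every coded scrap goes into one pile per code, and the piles come out in the order the cards settled into on the table.)

# A toy illustration of the batching idea described above: each scrap of
# reporting carries a structural code, the codes are placed in the order
# the index cards ended up in, and every scrap is sorted into its batch.
# The notes, codes, and names here are invented; this is not Structur.

from collections import defaultdict

notes = [
    ("RAPIDS", "Interview: the river guide on reading white water."),
    ("GEOLOGY", "Research tidbit on the canyon's rock strata."),
    ("RAPIDS", "Description of the raft entering the rapid."),
    ("CHARACTERS", "Stray thought about the trip's organizer."),
]

card_order = ["CHARACTERS", "GEOLOGY", "RAPIDS"]  # the arrangement on the table

def batch_notes(notes, card_order):
    """Group every coded scrap into one batch per code, in card order."""
    batches = defaultdict(list)
    for code, text in notes:
        batches[code].append(text)
    return [(code, batches[code]) for code in card_order]

for code, scraps in batch_notes(notes, card_order):
    print(code)
    for scrap in scraps:
        print("  -", scrap)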

Anderson writes: “[McPhee] is one of the world’s few remaining users of a program called Kedit, which he writes about, at great length, in Draft No. 4.” The phrase “at great length” excites me tremendously—I’m at a point in my life where I’d rather hear about a writer’s favorite software program than his or her inspirational  thoughts on creativity—and McPhee’s process doesn’t sound too far removed from the one that I’ve worked out for myself. As I read it, though, I found myself thinking in passing of what might be lost when you move from scissors to a computer. (Scissors appear in the toolboxes of many of the writers and artists I admire. In The Elements of Style, E.B. White advises: “Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.” In The Silent Clowns, Walter Kerr describes the narrative challenges of filmmaking in the early fifties and concludes: “The problem was solved, more or less, with a scissors.” And Paul Klee once wrote in his diary: “What I don’t like, I cut away with the scissors.”) But McPhee isn’t sentimental about the tools themselves. In Anderson’s profile, the New Yorker editor David Remnick, who took McPhee’s class at Princeton, recalls: “You were in the room with a craftsman of the art, rather than a scholar or critic—to the point where I remember him passing around the weird mechanical pencils he used to use.” Yet there’s no question in my mind that McPhee would drop that one brand of pencil if he found one that he thought was objectively better. As soon as he had Kedit, he got rid of the scissors. When you’re trying to rethink structure from the ground up, you don’t have much time for nostalgia.

And when McPhee explains the rationale behind his methods, you can hear the pragmatism of fifty years of hard experience:

If this sounds mechanical, its effect was absolutely the reverse. If the contents of the seventh folder were before me, the contents of twenty-nine other folders were out of sight. Every organizational aspect was behind me. The procedure eliminated nearly all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.

This amounts to an elaboration of what I’ve elsewhere called my favorite piece of writing advice, which David Mamet offers in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

Mamet might as well have come out of the same box as Walter Murch and McPhee, which implies that I have a definite type when it comes to looking for advice. And what they all have in common, besides the glasses and beard, is the air of having labored at a craft for decades, survived, and returned to tell the tale. Of the three, McPhee’s career may be the most enviable of all, if only because he spent it in Princeton, not Hollywood. It’s nice to be able to structure an essay. The tricky part is structuring a life.

Talking the Talk

A few days ago, while reading Adam Begley’s biography of John Updike, I came across the following passage about William Shawn, the legendary editor of The New Yorker:

Nowadays Shawn is nearly as famous for his oddities as for his editorial prowess. The catalog of his phobias and behavioral tics, the intrigue (especially his decades-long office romance with Lillian Ross, which was meant to be a deep, deep secret and became, with the passage of time, merely the obvious but unmentionable status quo), the passive-aggressive manipulation of colleagues and contributors, the velvet tenacity of his grip on power…it’s all almost enough to make us forget the astonishing success with which he steered the magazine.

Earlier this week, Lillian Ross passed away at the age of ninety-nine. Her personal life, like Shawn’s, often received more attention than her professional accomplishments, and her obituaries predictably devoted a lot of space to their affair, which might have chagrined but not surprised her. In an era when celebrity journalists like Norman Mailer and Gay Talese were on the ascendant, she cautioned reporters against placing themselves at the center of the story—although she also wrote a late memoir of her life with Shawn, Here But Not Here, that caused a firestorm of controversy within its tiny world when it was released two decades ago. In his New York Times review, Charles McGrath called it “a tactless example of the current avidity for tell-all confessions,” and it struck many readers as an odd departure for a reporter who had been complimented for her ability to fade into the background. And while its title sounded like a motto for objective reporting, it actually came from something that Shawn—whom Updike later praised for his “disinterested standards”—liked to say about his home life: “I am there, but I am not there.”

But Ross, Shawn, and their magazine entered the inner lives of their readers in ways that transcended the efforts of reporters who asked more insistently for our attention. In her book Reporting, Ross offered her personal rules for conducting journalism:

Reporting is difficult, partly because the writer does not have the leeway to play around with the lives of people, as he does in fiction. There are many other restrictions, too…Your attention at all times should be on your subject, not on you. Do not call attention to yourself. As a reporter, serve your subject, do not serve yourself. Do not, in effect, say, “Look at me. See what a great reporter I am!” Do not, if you want to reveal that the Emperor is not wearing any clothes, write, “I am showing that the Emperor is already naked.”

A few more admonitions: do not promote yourself; do not advertise yourself; do not sell yourself. If you have a tendency to do these things, you should go into some line of work that may benefit from your talents as a promoter, a salesman, or an actor. Too many extraneous considerations have been imposed on reporting in recent years, and it is time now to ask writers who would be reporters to report.

Decades later, in speaking of her reputation as a fly on the wall, Ross struck a rather different note: “What craziness! A reporter doing a story can’t pretend to be invisible, let alone a fly; he or she is seen and heard and responded to by the people he or she is writing about. A reporter is always chemically involved in a story.”

Ross might sound like she’s contradicting herself, but I don’t think that she is. It helps to focus on the words “chemically involved,” which makes reporting sound like an industrial process—which, in the hands of Shawn’s writers, including Ross and Updike, is what it became. A recent tribute describes Ross as “an early architect” of the Talk of the Town section, which puts her at the center of a certain way of viewing the world. The Talk of the Town has always been characterized less by any particular subject than by its voice, which Begley capably evokes in an account of one of Updike’s early pieces, in which he visited a lawn care business in Southampton:

The resulting journalistic trifle is mildly amusing and fairly typical of The Talk of the Town, save for the exurban expedition…The reporter (“we,” by hallowed New Yorker convention) gathers a comically copious amount of information about the product, allows its makers to display a comical commercial enthusiasm, and adds to the comedy by appearing (almost) to share that enthusiasm.

In this case, the product was a lawn treatment that dyed the grass green, but The Talk of the Town remains the magazine’s place to accommodate more famous subjects who have something to promote. Its stance toward such material allows its interviewees to plug film or book projects while keeping them at a bemused distance, and a lot of it hinges on that remarkable “we.” (It’s the counterpart of the “you” that appears so often in its movie reviews.) Updike gently mocked it years later: “Who, after all, could that indefatigably fascinated, perpetually peripatetic ‘we’ be but a collection of dazzled farm-boys?” But it’s still our ideal of a certain kind of nonfiction—privileged, lightly ironic, with dashes of surprising insight that don’t prevent you from turning the page.

Ross was one of the inventors of that voice, which was the chemical trick that she used to dissolve herself into a story. It allowed trivial pieces to be rapidly produced, while also allowing for deeper engagement when the opportunity presented itself. (To push the analogy from Updike’s article to the breaking point, it was “the desired combination of a dye that would immediately color the lawn and a fertilizer that would eventually rejuvenate it.”) And much of the success of The New Yorker lay in the values that its readers projected onto that “we.” As Begley describes the characters in Updike’s story “Incest”:

The young couple…are college educated, living in a small, pleasant New York apartment furnished with bamboo chairs, a modernist sofa, a makeshift bed, bookshelves filled with books. They’re familiar with Proust and Freud and the pediatric pronouncements of Dr. Benjamin Spock…Jane sips vermouth after dinner, listening to Bach on the record player while she reads The New Republic—if the story hadn’t been intended for publication in The New Yorker, surely she would have been reading that magazine instead.

Norman Mailer, a New Journalist who actually published a collection titled Advertisements for Myself, was dismissive of the magazine’s hold on its readers: “Hundreds of thousands, perhaps millions of people in the most established parts of the middle class kill their quickest impulses before they dare to act in such a way as to look ridiculous to the private eye of their taste whose style has been keyed by the eye of The New Yorker.” He’s speaking of The Talk of the Town, as refined by Ross and Shawn, and it’s still true today. Updike made fun of that “we” because he could—but for many readers, then and now, the grass on that side was definitely greener.

Their brand is crisis

A while back, The New Yorker ran an engaging piece by John Colapinto about the branding firm Lexicon, which specializes in coming up with product names for corporate clients. It was published nearly six years ago, but it’s stuck in my head, after so many other articles have faded, in part because the work of Lexicon—which has named such brands as BlackBerry, Pentium, PowerBook, and Dasani—feels like a distillation of what writers and artists do all the time. It’s hard enough to write a distinctive slogan or jingle, but trying to evoke so much in a single word, like Swiffer, resembles a form of black magic. (I’m reminded of the protagonist of John Barth’s novel The Tidewater Tales, who keeps cutting down a short story until it consists of nothing but the word “olive.”) But there’s a science to it as well. Colapinto writes:

Lexicon employs two in-house linguists and consults with seventy-seven others around the world, specialists in languages as diverse as Urdu, Tagalog, and Hindi—a critical resource, [founder David] Placek says. They screen names for embarrassing associations. (The industry abounds in tales of cross-linguistic gaffes, like Creap coffee creamer from Japan, Bum potato chips from Spain, and the Chevy Nova—in Spanish, the “no go.”) They also offer input on the unconscious resonance of particular sounds. In the mid-nineties, Lexicon funded a linguistic study whose results suggested that the sound of the letter “b” was one of the most “reliable” in any language—“whether you were in Poland or Paris or New York,” Placek said. He mentioned this to the Research in Motion executives, and they decided to capitalize both “b”s: BlackBerry.

Yesterday, a story broke about another brand that starts with a “b.” Bodega, a startup that has raised millions of dollars in venture investment, inspired a flurry of online rage after Fast Company published an article with the headline “Two Ex-Googlers Want To Make Bodegas And Mom-And-Pop Corner Stores Obsolete.” The profile, which was responsibly reported and written by Elizabeth Segran, summarizes the company’s pitch as follows:

Bodega sets up five-foot-wide pantry boxes filled with non-perishable items you might pick up at a convenience store. An app will allow you to unlock the box and cameras powered with computer vision will register what you’ve picked up, automatically charging your credit card…Bodega’s logo is a cat, a nod to the popular bodega cat meme on social media—although if the duo gets their way, real felines won’t have brick-and-mortar shops to saunter around and take naps in much longer.    

There are obvious problems here, both on the practical side and on the level of what we’ve somehow agreed to call “optics,” and they’ve been capably pointed out by others. But the company’s name, which appropriates a term for corner stores in urban areas often owned by immigrants, didn’t help. As Segran relates:

I asked [founder Paul McDonald] point-blank about whether he’s worried that the name Bodega might come off as culturally insensitive. Not really. “I’m not particularly concerned about it,” he says. “We did surveys in the Latin American community to understand if they felt the name was a misappropriation of that term or had negative connotations, and 97% said ‘no.’ It’s a simple name and I think it works.”

When I first read that quote, shortly before the firestorm broke, I thought of the famous line from Fargo: “I’m not sure I agree with you a hundred percent on your police work there.” It seems safe to say that if you feel obliged to check whether or not your brand name is a “misappropriation,” you’re probably better off not using it, and that the three percent of respondents who find it objectionable might cause you a disproportionate amount of trouble. (Focusing on “the Latin American community” also overlooks the fact that many people are more than willing to be offended on behalf of others.) In an apologetic post that was published late yesterday, McDonald asked rhetorically: “Is it possible we didn’t fully understand what the reaction to the name would be?” He answered himself:

Yes, clearly. The name Bodega sparked a wave of criticism on social media far beyond what we ever imagined. When we first came up with the idea to call the company Bodega we recognized that there was a risk of it being interpreted as misappropriation. We did some homework — speaking to New Yorkers, branding people, and even running some survey work asking about the name and any potential offense it might cause. But it’s clear that we may not have been asking the right questions of the right people.

Personally, I’d be curious to know which “branding people” they consulted, and whether they were seduced by the “reliability” of the letter “b,” or by the word’s “consonant-vowel-consonant pattern,” which, as Colapinto notes, is “among the first that infants learn in any language.”

Whatever the process was, the result was that Bodega ended up with just about the worst name that it could possibly have chosen. Its business model has other issues that make it unlikely that it could pose a threat to anyone, much less one’s favorite corner store, but it could easily have positioned itself to make it seem that it was targeting big chain drugstores, not independent businesses. Instead, it chose a name that was like a torpedo aimed at itself. It was a self-inflicted wound, and you could argue that the editors of Fast Company were ready with almost unseemly glee to ram the dagger home. Yet it was bound to happen sooner or later, and the real question is why none of Bodega’s investors raised concerns about it at any stage. You could say, quite reasonably, that the culprit was the lack of diverse voices in technology and finance, but I suspect that something else was involved. The founders were clearly aware of the potential for trouble, but they were so in love with their name and logo that they ignored it. It was worse than a sin—it was a mistake. And if they’re penalized for it, it shouldn’t be for being offensive, but for being bad at what they were supposed to be doing. As Colapinto writes:

Placek said that it can be dangerous to become too programmatic about what he calls “tactical” aspects of naming. The real goal, he says, is to determine what “story” a client wishes to tell about his product (it’s faster, it’s more powerful, it’s easier to use) and then find a word that evokes it, without being predictable or even necessarily logical.

For better or worse, “Bodega” was definitely a name that told a story. And it ended up saying more about its founders than they probably would have liked.

Written by nevalalee

September 14, 2017 at 9:20 am

Updike’s ladder

In the latest issue of The Atlantic, the author Anjali Enjeti has an article titled “Why I’m Still Trying to Get a Book Deal After Ten Years.” If just reading those words makes your palms sweat and puts your heart through a few sympathy palpitations, congratulations—you’re a writer. No matter where you might be in your career, or what length of time you can mentally insert into that headline, you can probably relate to Enjeti when she writes:

Ten years ago, while sitting at my computer in my sparsely furnished office, I sent my first email to a literary agent. The message included a query letter—a brief synopsis describing the personal-essay collection I’d been working on for the past six years, as well as a short bio about myself. As my third child kicked from inside my pregnant belly, I fantasized about what would come next: a request from the agent to see my book proposal, followed by a dream phone call offering me representation. If all went well, I’d be on my way to becoming a published author by the time my oldest child started first grade.

“Things didn’t go as planned,” Enjeti says drily, noting that after landing and leaving two agents, she’s been left with six unpublished manuscripts and little else to show for it. She goes on to share the stories of other writers in the same situation, including Michael Bourne of Poets & Writers, who accurately calls the submission process “a slow mauling of my psyche.” And Enjeti wonders: “So after sixteen years of writing books and ten years of failing to find a publisher, why do I keep trying? I ask myself this every day.”

It’s a good question. As it happens, I came across her article while reading the biography Updike by Adam Begley, which chronicles a literary career that amounts to the exact opposite of the ones described above. Begley’s account of John Updike’s first acceptance from The New Yorker—just months after his graduation from Harvard—is like lifestyle porn for writers:

He never forgot the moment when he retrieved the envelope from the mailbox at the end of the drive, the same mailbox that had yielded so many rejection slips, both his and his mother’s: “I felt, standing and reading the good news in the midsummer pink dusk of the stony road beside a field of waving weeds, born as a professional writer.” To extend the metaphor…the actual labor was brief and painless: he passed from unpublished college student to valued contributor in less than two months.

If you’re a writer of any kind, you’re probably biting your hand right now. And I haven’t even gotten to what happened to Updike shortly afterward:

A letter from Katharine White [of The New Yorker] dated September 15, 1954 and addressed to “John H. Updike, General Delivery, Oxford,” proposed that he sign a “first-reading agreement,” a scheme devised for the “most valued and most constant contributors.” Up to this point, he had only one story accepted, along with some light verse. White acknowledged that it was “rather unusual” for the magazine to make this kind of offer to a contributor “of such short standing,” but she and Maxwell and Shawn took into consideration the volume of his submissions…and their overall quality and suitability, and decided that this clever, hard-working young man showed exceptional promise.

Updike was twenty-two years old. Even now, more than half a century later and with his early promise more than fulfilled, it’s hard to read this account without hating him a little. Norman Mailer—whose debut novel, The Naked and the Dead, appeared when he was twenty-five—didn’t pull any punches in “Some Children of the Goddess,” an essay on his contemporaries that was published in Esquire in 1963: “[Updike’s] reputation has traveled in convoy up the Avenue of the Establishment, The New York Times Book Review, blowing sirens like a motorcycle caravan, the professional muse of The New Yorker sitting in the Cadillac, membership cards to the right Fellowships in his pocket.” And Begley, his biographer, acknowledges the singular nature of his subject’s rise:

It’s worth pausing here to marvel at the unrelieved smoothness of his professional path…Among the other twentieth-century American writers who made a splash before their thirtieth birthday…none piled up accomplishments in as orderly a fashion as Updike, or with as little fuss…This frictionless success has sometimes been held against him. His vast oeuvre materialized with suspiciously little visible effort. Where there’s no struggle, can there be real art? The Romantic notion of the tortured poet has left us with a mild prejudice against the idea of art produced in a calm, rational, workmanlike manner (as he put it, “on a healthy basis of regularity and avoidance of strain”), but that’s precisely how Updike got his start.

Begley doesn’t mention that the phrase “regularity and avoidance of strain” is actually meant to evoke the act of defecation, but even this provides us with an odd picture of writerly contentment. As Dick Hallorann says in The Shining, the best movie about writing ever made: “You got to keep regular, if you want to be happy.”

If there’s a larger theme here, it’s that the qualities that we associate with Updike’s career—with its reliable production of uniform hardcover editions over the course of five decades—are inseparable from the “orderly” circumstances of his rise. Updike never lacked a prestigious venue for his talents, which allowed him to focus on being productive. Writers whose publication history remains volatile and unpredictable, even after they’ve seen print, don’t always have the luxury of being so unruffled, and it can affect their work in ways that are almost subliminal. (A writer can’t survive ten years of waiting for a book deal without spending the entire time convinced that he or she is on the verge of a breakthrough, anticipating an ending that never comes, which may partially explain the literary world’s fondness for frustration and unresolved narratives.) The short answer to Begley’s question is that struggle is good for a writer, but so is success, and you take what you can get, even if you’re transformed by it. I seem to think on a monthly basis of what Nicholson Baker writes of Updike in his tribute U and I:

I compared my awkward public self-promotion too with a documentary about Updike that I saw in 1983, I believe, on public TV, in which, in one scene, as the camera follows his climb up a ladder at his mother’s house to put up or take down some storm windows, in the midst of this tricky physical act, he tosses down to us some startlingly lucid little felicity, something about “These small yearly duties which blah blah blah,” and I was stunned to recognize that in Updike we were dealing with a man so naturally verbal that he could write his fucking memoirs on a ladder!

We’re all on that ladder. Some are on their way up, some are headed down, and some are stuck for years on the same rung. But you never get anywhere if you don’t try to climb.

Broyles’s Law and the Ken Burns effect

For most of my life as a moviegoer, I’ve followed a rule that has served me pretty well. Whenever the director of a documentary narrates the story in the first person, or, worse, appears on camera, I start to get suspicious. I’m not talking about movies like Roger and Me or even the loathsome Catfish, in which the filmmakers, for better or worse, are inherently part of the action, but about films in which the director inserts himself into the frame for no particular reason. Occasionally, I can forgive this, as I did with the brilliant The Cove, but usually, I feel a moment of doubt whenever the director’s voiceover begins. (In its worst form, it opens the movie with a redundant narration: “I first came across the story that you’re about to hear in the summer of 1990…”) But while I still think that this is a danger sign, I’ve recently concluded that I was wrong about why. I had always assumed that it was a sign of ego—that these directors were imposing themselves on a story that was really about other people, because they thought that it was all about them. In reality, it seems more likely that it’s a solution to a technical problem. What happens, I think, is that the director sits down to review his footage and discovers that it can’t be cut together as a coherent narrative. Perhaps there are crucial scenes or beats missing, but the events that the movie depicts are long over, or there’s no budget to go back and shoot more. An interview might bridge the gaps, but maybe this isn’t logistically feasible. In the end, the director is left with just one person who is available to say all the right things on the soundtrack to provide the necessary transitions and clarifications. It’s himself. In a perfect world, if he had gotten the material that he needed, he wouldn’t have to be in his own movie at all, but he doesn’t have a choice. It isn’t a failure of character, but of technique, and the result ends up being much the same.

I got to thinking about this after reading a recent New Yorker profile by Ian Parker of the documentarian Ken Burns, whose upcoming series on the Vietnam War is poised to become a major cultural event. The article takes an irreverent tone toward Burns, whose cultural status encourages him to speechification in private: “His default conversational setting is Commencement Address, involving quotation from nineteenth-century heroes and from his own previous commentary, and moments of almost rhapsodic self-appreciation. He is readier than most people to regard his creative decisions as courageous.” But Parker also shares a fascinating anecdote about which I wish I knew more:

In the mid-eighties, Burns was working on a deft, entertaining documentary about Huey Long, the populist Louisiana politician. He asked two historians, William Leuchtenburg and Alan Brinkley, about a photograph he hoped to use, as a part of the account of Long’s assassination; it showed him protected by a phalanx of state troopers. Brinkley told him that the image might mislead; Long usually had plainclothes bodyguards. Burns felt thwarted. Then Leuchtenburg spoke. He’d just watched a football game in which Frank Broyles, the former University of Arkansas coach, was a commentator. When the game paused to allow a hurt player to be examined, Broyles explained that coaches tend to gauge the seriousness of an injury by asking a player his name or the time of day; if he can’t answer correctly, it’s serious. As Burns recalled it, Broyles went on, “But, of course, if the player is important to the game, we tell him what his name is, we tell him what time it is, and we send him back in.”

Hence Broyles’s Law: “If it’s super-important, if it’s working, you tell him what his name is, and you send him back into the game.” Burns decided to leave the photo in the movie. Parker continues:

Was this, perhaps, a terrible law? Burns laughed. “It’s a terrible law!” But, he went on, it didn’t let him off the hook, ethically. “This would be Werner Herzog’s ‘ecstatic truth’—‘I can do anything I want. I’ll pay the town drunk to crawl across the ice in the Russian village.’” He was referring to scenes in Herzog’s Bells from the Deep, which Herzog has been happy to describe, and defend, as stage-managed. “If he chooses to do that, that’s okay. And then there are other people who’d rather do reenactments than have a photograph that’s vague.” Instead, Burns said, “We do enough research that we can pretty much convince ourselves—in the best sense of the word—that we’ve done the honorable job.”

The reasoning in this paragraph is a little muddled, but Burns seems to be saying that he isn’t relying on “the ecstatic truth” of Herzog, who blurs the line between fiction and reality, or the reenactments favored by Errol Morris, who sometimes seems to be making a feature film interspersed with footage of talking heads. Instead, Burns is assembling a narrative solely out of primary sources, and if an image furthers the viewer’s intellectual understanding or emotional engagement, it can be included, even if it isn’t strictly accurate. These are the compromises that you make when you’re determined to use nothing but the visuals that you have available, and you trust in your understanding of the material to tell whether or not you’ve made the “honorable” choice.

On some level, this is basically what every author of nonfiction has to consider when assembling sources, which involves countless judgment calls about emphasis, order, and selection, as I’ve discussed here before. But I’m more interested in the point that this emerges from a technical issue inherent to the form of the documentary itself, in which the viewer always has to be looking at something. When the perfect image isn’t available, you have a few different options. You can ignore the problem; you can cut to an interview subject who tells the viewers about what they’re not seeing; or you can shoot a reenactment. (Recent documentaries seem to lean heavily on animation, presumably because it’s cheaper and easier to control in the studio.) Or, like Burns, you can make do with what you have, because that’s how you’ve defined the task for yourself. Burns wants to use nothing but interviews, narration, and archival materials, and the technical tricks that we’ve come to associate with his style—like the camera pan across photos that Apple actually calls the Ken Burns effect—arise directly out of those constraints. The result is often brilliant, in large part because Burns has no choice but to think hard about how to use the materials that he has. Broyles’s Law may be “terrible,” but it’s better than most of the alternatives. Burns has the luxury of big budgets, a huge staff, and a lot of time, which allows him to be fastidious about his solutions to such problems. But a desperate documentary filmmaker, faced with no money and a hole in the story to fill, may have no other recourse than to grab a microphone, sit down in the editing bay, and start to speak: “I first came across the story that you’re about to hear in the summer of 1990…”

Written by nevalalee

September 11, 2017 at 9:12 am
