Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘The New Yorker’

The castle on the keyboard


In March, the graphic artist Susan Kare, who is best known for designing the fonts and icons for the original Apple Macintosh, was awarded a medal of recognition from the professional organization AIGA. It occurred to me to write a post about her work, but when I opened a gallery of her designs, I found myself sidetracked by an unexpected sensation. I felt happy. Looking at those familiar images—the Paintbrush, the Trash Can, even the Bomb—brought me as close as I’ve come in a long time to what Proust describes after taking a bite of the madeleine in the first volume of In Search of Lost Time:

Just as the Japanese amuse themselves by filling a porcelain bowl with water and steeping in it little crumbs of paper which until then are without character or form, but, the moment they become wet, stretch themselves and bend, take on color and distinctive shape, become flowers or houses or people, permanent and recognizable, so in that moment all the flowers in our garden…and the good folk of the village and their little dwellings and the parish church and the whole of Combray and of its surroundings, taking their proper shapes and growing solid, sprang into being, town and gardens alike, from my cup of tea.

In my case, it wasn’t a physical location that blossomed into existence, but a moment in my life that I’ve tried repeatedly to evoke here before. I was in my early teens, which isn’t a great period for anyone, and I can’t say that I was content. But for better or worse, I was becoming whatever I was supposed to be, and throughout much of that process, Kare’s icons provided the inescapable backdrop.

You could argue that nostalgia for computer hardware is a fairly recent phenomenon that will repeat itself in later generations, with children who are thirteen or younger today feeling equally sentimental toward devices that their parents regard with indifference—and you might be right. But I think that Kare’s work is genuinely special in at least two ways. One is that it’s a hallmark of perhaps the last time in history when a personal computer could feel like a beguiling toy, rather than an indispensable but utilitarian part of everyday life. The other is that her icons, with their handmade look and origins, bear the impression of another human being’s personality in ways that would all but disappear within a few years. As Alexandra Lange recounts in a recent profile of Kare:

In 1982, [Kare] was a sculptor and sometime curator when her high-school friend Andy Hertzfeld asked her to create graphics for a new computer that he was working on in California. Kare brought a Grid notebook to her job interview at Apple Computer. On its pages, she had sketched, in pink marker, a series of icons to represent the commands that Hertzfeld’s software would execute. Each square represented a pixel. A pointing finger meant “Paste.” A paintbrush symbolized “MacPaint.” Scissors said “Cut.” Kare told me about this origin moment: “As soon as I started work, Andy Hertzfeld wrote an icon editor and font editor so I could design images and letterforms using the Mac, not paper,” she said. “But I loved the puzzle-like nature of working in sixteen-by-sixteen and thirty-two-by-thirty-two pixel icon grids, and the marriage of craft and metaphor.”

That same icon editor, or one of its successors, was packaged with the Mac that I used, and I vividly remember clicking on that grid myself, shaping the building blocks of the interface in a way that seems hard to imagine now.

And Kare seems to have valued these aspects of her work even at the time. There’s a famous series of photos of her in a cubicle at Apple in 1984, leaning back in her chair with one New Balance sneaker propped against her desk, looking impossibly cool. In one of the pictures, if you zoom in on the shelf of books behind her, it’s possible to make out a few titles, including the first edition of Symbol Sourcebook by Henry Dreyfuss, with an introduction by none other than R. Buckminster Fuller. Kare has spoken highly of this book elsewhere, most notably in an interview with Alex Pang of Stanford, to whom she explained:

One of my favorite parts of the book is its list of hobo signals, that hobos used to contact each other when they were on the road. They look like they’re in chalk on stones…When you’re desperate for an idea—some icons, like the piece of paper, are no problem; but others defy the visual, like “undo”—you look at things like hobo signs. Like this: “Man with a gun lives here.” Now, I can’t say that anything in this book is exactly transported into the Macintosh interface, but I think I got a lot of help from this, just thinking. This kind of symbol appeals to me because it had to be really simple, and clear to a group of people who were not going to be studying these for years in academia. I don’t understand a lot of them—“These people are rich” is a top hat and a triangle—but I always had that at Apple. I still use it, and I’m grateful for it.

And it seems likely that this was the “symbol dictionary” in which Kare discovered the Bowen Knot, a symbol once used to indicate “interesting features” at Swedish campgrounds, which lives on as the Command icon on the Mac.

According to Kare, the Bowen Knot originally represented a castle with four turrets, and if you’re imaginative enough, you can imagine it springing into being from the keys to either side of the space bar, like the village from Proust’s teacup. Like the hobo signs, Kare’s icons are a system of signals left to those who might pass by in the future, and the fact that they’ve managed to survive at Apple in even a limited way is something of a miracle in itself. (As the tech journalist Mike Murphy recently wrote: “For whatever reason, Apple looks and acts far more like a luxury brand than a consumer-technology brand in 2018.” And there isn’t much room in that business for castles or hobo signs.) When you click through the emulated versions of the earliest models of the Macintosh on the Internet Archive, it can feel like a temporary return to those values, or like a visit to a Zen garden. Yet if we only try to recapture it, we miss the point. Toward the end of In Search of Lost Time, Proust experiences a second moment of revelation, when he stumbles in a courtyard and catches himself “on a flagstone lower than the one next it,” which reminds him of a similar sensation that he had once felt at the Baptistry of St. Mark in Venice. And what he says of this flash of insight reminds me of how I feel when I look at the Happy Mac, and all the possibilities that it once seemed to express:

As at the moment when I tasted the madeleine, all my apprehensions about the future, all my intellectual doubts, were dissipated. Those doubts which had assailed me just before, regarding the reality of my literary gifts and even regarding the reality of literature itself were dispersed as though by magic…Merely repeating the movement was useless; but if…I succeeded in recapturing the sensation which accompanied the movement, again the intoxicating and elusive vision softly pervaded me, as though it said, “Grasp me as I float by you, if you can, and try to solve the enigma of happiness I offer you.”

Written by nevalalee

June 15, 2018 at 8:50 am

The president is collaborating


Last week, Bill Clinton and James Patterson released their collaborative novel The President Is Missing, which has already sold something like a quarter of a million copies. Its publication was heralded by a lavish two-page spread in The New Yorker, with effusive blurbs from just about everyone whom a former president and the world’s bestselling author might be expected to get on the phone. (Lee Child: “The political thriller of the decade.” Ron Chernow: “A fabulously entertaining thriller.”) If you want proof that the magazine’s advertising department is fully insulated from its editorial side, however, you can just point to the fact that the task of reviewing the book itself was given to Anthony Lane, who doesn’t tend to look favorably on much of anything. Lane’s style—he has evidently never met a smug pun or young starlet he didn’t like—can occasionally turn me off from his movie reviews, but I’ve always admired his literary takedowns. I don’t think a month goes by that I don’t remember his writeup of the New York Times bestseller list for May 15, 1994, which allowed him to tackle the likes of The Bridges of Madison County, The Celestine Prophecy, and especially The Day After Tomorrow by Allan Folsom, from which he quoted a sentence that permanently changed my view of such novels: “Two hundred European cities have bus links with Frankfurt.” But he seems to have grudgingly liked The President Is Missing. If nothing else, he furnishes a backhanded compliment that has already been posted, hilariously out of context, on Amazon: “If you want to make the most of your late-capitalist leisure-time, hit the couch, crack a Bud, punch the book open, focus your squint, and enjoy.”

The words “hit the couch, crack a Bud, punch the book open, [and] focus your squint” are all callbacks to samples of Patterson’s prose that Lane quotes in the review, but the phrase “late-capitalist leisure-time” might require some additional explanation. It’s a reference to the paper “Structure over Style: Collaborative Authorship and the Revival of Literary Capitalism,” which appeared last year in Digital Humanities Quarterly, and I’m grateful to Lane for bringing it to my attention. The authors, Simon Fuller and James O’Sullivan, focus on the factory model of novelists who employ ghostwriters to boost their productivity, and their star exhibit is Patterson, to whom they devote the same kind of computational scrutiny that has previously uncovered traces of collaboration in Shakespeare. Not surprisingly, it turns out that Patterson doesn’t write most of the books that he ostensibly coauthors. (He may not even have done much of the writing on 1st to Die, which credits him as the sole writer.) But the paper is less interesting for its quantitative analysis than for its qualitative evaluation of what Patterson tells us about how we consume and enjoy fiction. For instance:

The form of [Patterson’s] novels also appears to be molded by contemporary experience. In particular, his work is perhaps best described as “commuter fiction.” Nicholas Paumgarten describes how the average time for a commute has significantly increased. As a result, reading has increasingly become one of those pursuits that can pass the time of a commute. For example, a truck driver describes how “he had never read any of Patterson’s books but that he had listened to every single one of them on the road.” A number of online reader reviews also describe Patterson’s writing in terms of their commutes…With large print, and chapters of two or three pages, Patterson’s works are constructed to fit between the stops on a metro line.

Of course, you could say much the same of many thrillers, particularly the kind known as the airport novel, which wasn’t just a book that you read on planes—at its peak, it was one in which many scenes took place in airports, which were still associated with glamor and escape. What sets Patterson apart from his peers is his ability to maintain a viable brand while publishing a dozen books every year. His productivity is inseparable from his use of coauthors, but he wasn’t the first. Fuller and O’Sullivan cite the case of Alexandre Dumas, who allegedly boasted of having written four hundred novels and thirty-five plays that had created jobs for over eight thousand people. And they dig up a remarkable quote from The German Ideology by Karl Marx and Friedrich Engels, who “favorably compare French popular fiction to the German, paying particular attention to the latter’s appropriation of the division of labor”:

In proclaiming the uniqueness of work in science and art, [Max] Stirner adopts a position far inferior to that of the bourgeoisie. At the present time it has already been found necessary to organize this “unique” activity. Horace Vernet would not have had time to paint even a tenth of his pictures if he regarded them as works which “only this Unique person is capable of producing.” In Paris, the great demand for vaudevilles and novels brought about the organization of work for their production, organization which at any rate yields something better than its “unique” competitors in Germany.

These days, you could easily imagine Marx and Engels making a similar case about film, by arguing that the products of collaboration in Hollywood have often been more interesting, or at least more entertaining, than movies made by artists working outside the system. And they might be right.

The analogy to movies and television seems especially appropriate in the case of Patterson, who has often drawn such comparisons himself, as he once did to The Guardian: “There is a lot to be said for collaboration, and it should be seen as just another way to do things, as it is in other forms of writing, such as for television, where it is standard practice.” Fuller and O’Sullivan compare Patterson’s brand to that of Alfred Hitchcock, whose name was attached to everything from Dell anthologies to The Three Investigators to Alfred Hitchcock’s Mystery Magazine. It’s a good parallel, but an even better one might be hiding in plain sight. In her recent profile of the television producer Ryan Murphy, Emily Nussbaum evokes an ability to repackage the ideas of others that puts even Patterson to shame:

Murphy is also a collector, with an eye for the timeliest idea, the best story to option. Many of his shows originate as a spec script or as some other source material. (Murphy owned the rights to the memoir Orange Is the New Black before Jenji Kohan did, if you want to imagine an alternative history of television.) Glee grew out of a script by Ian Brennan; Feud began as a screenplay by Jaffe Cohen and Michael Zam. These scripts then get their DNA radically altered and replicated in Murphy’s lab, retooled with his themes and his knack for idiosyncratic casting.

Murphy’s approach of retooling existing material in his own image might be even smarter than Patterson’s method of writing outlines for others to expand, and he’s going to need it. Two months ago, he signed an unprecedented $300 million contract with Netflix to produce content of all kinds: television shows, movies, documentaries. And another former president was watching. While Bill Clinton was working with Patterson, Barack Obama was finalizing a Netflix deal of his own—and if he needs a collaborator, he doesn’t have far to look.

The Prime of Miss Elizabeth Hoover


Yesterday, as I was working on my post for this blog, I found myself thinking about the first time that I ever heard of Lyme disease, which, naturally, was on The Simpsons. In the episode “Lisa’s Substitute,” which first aired on April 25, 1991, Lisa’s teacher, Miss Hoover, tells the class: “Children, I won’t be staying long. I just came from the doctor, and I have Lyme disease.” As Principal Skinner cheerfully explains: “Lyme disease is spread by small parasites called ‘ticks.’ When a diseased tick attaches itself to you, it begins sucking your blood. Malignant spirochetes infect your bloodstream, eventually spreading to your spinal fluid and on into the brain.” At the end of the second act, however, Miss Hoover unexpectedly returns, and I’ve never forgotten her explanation for her sudden recovery:

Miss Hoover: You see, class, my Lyme disease turned out to be psychosomatic.
Ralph: Does that mean you’re crazy?
Janie: It means she was faking it.
Miss Hoover: No, actually, it was a little of both. Sometimes, when a disease is in all the magazines and on all the news shows, it’s only natural that you think you have it.

And while it might seem excessive to criticize a television episode that first aired over a quarter of a century ago, it’s hard to read these lines after Porochista Khakpour’s memoir Sick without wishing that this particular joke didn’t exist.

In its chronic form, Lyme disease remains controversial, but like chronic fatigue syndrome and fibromyalgia, it’s an important element in the long, complicated history of women having trouble finding doctors who will take their pain seriously. As Lidija Haas writes in The New Yorker:

There’s a class of illnesses—multi-symptomatic, chronic, hard to diagnose—that remain associated with suffering women and disbelieving experts. Lyme disease, symptoms of which can afflict patients years after the initial tick bite, appears to be one…[The musician Kathleen Hanna] describes an experience common to many sufferers from chronic illness—that of being dismissed as an unreliable witness to what is happening inside her. Since no single medical condition, a doctor once told her, could plausibly affect so many different systems—neurological, respiratory, gastrointestinal—she must be having a panic attack…As in so many other areas of American life, women of color often endure the most extreme versions of this problem.

It goes without saying that when “Lisa’s Substitute” was written, there weren’t any women on the writing staff of The Simpsons, although even if there had been, it might not have made a difference. In her recent memoir Just the Funny Parts, Nell Scovell, who worked as a freelance television writer in the early nineties, memorably describes the feeling of walking into the “all-male” Simpsons writing room, which was “welcoming, but also intimidating.” It’s hard to imagine these writers, so many of them undeniably brilliant, thinking twice about making a joke like this—and it’s frankly hard to see them rejecting it now, when it might only lead to attacks from people who, in Matt Groening’s words, “love to pretend they’re offended.”

I’m not saying that there are any subjects that should be excluded from comedic consideration, or that The Simpsons can’t joke about Lyme disease. But as I look back at the classic years of my favorite television show of all time, I’m starting to see a pattern that troubles me, and it goes far beyond Apu. I’m tempted to call it “punching down,” but it’s worse. It’s a tendency to pick what seem at the time like safe targets, and to focus with uncanny precision on comic gray areas that allow for certain forms of transgression. I know that I quoted this statement just a couple of months ago, but I can’t resist repeating what producer Bill Oakley says of Greg Daniels’s pitch about an episode on racism in Springfield:

Do you remember this? Something about Homer and Dr. Hibbert? Well, you pitched it several times and I think we were just…It was some exploration of the concept of race in Springfield, and we just said, you know, we don’t think this is the forum. The Simpsons can’t be the right forum to deal with racism.

He was probably right. But when you look at the few racially charged jokes that the show actually made, the characters involved weren’t black, but quite specifically “brown,” or members of groups that occupy a liminal space in our cultural understanding of race: Apu, Akira, Bumblebee Man. (I know that Akira was technically whiter than anybody else, but you get my drift.) By contrast, the show was very cautious when it came to its black characters. Apart from Dr. Hibbert, who was derived from Bill Cosby, the show’s only recurring black faces were Carl and Officer Lou, the latter of whom is so unmemorable that I had to look him up to make sure that he wasn’t Officer Eddie. And both Carl and Lou were given effectively the same voice by Hank Azaria, the defining feature of which was that it was as nondescript as humanly possible.

I’m not necessarily criticizing the show’s treatment of race, but the unconscious conservatism that carefully avoided potentially controversial areas while lavishing attention on targets that seemed unobjectionable. It’s hard to imagine a version of the show that would have dared to employ such stereotypes, even ironically, on Carl, Lou, or even Judge Snyder, who was so racially undefined that he was occasionally colored as white. (The show’s most transgressive black figures, Drederick Tatum and Lucius Sweet, were so transparently modeled on real people that they barely even qualified as characters. As Homer once said: “You know Lucius Sweet? He’s one of the biggest names in boxing! He’s exactly as rich and as famous as Don King, and he looks just like him, too!” And I’m not even going to talk about “Bleeding Gums” Murphy.) That joke about Miss Hoover is starting to feel much the same way, and if it took two decades for my own sensibilities to catch up with that fact, it’s for the same reasons that we’re finally taking a harder look at Apu. And if I speak as a fan, it isn’t to qualify these thoughts, but to get at the heart of why I feel obliged to write about them at all. We’re all shaped by popular culture, and I can honestly say of The Simpsons, as Jack Kerouac writes in On the Road: “All my actions since then have been dictated automatically to my subconscious by this horrible osmotic experience.” The show’s later seasons are reflexively dismissed as lazy, derivative, and reliant on easy jokes, but we still venerate its golden years. Yet if The Simpsons has gradually degraded under the watch of many of its original writers and producers, this implies that we’re only seeing the logical culmination—or eruption—of something that was there all along, afflicting its viewers years after the original bite. We all believed that The Simpsons, in its prime, was making us smarter. But what if it was just psychosomatic?

Designing the future


Over the last half century or so, our culture has increasingly turned to film and television, rather than to the written word, as our primary reference point when we talk about the future. This is partially because more people are likely to have seen a blockbuster movie than to have read even the most successful novel, but the visual arts might also be more useful when it comes to certain kinds of speculation. As I browsed recently through the book Speculative Everything, I was repeatedly struck by the thought that dealing with physical materials can lead to insights that can’t be reached through words alone. In his classic New Yorker profile of Stanley Kubrick, the science writer Jeremy Bernstein provided a portrait of one such master at work:

In the film [2001], the astronauts will wear space suits when they are working outside their ships, and Kubrick was very anxious that they should look like the space suits of thirty-five years from now…They were studying a vast array of samples of cloth to find one that would look right and photograph well. While this was going on, people were constantly dropping into the office with drawings, models, letters, cables, and various props, such as a model of a lens for one of the telescopes in a spaceship. (Kubrick rejected it because it looked too crude.) At the end of the day, when my head was beginning to spin, someone came by with a wristwatch that the astronauts were going to use on their Jupiter voyage (which Kubrick rejected) and a plastic drinking glass for the moon hotel (which Kubrick thought looked fine).

This is a level of detail that most writers would lack the patience or ability to develop, and even if it were possible, there’s a huge difference between describing such objects at length on the page, which is rightly discouraged, and showing them to the viewer without comment. It can also lead to new ideas or discoveries that can feed into the story itself. I never tire of quoting a piece of advice from Shamus Culhane’s Animation: From Script to Screen, in which he recommends using a list of props to generate plot points and bits of business for a short cartoon:

One good method of developing a story is to make a list of details. For example [for a cartoon about elves as clock cleaners in a cathedral], what architectural features come to mind—steeples, bells, windows, gargoyles? What props would the elves use—brushes, pails, mops, sponges…what else? Keep on compiling lists without stopping to think about them. Let your mind flow effortlessly, and don’t try to be neat or orderly. Scribble as fast as you can until you run out of ideas.

In animation—or in a medium like comics or the graphic novel—this kind of brainstorming requires nothing more than a pencil and piece of paper. Kubrick’s great achievement in 2001 was to spend the same amount of time and attention, as well as considerably more money, on solving design problems in tangible form, and in the process, he set a standard for this kind of speculation that both filmmakers and other artists have done their best to meet ever since.

In Speculative Everything, Anthony Dunne and Fiona Raby suggest that the function of a prop in a movie might limit the range of possibilities that it can explore, since it has “to be legible and support plot development.” But this might also be a hidden strength. I don’t think it’s an accident that Minority Report is both the most influential piece of futurology in recent memory and one of the few science fiction films that manages to construct a truly ingenious mystery. And in another masterpiece from the same period, Children of Men, you can clearly see the prop maker’s pragmatism at work. Dunne and Raby quote the director Alfonso Cuarón, who says in one of the special features on the DVD:

Rule number one in the film was recognizability. We didn’t want to do Blade Runner. Actually, we thought about being the anti-Blade Runner in the sense of how we were approaching reality, and that was kind of difficult for the art department, because I would say, “I don’t want inventiveness. I want reference. Don’t show me the great idea, show me the reference in real life. And more importantly, I would like—as much as possible—references of contemporary iconography that is already engraved in human consciousness.”

Consciously or otherwise, Cuarón is echoing one of my favorite pieces of writing advice from David Mamet, who had exactly one rule when it came to designing props: “You’ve got to be able to recognize it.” And the need to emphasize clarity and readability in unfamiliar contexts can push production designers in directions that they never would have taken otherwise.

Yet there’s also a case to be made for engaging in visual or sculptural thinking for its own sake, which is what makes speculative design such an interesting avenue of exploration. Dunne and Raby focus on more recent examples, but there’s a surprisingly long history of futurology in pictures. (For instance, a series of French postcards dating from the late nineteenth century imagined life a hundred years in the future, which Isaac Asimov discusses in his book Futuredays, and the book and exhibition Yesterday’s Tomorrows collects many other vintage examples of artwork about the future of America.) Some of these efforts lack the discipline that a narrative imposes, but the physical constraints of the materials can lead to a similar kind of ingenuity, and the result is a distinct tradition that draws on a different set of skills than the ones that writers tend to use. But the best solution might be one that combines both words and images at a reasonable cost. The science fiction of the golden age can sometimes seem curiously lacking in visual description—it can be hard to figure out how anything is supposed to look in Asimov’s stories—and such magazines as Astounding leaned hard on their artists to fill in the blanks. And this might have been a reasonable division of labor. The fans don’t seem to have made any distinction between the stories and their illustrations, and both played a crucial role in defining the genre. Movies and television may be our current touchstones for the future, but the literary and visual arts have been conspiring to imagine the world of tomorrow for longer than we tend to remember. As Speculative Everything demonstrates, each medium can come up with remarkable things when allowed to work on its own. But they have even more power when they join forces.

Lessons of darkness


Last night, while browsing through the movies available on Netflix, I stumbled across Werner Herzog’s documentary Little Dieter Needs to Fly. I’d never seen it, so I put it on, and I was immediately entranced—it’s one of the most fascinating films I’ve ever seen. By now, the story is a familiar one, both through Herzog’s initial treatment of the material and his return to it in the movie Rescue Dawn. Dieter Dengler was born in Germany in 1938, fell in love with the idea of flying, emigrated to the United States to join the Air Force, and was shot down on his first bombing run over Laos. After his capture, torture, and imprisonment, he made a bloody escape, survived a barefoot trek through the jungle and downriver, and was rescued six months after his disappearance. Herzog never forgot the news reports, and in the finished film, which consists almost entirely of Dengler recounting his memories to the camera, he sticks mostly to the facts. Occasionally, he indulges in a heightening touch, as in a scene when Dengler arrives at his house in the Bay Area. As Herzog reveals in his wonderful book A Guide for the Perplexed:

When he gets out of his car, Dieter repeatedly opens and closes the car door before walking to the front door, which he again opens and closes. Eventually he goes inside. This is a scene I created…“Open and close your front door a couple of times,” I said, “then talk about the door as a symbol of freedom.” He hesitated and said, “I’ll look weird to my buddies.” What finally convinced him was when I told him how charming the ladies would think it was.

There are a few other staged moments, and most of them draw attention to themselves, as when Dengler delivers a monologue on death while standing before an aquarium tank of glowing jellyfish. For the most part, he seems happy to indulge Herzog, and we only gradually become aware of the reservoir of emotion and endurance behind his air of guilelessness. We never see Herzog, who speaks only in voiceover, but the film slowly reveals itself as a dialogue with a subject for whom the director feels nothing but respect. Herzog has made a point of cultivating his own mythology, and he more than lives up to it in practice, most famously when he was shot while talking to the BBC and insisted on finishing the interview. But he’s the one who really seems obsessed with jails, locks, and doors. In A Guide for the Perplexed, he tells us: “There is nothing wrong with spending a night in a jail cell if it means getting the shot you need.” A few lines later, he follows it with perhaps my favorite piece of advice for all aspiring artists: “Carry bolt cutters everywhere.” We can only imagine his feelings when confronted with Dengler, who, even in civilian life, is the epitome of the competent man. In his youth, he trained as a tool-and-die maker and a blacksmith, rebuilt church clocks, and willed himself into his dream job as a pilot. (Robert A. Heinlein would have loved him.) In the film, he nonchalantly shows us how to make a fire using two tubes of bamboo and how to escape from handcuffs using a paper clip, noting casually that it’s a skill that might come in handy. When Dengler displays the drums of rice and honey that he keeps under the floor of his house, just in case he needs them, you can sense Herzog nodding in agreement behind the camera.

Yet the film is also a remarkable interrogation of the myth of competence, and the ways in which it seems inseparable from luck, good timing, and even destiny. After years of trying to become a pilot, Dengler was shot down forty minutes into his first mission. In his escape from the camp, seven other prisoners got away, and five were never heard from again. The man with whom he fled, Duane W. Martin, was beheaded by a villager, and Dengler only narrowly escaped. A few weeks later, on the verge of death, he was rescued by the purest chance, when an American pilot happened to see a flash of white at the bend in the river. Only an extraordinary personality would have survived at all, but Dengler had been placed in a situation in which training, intelligence, and endurance were necessary, but not sufficient. There are obvious parallels to the American experience in Vietnam, but Herzog resists them, presumably because he doesn’t find them all that interesting. What intrigues him is the idea of competence pressed to its limits, which Dengler was forced to experience, while Herzog has actively courted it for his entire life. In a profile in The New Yorker that I’ve never forgotten, published before the release of Rescue Dawn, Daniel Zalewski quotes Herzog’s first assistant director Josef Lieck:

I have formed this theory that Werner has, probably from midpuberty, been trying very hard to die a grand, poetic death. Whenever there is anything dangerous, you can be sure he’ll run out to do it first. But I think he will have his grand, poetic death in a different way. I think he will live to be a hundred and five. He’ll have tried all his life to get chopped to pieces or fall from a helicopter, and, in the end, he will die on his pillowcase.

It isn’t clear yet how Herzog will die, a prospect that fills me with more dread than that of any other celebrity. But we know a little about how Dengler passed away. In A Guide for the Perplexed, Herzog only says: “[Dieter] died some years ago of Lou Gehrig’s disease, and the first thing the illness took was his power of speech. How scandalous that in his final days he was bereft of words…He died…a few years after Little Dieter Needs to Fly was released, having battled the disease like a warrior.” In fact, he shot himself in front of a fire station, and you can read a lot into Herzog’s unusual reticence. Dengler, a fundamentally gentle man, was repeatedly confronted by the kind of physical and spiritual struggle that Herzog seeks out, and the comparison only makes the director seem more like “a clown,” as he once described himself, particularly in the way in which he drags along his collaborators. (My favorite moment in Zalewski’s profile comes when Herzog dismisses a safety issue in a scene involving Christian Bale, who erupts: “I am not going to feckin’ die for you, Werner!”) It’s been more than two decades since Little Dieter was released, but I’m glad that I saw it only now, at a point in my life when I can better understand Herzog’s awe toward his subject:

What I continue to find wondrous is that Dieter emerged from his experiences without so much as a hint of bitterness; he was forever able to bear the misery with great optimism. Dieter had such an impressive and jubilant attitude to life, able to brush his experiences aside and deal with them, never making a fuss. He has been a role model for me, and even today when I am in a complicated situation I ask myself, “What would Dieter do?”

Written by nevalalee

May 31, 2018 at 9:09 am

The uranium in the wine bottle


In the March 1944 issue of Astounding Science Fiction, readers were treated to the story “Deadline” by Cleve Cartmill, which was set on an alien planet consumed by a war between two factions known as the “Sixa” and the “Seilla.” Its hero was a spy, complete with a prehensile tail, whose mission was to fly into enemy territory and destroy the ultimate weapon before it could be detonated. The story itself was undeniably mediocre, and it would be utterly forgotten today if it weren’t for its description of the weapon in question, an atomic bomb, which Cartmill based almost verbatim on letters from the editor John W. Campbell, who had pitched the idea in the first place. According to the physicist Edward Teller, it was plausible enough to cause “astonishment” at the Manhattan Project, which counted many readers of the magazine among its scientists, and after it was brought to the attention of the Counterintelligence Corps, both Campbell and Cartmill were interviewed to investigate the possibility of a leak. In reality, “Deadline” wasn’t even much of a prediction—Campbell, who was feeling frustrated about his lack of involvement in war research, had a hunch that an atomic bomb was in the works, and he packed the story with technical information that was already in the public domain. He evidently hoped that it would draw official interest that might lead to a real defense role, which failed to materialize. After the war, however, it paid off immensely, and Campbell found himself hailed as a prophet. Cartmill, the credited author, neatly fell out of the picture, and the fact that the story hadn’t predicted much of anything was lost on most readers. Campbell had essentially orchestrated the most famous anecdote of his career, planting “Deadline” in the magazine expressly so that he could point to it later, and across multiple retellings, the details of the ensuing investigation were exaggerated beyond recognition. As the historian Donald Spoto aptly puts it: “[His] calculated image of himself as a prophet does not coincide with the truth; inspired by his sense of publicity, he told a better story than the facts reveal.”

But Spoto isn’t writing about Campbell, but about Alfred Hitchcock, in his classic biography The Dark Side of Genius, and the story here isn’t “Deadline,” but the great romantic thriller Notorious. As legend has it, when Hitchcock had to come up with the MacGuffin, or the plot point that would drive the rest of the movie, he proposed a sample of uranium hidden in a wine bottle by a group of Nazis in Brazil. As he said to François Truffaut in their famous book-length interview:

The producer said, “What in the name of goodness is that?” I said, “This is uranium; it’s the thing they’re going to make an atom bomb with.” And he asked, “What atom bomb?” This, you must remember, was in 1944, a year before Hiroshima. I had only one clue. A writer friend of mine had told me that scientists were working on a secret project someplace in New Mexico. It was so secret that once they went into the plant, they never emerged again. I was also aware that the Germans were conducting experiments with heavy water in Norway. So these clues brought me to the uranium MacGuffin. The producer was skeptical, and he felt it was absurd to use the idea of an atom bomb as the basis for our story. I told him that it wasn’t the basis for the story, but only the MacGuffin, and I explained that there was no need to attach too much importance to it.

In the end, the idea was approved, and Hitchcock and screenwriter Ben Hecht allegedly went to Pasadena to get background information from the physicist Robert A. Millikan. According to Hitchcock, Millikan responded: “You want to have yourselves arrested and have me arrested as well?” After this outburst, Millikan informed them—in something of a non sequitur—that the idea was impossible anyway, although others evidently felt that they had come too close for comfort. As Hitchcock confided to Truffaut: “I learned later that afterward the FBI had me under surveillance for three months.”

Like many movie buffs, I accepted this story without question for years, but when you encounter it after the “Deadline” incident, it starts to seem too good to be true, which it was. As Spoto writes in The Dark Side of Genius: “The business of the uranium remained a considerable source of publicity for Hitchcock to the end of his life. To François Truffaut, to this writer, and to many others, he always insisted that he had chosen the device of uranium ore in Nazi experiments quite coincidentally, far in advance of the detonation of the atomic bomb in Japan in August 1945…He always emphasized, in every discussion of Notorious, that he was virtually a prophet.” The truth, Spoto continues, was very different:

By the time Notorious actually began filming, in October 1945, Hitchcock had made yet another trip to London…and he had returned to Los Angeles for final script work in September—after the bombings of Japan, and after he had spent several weeks in New York testing actors, among whom were several famous German refugees he finally cast in the film. On the basis of news from these German contacts, and from the accounts that flooded the world press…Hitchcock and Hecht refined the last addenda to their script just before the first day of production…All the evidence suggests that in truth the uranium was included after the fact.

As for the allegation of government surveillance, it was evidently based on a general directive from the FBI that the producer David O. Selznick received in May, which cautioned that any movie that featured American intelligence would have to be cleared by the State Department. Like Campbell, Hitchcock liked to make people think that he had been given special attention, and over the years, in both cases, the stories only grew.

There are obvious similarities between these two incidents, as well as equally noteworthy differences. With “Deadline,” the description of the bomb is the story’s sole reason for existing, while Notorious would still be a masterpiece even if the MacGuffin had been something else entirely. (As Hitchcock allegedly told his producer: “Look, if you don’t like uranium, let’s make it industrial diamonds, which the Germans need to cut their tools with.” He claimed to have later told a movie executive who had objected to the screenplay on grounds of its implausibility: “You were wrong to attach any importance to the MacGuffin. Notorious was simply the story of a man in love with a girl who, in the course of her official duties, had to go to bed with another man and even had to marry him. That’s the story.” And even if he invented the conversation, his point still stands.) The other difference is the use to which each anecdote was put. For Hitchcock, the uranium incident, and the reputation that it gave him as a “prophet,” was just another way of burnishing his image, and although he enjoyed dining out on it, it was a minor part of his legend. Campbell, by contrast, used it as the basis for his entire postwar career. Just two weeks after Hiroshima, The New Yorker profiled him in a Talk of the Town piece titled “1945 Cassandra,” in which it credulously wrote:

If you want to keep up with, or possibly stay ahead of, the development of secret weapons in time of war, you had better…go to the pulps, preferably Astounding. One reason is that Astounding, which has for the past ten years or so been predicting atomic bombs and using them to liven up its stories, has been permitted to duck some of the security rules that made high-echelon government officials such halting conversationalists in recent months.

And that reputation hinged largely on the myth of “Deadline” and its creation. It bought Campbell tremendous credibility after the war, earned or otherwise, and it played a significant role in science fiction’s big push into the mainstream. Eventually, the editor would stake—and lose—all of that goodwill on dianetics. But for a few years, Campbell, like Hitchcock, got to play his audience like a piano, and both men liked to pretend that they had once been notorious.

A season of disenchantment


A few days ago, Matt Groening announced that his new animated series, Disenchantment, will premiere in August on Netflix. Under other circumstances, I might have been pleased by the prospect of another show from the creator of The Simpsons and Futurama—not to mention producers Bill Oakley and Josh Weinstein—and I expect that I’ll probably watch it. At the moment, however, it’s hard for me to think about Groening at all without recalling his recent reaction to the long overdue conversation around the character of Apu. When Bill Keveney of USA Today asked earlier this month if he had any thoughts on the subject, Groening replied: “Not really. I’m proud of what we do on the show. And I think it’s a time in our culture where people love to pretend they’re offended.” It was a profoundly disappointing statement, particularly after Hank Azaria himself had expressed his willingness to step aside from the role, and it was all the more disillusioning coming from a man whose work has been a part of my life for as long as I can remember. As I noted in my earlier post, the show’s unfeeling response to this issue is painful because it contradicts everything that The Simpsons was once supposed to represent. It was the smartest show on television; it was simply right about everything; it offered its fans an entire metaphorical language. And as the passage of time reveals that it suffered from its own set of blinders, it doesn’t just cast doubt on the series and its creators, but on the viewers, like me, who used it for so long as an intellectual benchmark.

And it’s still an inescapable part of my personal lexicon. Last year, for instance, when Elon Musk defended his decision to serve on Trump’s economic advisory council, I thought immediately of what Homer says to Marge in “Whacking Day”: “Maybe if I’m part of that mob, I can help steer it in wise directions.” Yet it turns out that I might have been too quick to give Musk—who, revealingly, was the subject of an adulatory episode of The Simpsons—the benefit of the doubt. A few months later, in response to reports of discrimination at Tesla, he wrote an email to employees that included this remarkable paragraph:

If someone is a jerk to you, but sincerely apologizes, it is important to be thick-skinned and accept that apology. If you are part of a lesser represented group, you don’t get a free pass on being a jerk yourself. We have had a few cases at Tesla were someone in a less represented group was actually given a job or promoted over more qualified highly represented candidates and then decided to sue Tesla for millions of dollars because they felt they weren’t promoted enough. That is obviously not cool.

The last two lines, which were a clear reference to the case of A.J. Vandermeyden, tell us more about Musk’s idea of a “sincere apology” than he probably intended. And when Musk responded this week to criticism of Tesla’s safety and labor practices by accusing the nonprofit Center for Investigative Reporting of bias and proposing a site where users could provide a “credibility score” for individual journalists, he sounded a lot like the president whose circle of advisers he only reluctantly left.

Musk, who benefited from years of uncritical coverage from people who will forgive anything as long as you talk about space travel, seems genuinely wounded by any form of criticism or scrutiny, and he lashes out just as Trump does—by questioning the motives of ordinary reporters or sources, whom he accuses of being in the pocket of unions or oil companies. Yet he’s also right to be worried. We’re living in a time when public figures and institutions are going to be judged by their responses to questions that they would rather avoid, which isn’t likely to change. And the media itself is hardly exempt. For the last two weeks, I’ve been waiting for The New Yorker to respond to stories about the actions of two of its most prominent contributors, Junot Díaz and the late David Foster Wallace. I’m not even sure what I want the magazine to do, exactly, except make an honest effort to grapple with the situation, and maybe even offer a valuable perspective, which is why I read it in the first place. (In all honesty, it fills much the same role in my life these days as The Simpsons did in my teens. As Norman Mailer wrote back in the sixties: “Hundreds of thousands, perhaps millions of people in the most established parts of the middle class kill their quickest impulses before they dare to act in such a way as to look ridiculous to the private eye of their taste whose style has been keyed by the eye of The New Yorker.”) As the days passed without any comment, I assumed that it was figuring out how to tackle an admittedly uncomfortable topic, and I didn’t expect it to rush. Now that we’ve reached the end of the month without any public engagement at all, however, I can only conclude that it’s deliberately ignoring the matter in hopes that it will go away. I hope that I’m wrong. But so far, it’s a discouraging omission from a magazine whose stories on Harvey Weinstein and Eric Schneiderman implicitly put it at the head of an entire movement.

The New Yorker has evidently discovered that it’s harder to take such stands when they affect people whom we know or care about—which only means that it can get in line. Our historical moment has forced some of our smartest individuals and organizations to learn how to take criticism as well as to give it, and it’s often those whose observations about others have been the sharpest who turn out to be singularly incapable, as Clarice Starling once put it, when it comes to pointing that high-powered perception at themselves. (In this list, which is constantly being updated, I include Groening, Musk, The New Yorker, and about half the cast of Arrested Development.) But I can sympathize with their predicament, because I feel it nearly every day. My opinion of Musk has always been rather mixed, but nothing can dislodge the affection and gratitude that I feel toward the first eight seasons of The Simpsons, and I expect to approvingly link to an article in The New Yorker this time next week. But if our disenchantment forces us to question the icons whose influence is fundamental to our conception of ourselves, then perhaps it will have been worth the pain. Separating our affection for the product from those who produced it is a problem that we all have to confront, and it isn’t going to get any easier. As I was thinking about this post yesterday, the news broke that Morgan Freeman had been accused by multiple women of inappropriate behavior. In response, he issued a statement that read in part: “I apologize to anyone who felt uncomfortable or disrespected.” It reminded me a little of another man who once grudgingly said of some remarks that were caught on tape: “I apologize if anyone was offended.” But it sounds a lot better when you imagine it in Morgan Freeman’s voice.

Written by nevalalee

May 25, 2018 at 9:21 am
