Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘The New Yorker’

Zen in America

In the latest issue of The New Yorker, Adam Gopnik discusses the recent books Why Buddhism Is True by Robert Wright and After Buddhism by Stephen Batchelor, the latter of which he calls “in many ways the most intellectually stimulating book on Buddhism of the past few years.” As with most of the articles under the magazine’s departmental heading A Critic at Large, it’s less a review than an excuse to explore the subject in general, and Gopnik ends up delivering a gentle pitch for Buddhism as a secular philosophy of life. He writes:

As for the mind’s modules [Batchelor writes], “Gotama [Buddha] is interested in what people can do, not with what they are…We may have no control over the rush of fear prompted by finding a snake under our bed, but we do have the ability to respond to the situation in a way that is not determined by that fear.” Where Wright insists that the Buddhist doctrine of not-self precludes the possibility of freely chosen agency, Batchelor insists of Buddhism that “as soon as we consider it a task-based ethics…such objections vanish. The only thing that matters is whether or not you can perform a task.”

This idea was enormously appealing to certain Americans of the nineteenth century, as Gopnik notes earlier on: “These American Buddhists, drawn East in part by a rejection of Gilded Age ostentation, recognized a set of preoccupations like those they knew already—Whitman’s vision of a self that could shift and contain multitudes, or Thoreau’s secular withdrawal from the race of life…The quietist impulse in New England spirituality and the pantheistic impulse in American poetry both seemed met, and made picturesque, by the Buddhist tradition.”

I find something especially revealing in the way that such adherents hoped to combat certain stereotypically American tendencies, such as material striving and competition, with the equally American notion of a “task-based ethics.” The promise of turning one’s preference for concrete action—or rules of behavior—from a weakness into a strength is very attractive to someone like me, and I’ve always liked R.H. Blyth’s memorable description of the training of a Zen novitiate:

I remember when I began to attend lectures, at a Zen temple…I was surprised to find that there were no lofty spiritual truths enunciated at all. Two things stuck in my head, because they were repeated so often, and with such gusto. One of them, emphasized with extreme vigor, was that you must not smoke a cigarette while making water. The other was that when somebody calls you (in Japanese, “Oi!”) you must answer (“Hai!”) at once, without hesitation. When we compare this to the usual Christian exhortatory sermon, we cannot help being struck by the difference.

But I’ve also learned to be cautious about appropriating it for myself. I’ve been interested in Zen for a long time, particularly since discovering Blyth’s wonderful books Zen in English Literature and Oriental Classics and Haiku, but I don’t feel comfortable identifying with it. For one thing, it’s a complicated subject that I’ve never entirely gotten my head around, and I don’t follow its practice in fundamental ways. (I don’t meditate, for instance, although reading Gopnik’s article makes me think that I probably should.) My understanding of it is mediated through such Western interpreters as Blyth, and I feel less than qualified to talk about it for much the same reason that Robert Pirsig gives in his disclaimer to Zen and the Art of Motorcycle Maintenance: “What follows…should in no way be associated with that great body of factual information relating to orthodox Zen Buddhist practice. It’s not very factual on motorcycles, either.”

And my understanding of Zen can best be described as being not very factual on motorcycles. What I like about it is what Stewart Brand, speaking on the related issue of voluntary poverty, once called “the sheer practicality of the exercise,” and I’ve taken as much of its advice to heart as I can. It feels like common sense. The trouble, obviously, is that this extracts a tiny sliver of meaning from a vast spiritual tradition that most Westerners haven’t studied firsthand, and its cultural distance makes it easy for us to abstract whatever we want from it. As Gopnik points out:

[Batchelor’s] project is unashamedly to secularize Buddhism. But, since it’s Buddhism that he wants to secularize, he has to be able to show that its traditions are not hopelessly polluted with superstition…Batchelor, like every intelligent believer caught in an unsustainable belief, engages in a familiar set of moves. He attempts to italicize his way out of absurdity by, in effect, shifting the stresses in the simple sentence “We don’t believe that.” First, there’s “We don’t believe that”…Next comes “We don’t believe that”…In the end, we resort to “We don’t believe that”: we just accept it as an embedded metaphor of the culture that made the religion.

And Buddhism is probably more conducive to this “set of moves” by Americans than, say, Christianity, simply because it feels exotic and unfamiliar. If you look at the picture of Jesus that emerges from a study like The Five Gospels, you end up with a religious ethic that has important differences from Buddhism, but which also shares deep affinities in how it positions the self against the world. Yet it’s so tangled up with its history in America that secular types are unlikely to embrace it as a label.

Of course, this scavenging of traditions for spare parts is quintessentially American as well, and it comes out of an understandable impulse to correct our worst tendencies. In all honesty, I’m one of the least “Zen” people I know—I’m ambitious, competitive, and deeply invested in measuring myself against worldly standards of success, all of which I intend to renounce as soon as I’ve proven that I can win in all the conventional ways. It’s all very American of me. Yet it would be equally true to say that I’m drawn to the doctrine of nonattachment because I need it, and because it serves as a corrective to ingrained personality traits that would otherwise make me miserable. I’m not alone, either. Gopnik refers briefly to the San Francisco Zen Center and “its attempted marriage of spiritual elevation with wild entrepreneurial activity,” and the one thing that the most famous exponents of Zen had in common is that they were all hugely ambitious, as well as highly systematic in the way that they pursued their goals. You don’t become the spokesman for an entire religious tradition by accident, and I suspect that their ambition usually came first, and their lifelong effort to come to terms with it was channeled, not into withdrawal, but into a more active engagement with the world. This might seem contradictory, but we’re also simply more likely to talk about Blyth, Pirsig, D.T. Suzuki, Alan Watts, and the rest than the nameless monks who did the sensible thing and entered a life of quiet meditation. It skews our picture of Zen a bit, in particular by presenting it in intellectual terms that have more to do with the needs of writing a book than the inner experience of a principled adept, but not to an extent that seems impossible to overcome. It may well be, as Gopnik notes, that “secular Buddhism ends up being…secularism.” But even if we arrive there in a roundabout way, or call it by different names, it still amounts to the best set of tools that we have for survival in America, or just about anywhere else.

From Venice to Yale

In a recent issue of The New Yorker, the scholar Stephen Greenblatt offers an insightful consideration of a Shakespearean comedy toward which he—like most of us—can hardly help having mixed feelings: “There is something very strange about experiencing The Merchant of Venice when you are somehow imaginatively implicated in the character and actions of its villain.” After recalling his uncomfortable experience as a Jewish undergraduate at Yale in the sixties, Greenblatt provides a beautiful summation of the pragmatic solution at which he arrived:

I wouldn’t attempt to hide my otherness and pass for what I was not. I wouldn’t turn away from works that caused me pain as well as pleasure. Instead, insofar as I could, I would pore over the whole vast, messy enterprise of culture as if it were my birthright…I was eager to expand my horizons, not to retreat into a defensive crouch. Prowling the stacks of Yale’s vast library, I sometimes felt giddy with excitement. I had a right to all of it, or, at least, to as much of it as I could seize and chew upon. And the same was true of everyone else.

Greenblatt, of course, went on to become one of our most valuable literary critics, and his evaluation of The Merchant of Venice is among the best I’ve seen: “If Shylock had behaved himself and remained a mere comic foil…there would have been no disturbance. But Shakespeare conferred too much energy on his Jewish usurer for the boundaries of native and alien, us and them, to remain intact…He did so not by creating a lovable alien—his Jew is a villain who connives at legal murder—but by giving Shylock more theatrical vitality, quite simply more urgent, compelling life, than anyone else in his world has.”

I’ve spent more time thinking about The Merchant of Venice than all but a handful of Shakespeare’s plays, precisely because of the “excess of life” that Greenblatt sees in Shylock, which is at its most impressive in a context where it has no business existing at all. Elsewhere, I’ve argued that Shakespeare’s genius is most visible when you compare him to his sources, which he transforms so completely that it destroys the notion that he was an opportunist who simply borrowed most of his plots. The Merchant of Venice is unique because its models are somehow right there on stage, existing simultaneously with the text, since we can hardly watch it and be unaware of the contrast between the antisemitic caricature of the original and Shylock’s uncanny power. Harold Bloom captures this quality in an extraordinary passage from Shakespeare: The Invention of the Human:

I have never seen The Merchant of Venice staged with Shylock as comic villain, but that is certainly how the play should be performed…If I were a director, I would instruct my Shylock to act like a hallucinatory bogeyman, a walking nightmare flamboyant with a big false nose and a bright red wig, that is to say, to look like Marlowe’s Barabas. We can imagine the surrealistic effect of such a figure when he begins to speak with the nervous intensity, the realistic energy of Shylock, who is so much of a personality as to at least rival his handful of lively precursors in Shakespeare: Faulconbridge the Bastard in King John, Mercutio and the Nurse in Romeo and Juliet, and Bottom the Weaver in A Midsummer Night’s Dream. But these characters all fit their roles, even if we can conceive of them as personalities outside of their plays. Shylock simply does not fit his role; he is the wrong Jew in the right play.

On some level, Shylock is a darker miracle of characterization than even Hamlet or Lear, because so much of his impact seems involuntary, even counterproductive. Shakespeare had no particular reason to make him into anything more than a stock villain, and in fact, his vividness actively detracts from the logic of the story itself, as Greenblatt notes: “Shylock came perilously close to wrecking the comic structure of the play, a structure that Shakespeare only barely rescued by making the moneylender disappear for good at the end of the fourth act.” Bloom, in turn, speaks of “the gap between the human that Shakespeare invents and the role that as playmaker he condemns Shylock to act,” a cognitive divide that tells us more about his art than the plays in which every part has been revised to fit like magic. I often learn more about craft from works of art that I resist than ones with which I agree completely, which only makes sense. When we want to believe in a story’s message, we’re less likely to scrutinize its methods, and we may even forgive lapses of taste or skill because we want to give it the benefit of the doubt. (This is the real reason why aspiring authors should avoid making overt political statements in a story, which encourages friendly critics to read the result more generously than it deserves. It’s gratifying in the moment, but it also can lead to faults going unaddressed until it’s too late to fix them.) Its opposite number is a work of art that we’d love to dismiss on moral or intellectual grounds, but which refuses to let us go. Since we have no imaginable reason to grant it a free pass, its craft stands out all the more clearly. The Merchant of Venice is the ultimate example. It’s the first play that I’d use to illustrate Shakespeare’s gift at creating characters who can seem more real to us than ourselves—which doesn’t make it any easier to read, teach, or perform.

This brings us back to the figure of Greenblatt at Yale, who saw the works that pained him as an essential part of his education. He writes:

I’m now an English professor at Harvard, and in recent years some of my students have seemed acutely anxious when they are asked to confront the crueler strains of our cultural legacy. In my own life, that reflex would have meant closing many of the books I found most fascinating, or succumbing to the general melancholy of my parents. They could not look out at a broad meadow from the windows of our car without sighing and talking about the number of European Jews who could have been saved from annihilation and settled in that very space. (For my parents, meadows should have come with what we now call “trigger warnings.”) I was eager to expand my horizons, not to retreat into a defensive crouch.

The question of how students should confront the problematic works of the past is one that I don’t expect to resolve here, except by noting that The Merchant of Venice represents a crucial data point. Without it, our picture of Shakespeare—and even of his greatness as a writer—is necessarily incomplete. When it comes to matters of education, it helps to keep a few simple tests in mind, and the humanities have an obligation to enable the possibility of this kind of confrontation, while also providing the framework within which it can be processed. Instead of working forward from a set of abstract principles, perhaps we should work backward from the desired result, which is to have the tools that we need when we reach the end of the labyrinth and find Shylock waiting for us. Even if we aren’t ready for him, we may not have a choice. As Bloom observes: “It would have been better for the Jews, if not for most of The Merchant of Venice’s audiences, had Shylock been a character less conspicuously alive.”

Written by nevalalee

July 18, 2017 at 8:49 am

The act of cutting

In a recent article in The New Yorker on Ernest Hemingway, Adam Gopnik evocatively writes: “The heart of his style was not abbreviation but amputation; not simplicity but mystery.” He explains:

Again and again, he creates his effects by striking out what would seem to be essential material. In “Big Two-Hearted River,” Nick’s complicated European experience—or the way that fishing is sanity-preserving for Nick, the damaged veteran—is conveyed clearly in the first version, and left apparent only as implication in the published second version. In a draft of the heartbreaking early story “Hills Like White Elephants,” about a man talking his girlfriend into having an abortion, Hemingway twice uses the words “three of us.” This is the woman’s essential desire, to become three rather than two. But Hemingway strikes both instances from the finished story, so the key image remains as ghostly subtext within the sentences. We feel the missing “three,” but we don’t read it.

Gopnik concludes: “The art comes from scissoring out his natural garrulousness, and the mystery is made by what was elided. Reading through draft and then finished story, one is repeatedly stunned by the meticulous rightness of his elisions.” Following Hemingway’s own lead, Gopnik compares his practice to that of Cézanne, but it’s also reminiscent of Shakespeare, who frequently omits key information from his source material while leaving the other elements intact. Ambiguity, as I’ve noted here before, emerges from a network of specifics with one crucial piece removed.

Over the last two weeks, I’ve been ruthlessly cutting the first draft of my book, leaving me highly conscious of the effects that can come out of compression. In his fascinating notebooks, which I quoted here yesterday, Samuel Butler writes: “I have always found compressing, cutting out, and tersifying a passage suggests more than anything else does. Things pruned off in this way are like the heads of the hydra, two grow for every one that is lopped off.” This squares with my experience, and it reflects how so much of good writing depends on juxtaposition. By cutting, you’re bringing the remaining pieces closer together, which allows them to resonate. Butler then makes a very interesting point:

If a writer will go on the principle of stopping everywhere and anywhere to put down his notes, as the true painter will stop anywhere and everywhere to sketch, he will be able to cut down his works liberally. He will become prodigal not of writing—any fool can be this—but of omission. You become brief because you have more things to say than time to say them in. One of the chief arts is that of knowing what to neglect and the more talk increases the more necessary does this art become.

I love this passage because it reveals how two of my favorite activities—taking notes and cutting—are secretly the same thing. On some level, writing is about keeping the good stuff and removing as much of the rest as possible. The best ideas are likely to occur spontaneously when you’re doing something unrelated, which is why you need to write them down as soon as they come to you. When you’re sitting at your desk, you have little choice but to write mechanically in hopes that something good will happen. And in the act of cutting, the two converge.

Cutting can be a creative act in itself, which is why you sometimes need to force yourself to do it, even when you’d rather not. You occasionally see a distinction drawn between the additive and subtractive arts, but any given work usually partakes of both at various stages, which confer different benefits. In Behind the Seen, Charles Koppelman says of editing a movie in postproduction:

The orientation over the last six months has been one of accumulation, a building-up of material. Now the engines are suddenly thrown into full reverse. The enterprise will head in the opposite direction, shedding material as expeditiously as possible.

We shouldn’t disregard how challenging that mental switch can be. It’s why an editor like Walter Murch rarely visits the set, which allows him to maintain a kind of Apollonian detachment from the Dionysian process of filmmaking: he doesn’t want to be dissuaded from the need to cut a scene by the knowledge of how hard it was to make it. Writers and other artists working alone don’t have that luxury, and it can be difficult to work yourself up to the point where you’re ready to cut a section that took a long time to write. Time creates its own sort of psychological distance, which is why you’re often advised to put aside the draft for a few weeks, or even longer, before starting to revise it. (Zadie Smith writes deflatingly: “A year or more is ideal—but even three months will do.”) That isn’t always possible, and sometimes the best compromise is to work briefly on another project, like a short story. A change is as good as a rest, and in this case, you’re trying to transform into your future self as soon as possible, which will allow you to perform clinical surgery on the past.

The result is a lot like the old joke: you start with a block of marble, and you cut away everything that doesn’t look like an elephant. When I began to trim my manuscript, I set myself the slightly arbitrary goal of reducing it, at this stage, by thirty percent, guided by the editing rule that I mentioned here a month ago:

Murch also has his eye on what he calls the “thirty percent factor”—a rule of thumb he developed that deals with the relationship between the length of the film and the “core content” of the story. In general, thirty percent of a first assembly can be trimmed away without affecting the essential features of the script: all characters, action, story beats will be preserved and probably, like a good stew, enhanced by the reduction in bulk. But passing beyond the thirty percent barrier can usually be accomplished only by major structural alterations: the reduction or elimination of a character, or whole sequences—removing vital organs rather than trimming fat.

There’s no particular reason why the same percentage should hold for a book as well as a film, but I’ve found that it’s about right. (It also applies to other fields, like consumer electronics.) Really, though, it could have been just about any number, as long as it gave me a clear numerical goal at which to aim, and as long as it hurt a little. It’s sort of like physical exercise. If you want to lose weight, the best way is to eat less, and if you want to write a short book, ideally, you’d avoid writing too much in the first place. But the act of cutting, like exercise, has rewards of its own. As Elie Wiesel famously said: “There is a difference between a book of two hundred pages from the very beginning, and a book of two hundred pages which is the result of an original eight hundred pages. The six hundred pages are there. Only you don’t see them.” And the best indication that you’re on the right track is when it becomes physically painful. As Hemingway writes in A Farewell to Arms: “The world breaks everyone and afterward many are strong at the broken places.” That’s also true of books.

Written by nevalalee

June 29, 2017 at 8:38 am

We lost it at the movies

Over a decade ago, the New Yorker film critic David Denby published a memoir titled American Sucker. I read it when it first came out, and I honestly can’t remember much about it, but there’s one section that has stuck in my mind ever since. Denby is writing of his obsession with investing, which has caused him to lose much of what he once loved about life, and he concludes sadly:

Well, you can’t get back to that. Do your job, then. After much starting and stopping, and considerable shifting of clauses, all the while watching the Nasdaq run above 5,000 on the CNNfn website, I put together the following as the opening of a review.

It happens to be his piece on Steven Soderbergh’s Erin Brockovich, which begins like this:

In Erin Brockovich, Julia Roberts appears in scene after scene wearing halter tops with a bit of bra showing; there’s a good bit of leg showing, too, often while she’s holding an infant on one arm. This upbeat, inspirational melodrama, based on a true story and written by Susannah Grant and directed by Steven Soderbergh, has been brought to life by a movie star on a heavenly rampage. Roberts swings into rooms, ablaze with indignation, her breasts pushed up and bulging out of the skimpy tops, and she rants at the people gaping at her. She’s a mother and a moral heroine who dresses like trailer trash but then snaps at anyone who doesn’t take her seriously—a real babe in arms, who gets to protect the weak and tell off the powerful while never turning her back on what she is.

Denby stops to evaluate his work: “Nothing great, but not bad either. I was reasonably happy with it as a lead—it moves, it’s active, it conveys a little of my pleasure in the picture. I got up and walked around the outer perimeter of the twentieth floor, looking west, looking east.”

I’ve never forgotten this passage, in part because it represents one of the few instances in which a prominent film critic has pulled back the curtain on an obvious but rarely acknowledged fact—that criticism is a genre of writing in itself, and that the phrases with which a movie is praised, analyzed, or dismissed are subject to the same sort of tinkering, revision, and doubt that we associate with other forms of expression. Critics are only human, even if they sometimes try to pretend that they aren’t, as they present their opinions as the product of an unruffled sensibility. I found myself thinking of this again as I followed the recent furor over David Edelstein’s review of Wonder Woman in New York magazine, which starts as follows:

The only grace note in the generally clunky Wonder Woman is its star, the five-foot-ten-inch Israeli actress and model Gal Gadot, who is somehow the perfect blend of superbabe-in-the-woods innocence and mouthiness. She plays Diana, the daughter of the Amazon queen Hippolyta (Connie Nielsen) and a trained warrior. But she’s also a militant peacenik. Diana lives with Amazon women on a mystically shrouded island but she’s not Amazonian herself. She was, we’re told, sculpted by her mother from clay and brought to life by Zeus. (I’d like to have seen that.)

Edelstein was roundly attacked for what was perceived as the sexist tone of his review, which also includes such observations as “Israeli women are a breed unto themselves, which I say with both admiration and trepidation,” and “Fans might be disappointed that there’s no trace of the comic’s well-documented S&M kinkiness.” He responded with a private Facebook post, widely circulated, in which he wrote: “Right now I think the problem is that some people can’t read.” And he has since written a longer, more apologetic piece in which he tries to explain his choice of words.

I haven’t seen Wonder Woman, although I’m looking forward to it, so I won’t wade too far into the controversy itself. But when I look at these two reviews—which, significantly, are about films focusing on different sorts of heroines—I see some striking parallels. It isn’t just the echo of “a real babe in arms” with “superbabe-in-the-woods,” or how Brockovich “gets to protect the weak and tell off the powerful” while Diana is praised for her “mouthiness.” It’s something in the rhythm of their openings, which start at a full sprint with a consideration of a movie star’s appearance. As Denby says, “it moves, it’s active,” almost to a fault. Here are three additional examples, taken at random from the first paragraphs of reviews published in The New Yorker:

Gene Wilder stares at the world with nearsighted, pale-blue-eyed wonder; he was born with a comic’s flyblown wig and the look of a reddish creature from outer space. His features aren’t distinct; his personality lacks definition. His whole appearance is so fuzzy and weak he’s like mist on the lens.

There is a thick, raw sensuality that some adolescents have which seems almost preconscious. In Saturday Night Fever, John Travolta has this rawness to such a degree that he seems naturally exaggerated: an Expressionist painter’s view of a young role. As Tony, a nineteen-year-old Italian Catholic who works selling paint in a hardware store in Brooklyn’s Bay Ridge, he wears his heavy black hair brushed up in a blower-dried pompadour. His large, wide mouth stretches across his narrow face, and his eyes—small slits, close together—are, unexpectedly, glintingly blue and panicky.

As Jake La Motta, the former middleweight boxing champ, in Raging Bull, Robert De Niro wears scar tissue and a big, bent nose that deform his face. It’s a miracle that he didn’t grow them—he grew everything else. He developed a thick-muscled neck and a fighter’s body, and for the scenes of the broken, drunken La Motta he put on so much weight that he seems to have sunk in the fat with hardly a trace of himself left.

All of these reviews were written, of course, by Pauline Kael, who remains the movie critic who has inspired the greatest degree of imitation among her followers. And when you go back and read Denby and Edelstein’s openings, they feel like Kael impersonations, which is the mode on which a critic tends to fall back when he or she wants to start a review so that “it moves, it’s active.” Beginning with a description of the star, delivered in her trademark hyperaware, slightly hyperbolic style, was one of Kael’s stock devices, as if she were observing an animal seen in the wild and frantically jotting down her impressions before they faded. It’s a technical trick, but it’s a good one, and it isn’t surprising that Kael’s followers like to employ it, consciously or otherwise. It’s when a male critic uses it to describe the appearance of a woman that we run into trouble. (The real offender here isn’t Denby or Edelstein, but Anthony Lane, Kael’s successor at The New Yorker, whose reviews have the curious habit of panning a movie for a page and a half, and then pausing a third of the way from the end to rhapsodize about the appearance of a starlet in a supporting role, which is presented as its only saving grace. He often seems to be leering at her a little, which is possibly an inadvertent consequence of his literary debt to Kael. When Lane says of Scarlett Johansson, “She seemed to be made from champagne,” he’s echoing the Kael who wrote of Madeline Kahn: “When you look at her, you see a water bed at just the right temperature.”) Kael was a sensualist, and to the critics who came after her, who are overwhelmingly male, she bequeathed a toolbox that is both powerful and susceptible to misuse when utilized reflexively or unthinkingly. I don’t think that Edelstein is necessarily sexist, but he was certainly careless, and in his routine ventriloquism of Kael, which to a professional critic comes as easily as breathing, he temporarily forgot who he was and what movie he was reviewing. Kael was the Wonder Woman of film critics. But when we try to channel her voice, and we can hardly help it, it’s worth remembering—as another superhero famously learned—that with great power comes great responsibility.

Avocado’s number

Earlier this month, you may have noticed a sudden flurry of online discussion around avocado toast. It was inspired by a remark by a property developer named Tim Gurner, who said to the Australian version of 60 Minutes: “When I was trying to buy my first home, I wasn’t buying smashed avocados for nineteen bucks and four coffees at four dollars each.” Gurner’s statement, which was fairly bland and unmemorable in itself, was promptly transformed into the headline “Millionaire to Millennials: Stop Buying Avocado Toast If You Want to Buy a Home.” From there, it became the target of widespread derision, with commentators pointing out that if owning a house seems increasingly out of reach for many young people, it has more to do with rising real estate prices, low wages, and student loans than with their irresponsible financial habits. And the fact that such a forgettable sentiment became the focal point for so much rage—mostly from people who probably didn’t see the original interview—implies that it merely catalyzed a feeling that had been building for some time. Millennials, it’s fair to say, have been getting it from both sides. When they try to be frugal by using paper towels as napkins, they’re accused of destroying the napkin industry, but they’re also scolded over spending too much at brunch. They’re informed that their predicament is their own fault, unless they’re also being idealized as “joyfully engaged in a project of creative destruction,” as Laura Marsh noted last year in The New Republic. “There’s nothing like being told precarity is actually your cool lifestyle choice,” Marsh wrote, unless it’s being told, as the middle class likes to maintain to the poor, that financial stability is only a matter of hard work and a few small sacrifices.

It also reflects an overdue rejection of what used to be called the latte factor, as popularized by the financial writer David Bach in such books as Smart Women Finish Rich. As Helaine Olen writes in Slate:

Bach calculated that eschewing a five-dollar daily bill at Starbucks—because who, after all, really needs anything at Starbucks?—for a double nonfat latte and biscotti with chocolate could net a prospective saver $150 a month, or $2,000 a year. If she then took that money and put it all in stocks that Bach, ever an optimist, assumed would grow at an average annual rate of eleven percent a year, “chances are that by the time she reached sixty-five, she would have more than $2 million sitting in her account.”
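It’s easy enough to check that arithmetic. Here is a minimal sketch in Python (nothing Bach or Olen provides; it simply takes their stated figures of $150 a month at an eleven percent annual return at face value, while the monthly compounding and the time horizons are my own assumptions, since the excerpt never says when the saver starts):

```python
# A back-of-the-envelope check of Bach's "latte factor" math.
# Figures from the Olen excerpt: $150 saved per month at an 11%
# average annual return. Monthly compounding and the horizons
# below are assumptions for illustration.

def future_value(monthly_saving, annual_rate, years):
    """Future value of a fixed monthly contribution (ordinary annuity)."""
    r = annual_rate / 12   # monthly rate
    n = years * 12         # number of monthly contributions
    return monthly_saving * ((1 + r) ** n - 1) / r

for years in (30, 40, 45):
    print(f"{years} years: ${future_value(150, 0.11, years):,.0f}")

# Roughly $420,000 after 30 years, $1.3 million after 40, and the
# promised $2 million only after about 45 years of uninterrupted
# saving at an unbroken 11% return.
```

Swap in a more sober seven percent return and the forty-five-year total drops to under $600,000, which suggests that the headline number owes less to the lattes than to the assumed rate and horizon.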

There are a lot of flaws in this argument. Bach rounds up his numbers, assumes an unrealistic rate of return, and ignores taxes and inflation. Most problematic of all is his core assumption that tiny acts of indulgence are what prevent the average investor from accumulating wealth. In fact, big, unpredictable risk factors and fixed expenses play a much larger role, as Olen points out:

Buying common luxury items wasn’t the issue for most Americans. The problem was the fixed costs, the things that are difficult to cut back on. Housing, health care, and education cost the average family seventy-five percent of their discretionary income in the 2000s. The comparable figure in 1973: fifty percent. Indeed, studies demonstrate that the quickest way to land in bankruptcy court was not by buying the latest Apple computer but through medical expenses, job loss, foreclosure, and divorce.

It turns out that incremental acts of daily discipline are powerless in the face of systemic factors that have a way of erasing all your efforts—and this applies to more than just personal finance. Back when I was trying to write my first novel, I was struck by the idea that if I managed to write just one hundred words every day, I’d have a manuscript in less than three years. I was so taken by this notion that I wrote it down on an index card and stuck it to my bathroom mirror. That was over a decade ago, and while I can’t quite remember how long I stuck with that regimen, it couldn’t have been more than a few weeks. Novels, I discovered, aren’t written a hundred words at a time, at least not in a fashion that can be banked in the literary equivalent of a penny jar. They’re the product of hard work combined with skills that can only be developed after a period of sustained engagement. There’s a lot of trial and error involved, and you can only arrive at a workable system through the kind of experience that comes from addressing issues of craft with maximal attention. Luck and timing also play a role, particularly when it comes to navigating the countless pitfalls that lie between a finished draft and its publication. In finance, we’re inclined to look at a historical return series and attribute it after the fact to genius, rather than to variables that are out of our hands. Similarly, every successful novel creates its own origin story. We naturally underestimate the impact of factors that can’t be credited to individual initiative and discipline. As a motivational tool, there’s a place for this kind of myth. But if novels were written using the literary equivalent of the latte factor, we’d have more novels, just as we’d have more millionaires.

Which isn’t to say that routine doesn’t play a crucial role. My favorite piece of writing advice ever is what David Mamet writes in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

A lot of writing comes down to figuring out what to do on any given morning—but it doesn’t mean doing the same thing each day. Knowing what achievable steps are appropriate at every stage is as important here as it is anywhere else. You can acquire this knowledge as systematically or haphazardly as you like, but you can also do everything right and still fail in the end. (If we define failure as spending years on a novel that will never be published, it’s practically a requirement of the writer’s education.) Books on writing and personal finance continue to take up entire shelves at bookstores, and they can sound very much alike. In “The Writer’s Process,” a recent and unusually funny humor piece in The New Yorker, Hallie Cantor expertly skewers their tone—“I give myself permission to write a clumsy first draft and vigorously edit it later”—and concludes: “Anyway, I guess that’s my process. It’s all about repetition, really—doing the same thing every single day.” We’ve all heard this advice. I’ve been guilty of it myself. But when you don’t take the big picture into account, it’s just a load of smashed avocado.

Brand awareness

Over the last few months, I’ve noticed that Stewart Brand, the founder of the Whole Earth Catalog and one of my personal heroes, has been popping up a lot in the press. In his excellent piece earlier this year in The New Yorker on survival prep among the rich, Evan Osnos called Brand to get a kind of sanity check:

At seventy-seven, living on a tugboat in Sausalito, Brand is less impressed by signs of fragility than by examples of resilience…He sees risks in escapism. As Americans withdraw into smaller circles of experience, we jeopardize the “larger circle of empathy,” he said, the search for solutions to shared problems. “The easy question is, How do I protect me and mine? The more interesting question is, What if civilization actually manages continuity as well as it has managed it for the past few centuries? What do we do if it just keeps on chugging?”

More recently, in an article in the same magazine about the Coachella Festival, John Seabrook wrote: “The short-lived first era of rock festivals began in San Francisco. The incubator was Stewart Brand and Ramon Sender’s three-day Trips Festival, a kind of ‘super acid test,’ in Tom Wolfe’s famed account.” The New York Times Magazine published a piece in March on Brand’s efforts to revive extinct species, and just last week, Real Life featured an essay by Natasha Young on the Long Now Foundation.

So why is Brand back in style? Young’s article offers a tempting clue: “The Long Now’s objective is to support the defense of the future against the finite play of selfish actors.” I don’t think it’s an exaggeration to say that if Donald Trump is the question, Stewart Brand is the answer, although it would be hard to imagine two white males of the same generation—Brand is eight years older than Trump—with less to say to each other. Yet his example is even more damning for those who claim to be following in his footsteps. The historical connections between Silicon Valley and the Catalog have been amply chronicled elsewhere, and much of the language that technology companies use to talk about themselves might have been copied straight from Brand’s work, with its insistence that information and modern tools could improve the lives of individuals and communities. To say that these ideals have been corrupted would be giving his self-appointed successors too much credit. It takes a certain degree of cluelessness to talk about making the world a better place while treating customers as fungible data points and unloading as much risk as possible onto outside parties, but it isn’t even particularly impressive. It’s the kind of evil that comes less out of ruthless efficiency or negative capability than out of short-term expediency, unexamined notions, lousy incentives, and the desperate hope that somebody involved knows what he or she is doing. Brand was a more capable organizer of time, capital, and talent than any of his imitators, and he truly lived the values that he endorsed. His life stands as a rebuke to the rest of us, and it didn’t lead him to a mansion, but to a houseboat in Sausalito.

Brand matters, in other words, not because he was a better person than most of his contemporaries, but because he was vastly more competent. This fact has a way of being lost, even as we rush to honor a man whose like we might never see again. His legacy can be hard to pin down because he’s simply a guy who got it right, quietly and consistently, for four decades, and because it reflects what seems at first like a confusing array of influences. It includes Buckminster Fuller’s futurism and Norbert Wiener’s cybernetics; the psychedelic fringe of Timothy Leary and Ken Kesey, as flavored by mysticism, Jungian psychology, and Zen Buddhism; Native American culture, which led Tom Wolfe to refer to Brand as “an Indian freak”; and the communalist movement of young, mostly affluent urbanites going back to the land in pursuit of greater simplicity. That’s a lot to keep in your head at once. But it’s also what you’d expect from a naturally curious character who spent years exploring whatever he found interesting. My favorite statement by Brand is what he says about voluntary simplicity:

Personally I don’t like the term…I’m more comfortable with the idea of “right livelihood,” which is one of the folds of the Buddhist Eightfold Path to enlightenment. It’s less of an exhortation than an observation—greedy behavior makes a sour life. The idealism of “Voluntary Simplicity” is okay I suppose, but it obscures what I find far more interesting—the sheer practicality of the exercise.

“Sheer practicality” sums up how I like to think about Brand, who lists the rewards of such an existence: “Time to do your work well enough to be proud of it. Time for an occasional original idea and time to follow it. Time for community.”

Take that recipe and extend it across a lifetime, and you end up with a career like Brand’s, which I’ve been contemplating for most of my life. Before I ended up working on my current nonfiction project, I seriously thought about pitching a book on Brand and the Catalog, simply because I thought it would be good for me. As it turns out, I don’t need to write it: John Markoff, the former technology reporter for the New York Times, is working on a biography of Brand, and Caroline Maniaque-Benton and Meredith Gaglio recently edited the anthology Whole Earth Field Guide. I’d be jealous, if I weren’t also grateful. And Brand’s impact can be seen right here every day. Kevin Kelly, Brand’s protégé, once wrote:

[The] missives in the Catalog were blog postings. Except rather than being published individually on home pages, they were handwritten and mailed into the merry band of Whole Earth editors who would typeset them with almost no editing (just the binary editing of print or not-print) and quickly “post” them on cheap newsprint to the millions of readers who tuned in to the Catalog’s publishing stream. No topic was too esoteric, no degree of enthusiasm too ardent, no amateur expertise too uncertified to be included…It is no coincidence that the Whole Earth Catalogs disappeared as soon as the web and blogs arrived. Everything the Whole Earth Catalogs did, the web does better.

Personally, I think that there’s a lot to be said for putting out a version on paper, and Kelly evidently came around to the same conclusion, publishing the lovely tribute Cool Tools. But the basic form of the Catalog—excerpts from worthwhile sources interspersed with commentary—is the one that I’ve tried to follow. This blog is a kind of portrait of myself, and although its emphasis has changed a lot over the years, I’d like to think that it has remained fairly consistent in terms of the perspective that it presents. And I owe it more to Stewart Brand than to anybody else.
