Posts Tagged ‘Adam Gopnik’
Checks and balances
About a third of the way through my upcoming book, while discussing the May 1941 issue of Astounding Science Fiction, I include the sentence: “The issue also featured Heinlein’s ‘Universe,’ which was based on Campbell’s premise about a lost generation starship.” My copy editor amended this to “a lost-generation starship,” to which I replied: “This isn’t a ‘lost-generation’ starship, but a generation starship that happens to be lost.” And the exchange gave me a pretty good idea for a story that I’ll probably never write. (I don’t really have a plot for it yet, but it would be about Hemingway and Fitzgerald on a trip to Alpha Centauri, and it would be called The Double Sun Also Rises.) But it also reminded me of one of the benefits of a copy edit, which is its unparalleled combination of intense scrutiny and total detachment. I sent drafts of the manuscript to some of the world’s greatest nitpickers, who saved me from horrendous mistakes, and the result wouldn’t be nearly as good without their advice. But there’s also something to be said for engaging the services of a diligent reader who doesn’t have any connection to the subject. I deliberately sought out feedback from a few people who weren’t science fiction fans, just to make sure that it remained accessible to a wider audience. And the ultimate example is the copy editor, who is retained to provide an impartial consideration of every semicolon without any preconceived notions outside the text. It’s what Heinlein might have had in mind when he invented the Fair Witness, who said when asked about the color of a nearby house: “It’s white on this side.”
But copy editors are human beings, not machines, and they occasionally get their moment in the spotlight. Recently, their primary platform has been The New Yorker, which has been quietly highlighting the work of its copy editors and fact checkers over the last few years. We can trace this tendency back to Between You & Me, a memoir by Mary Norris that drew overdue attention to the craft of copy editing. In “Holy Writ,” a delightful excerpt in the magazine, Norris writes of the supposed objectivity and rigor of her profession: “The popular image of the copy editor is of someone who favors rigid consistency. I don’t usually think of myself that way. But, when pressed, I do find I have strong views about commas.” And she says of their famous detachment:
There is a fancy word for “going beyond your province”: “ultracrepidate.” So much of copy editing is about not going beyond your province. Anti-ultracrepidationism. Writers might think we’re applying rules and sticking it to their prose in order to make it fit some standard, but just as often we’re backing off, making exceptions, or at least trying to find a balance between doing too much and doing too little. A lot of the decisions you have to make as a copy editor are subjective. For instance, an issue that comes up all the time, whether to use “that” or “which,” depends on what the writer means. It’s interpretive, not mechanical—though the answer often boils down to an implicit understanding of commas.
In order to be truly objective, in other words, you have to be a little subjective. Which is equally true of writing as a whole.
You could say much the same of the fact checker, who resembles the copy editor’s equally obsessive cousin. As a rule, books aren’t fact-checked, which is a point that we only seem to remember when the system breaks down. (Astounding was given a legal read, but I was mostly on my own when it came to everything else, and I’m grateful that some of the most potentially contentious material—about L. Ron Hubbard’s writing career—drew on an earlier article that was brilliantly checked by Matthew Giles of Longreads.) As John McPhee recently wrote of the profession:
Any error is everlasting. As Sara [Lippincott] told the journalism students, once an error gets into print it “will live on and on in libraries carefully catalogued, scrupulously indexed…silicon-chipped, deceiving researcher after researcher down through the ages, all of whom will make new errors on the strength of the original errors, and so on and on into an exponential explosion of errata.” With drawn sword, the fact-checker stands at the near end of this bridge. It is, in part, why the job exists and why, in Sara’s words, a publication will believe in “turning a pack of professional skeptics loose on its own galley proofs.”
McPhee continues: “Book publishers prefer to regard fact-checking as the responsibility of authors, which, contractually, comes down to a simple matter of who doesn’t pay for what. If material that has appeared in a fact-checked magazine reappears in a book, the author is not the only beneficiary of the checker’s work. The book publisher has won a free ticket to factual respectability.” And the absence of fact-checking from the publishing process feels like an odd evolutionary vestige of the book industry that ought to be fixed.
As a result of such tributes, the copy editors and fact checkers of The New Yorker have become cultural icons in themselves, and when an error does make it through, it can be mildly shocking. (Last month, the original version of a review by Adam Gopnik casually stated that Andrew Lloyd Webber was the composer of Chess, and although I knew perfectly well that this was wrong, I had to look it up to make sure that I hadn’t strayed over into a parallel universe.) And their emergence at this particular moment may not be an accident. The first installment of “Holy Writ” appeared on February 23, 2015, just a few months before Donald Trump announced that he was running for president, plunging us all into a world in which good grammar and factual accuracy can seem less like matters of common decency than obstacles to be obliterated. Even though the timing was a coincidence, it’s tempting to read our growing appreciation for these unsung heroes as a statement about the importance of the truth itself. As Alyssa Rosenberg writes in the Washington Post:
It’s not surprising that one of the persistent jokes from the Trump era is the suggestion that we’re living in a bad piece of fiction…Pretending we’re all minor characters in a work of fiction can be a way of distancing ourselves from the seeming horror of our time or emphasizing our own feelings of powerlessness, and pointing to “the writers” often helps us deny any responsibility we may have for Trump, whether as voters or as journalists who covered the election. But whatever else we’re doing when we joke about Trump and the swirl of chaos around him as fiction, we’re expressing a wish that this moment will resolve in a narratively and morally comprehensible fashion.
Perhaps we’re also hoping that reality itself will have a fact checker after all, and that the result will make a difference. We don’t know if it will yet. But I’m hopeful that we’ll survive the exponential explosion of errata.
Going with the flow
On July 13, 1963, New York University welcomed a hundred attendees to an event called the Conference on Education for Creativity in the Sciences. The gathering, which lasted for three days, was inspired by the work of Dr. Myron A. Coler, the director of the school’s Creative Science Program. There isn’t a lot of information available online about Coler, who was trained as an electrical engineer, and the best source I’ve found is an unsigned Talk of the Town piece that ran earlier that week in The New Yorker. It presents Coler as a scholar who was interested in the problem of scientific creativity long before it became fashionable: “What is it, how does it happen, how is it fostered—can it be isolated, measured, nurtured, predicted, directed, and so on…By enhancing it, you produce more from what you have of other resources. The ability to exploit a resource is in itself a resource.” He conducted monthly meetings for years with a select group of scientists, writing down everything that they had to say on the subject, including a lot of wild guesses about how to identify creative or productive people. Here’s my favorite:
One analyst claims that one of the best ways that he knows to test an individual is to take him out to dinner where lobster or crab is served. If the person uses his hands freely and seems to enjoy himself at the meal, he is probably well adjusted. If, on the other hand, he has trouble in eating the crab, he probably will have trouble in his relations with people also.
The conference was overseen by Jerome B. Wiesner, another former electrical engineer, who was appointed by John F. Kennedy to chair the President’s Science Advisory Committee. Wiesner’s interest lay in education, and particularly in identifying and training children who showed an early aptitude for science. In an article that was published a few years later in the journal Daedalus, Wiesner listed some of the attributes that were often seen in such individuals, based on the work of the pioneering clinical psychologist Anne Roe:
A childhood environment in which knowledge and intellectual effort were so highly valued for themselves that an addiction to reading and study was firmly established at an early age; an unusual degree of independence which, among other things, led them to discover early that they could satisfy their curiosity by personal efforts; an early dependence on personal resources, and on the necessity to think for oneself; an intense drive that generated concentrated, persistent, time-ignoring efforts in their studies and work; a secondary-school training that tended to emphasize science rather than the humanities; and high, but not necessarily remarkably high, intelligence.
But Wiesner also closed on a note of caution: “We do not now have useful techniques for predicting with comfortable reliability which individuals will turn out to be creative in the sciences or in any other field, no matter how great an investment we make in their education. Nor does it appear likely that such techniques will be developed in the immediate future.”
As it happened, one of the attendees at the conference was Isaac Asimov, who took the bus down to New York from Boston. Years afterward, he said that he couldn’t remember much about the experience—he was more concerned by the fact that he lost the wad of two hundred dollars that he had brought as emergency cash—and that his contributions to the discussion weren’t taken seriously. When the question came up of how to identify potentially creative individuals at a young age, he said without hesitation: “Keep an eye peeled for science-fiction readers.” No one else paid much attention, but Asimov didn’t forget the idea, and he wrote it up later that year in his essay “The Sword of Achilles,” which was published by The Bulletin of the Atomic Scientists. His views on the subject were undoubtedly shaped by his personal preferences, but he was also probably right. (He certainly met most of the criteria listed by Wiesner, aside from “an unusual degree of independence,” since he was tied down for most of his adolescence to his father’s candy store.) And science fiction had more in common with Coler and Wiesner’s efforts than they might have appreciated. The editor John W. Campbell had always seen the genre as a kind of training program that taught its readers how to survive in the future, and Wiesner described “tomorrow’s world” in terms that might have been pulled straight from Astounding: “That world will be more complex than it is today, will be changing more rapidly than now, and it will have jobs only for the well trained.” Wiesner closed with a quotation from the philosopher Alfred North Whitehead:
In the conditions of modern life, the rule is absolute, the race which does not value trained intelligence is doomed…Today we maintain ourselves. Tomorrow science will have moved forward one more step, and there will be no appeal from the judgment which will then be pronounced on the uneducated.
These issues tend to come to the forefront during times of national anxiety, and it’s no surprise that we’re seeing a resurgence in them today. In last week’s issue of The New Yorker, Adam Gopnik rounded up a few recent titles on education and child prodigies, which reflect “the sense that American parents have gone radically wrong, making themselves and their kids miserable in the process, by hovering over them like helicopters instead of observing them from a watchtower, at a safe distance.” The catch is that while the current wisdom says that we should maximize our children’s independence, most child prodigies were the result of intensive parental involvement, which implies that the real secret to creative achievement lies somewhere else. And the answer may be right in front of us. As Gopnik writes of the author Ann Hulbert’s account of the piano prodigy Lang Lang:
Lang Lang admits to the brutal pressures placed on him by his father…He was saved because he had, as Hulbert writes, “carved out space for a version of the ‘autotelic experience’—absorption in an activity purely for its own sake, a specialty of childhood.” Following the psychologist Mihaly Csikszentmihalyi, Hulbert maintains that it was being caught in “the flow,” the feeling of the sudden loss of oneself in an activity, that preserved Lang Lang’s sanity: “The prize always beckoned, but Lang was finding ways to get lost in the process.”
This is very close to the “concentrated, persistent, time-ignoring efforts” that Wiesner described fifty years ago, as well as his characterization of learning as “an addiction.” Gopnik concludes: “Accomplishment, the feeling of absorption in the flow, of mastery for its own sake, of knowing how to do this thing, is what keeps all of us doing what we do, if we like what we do at all.” And it seems to have been this sense of flow, above all else, that led Asimov to write more than four hundred books. He was addicted to it. As he once wrote to Robert A. Heinlein: “I like it in the attic room with the wallpaper. I’ve been all over the galaxy. What’s left to see?”
Calder’s baggage
For most of the last week, I’ve been obsessively leafing through all of the multivolume biographies that I own, glancing over their endnotes, reading their acknowledgments, and marveling both at their sheer bulk and at the commitment of time that they require. You don’t need to be a psychologist to understand why. If all goes well, on Monday, I’ll be delivering a draft of Astounding to my editor. It’s a little anticlimactic—there’s plenty of rewriting to come, and I’m sending it out now mostly because that’s what it says in my contract. But it means, if nothing else, that I’m technically done, which I don’t take for granted. This project will have taken up three years of my life from initial conception to publication, which feels like a long time, although you don’t need to look far to find examples that dwarf it. (The champion here might be Muriel St. Clare Byrne, who spent fifty years on The Lisle Letters.) I would have happily worked for longer, and one of my readers rather deflatingly suggested, after reading a recent draft, that I ask my publisher for another year. But the more this kind of project drags out, the greater the chance that it won’t be finished at all, and on balance, I think it’s best for me to push ahead. The dust jacket of Robert A. Caro’s The Path to Power refers to it as “the first of the three volumes that will constitute The Years of Lyndon Johnson,” and we’re all still waiting patiently for number five to take us even as far as Vietnam. Much the same thing happened with John Richardson’s massive life of Picasso, which was originally supposed to be just one book, only to be touted later as an “exceedingly detailed yet readable three-volume life.” Richardson is currently at work on the fourth volume, which only follows Picasso up through World War II, with three decades still left to be covered. When recently asked if he thought he would ever get to a fifth, the author replied: “Listen, I’m ninety-one—I don’t think I have time for that.”
These days, such books are testing the limits of mortality, not just for authors and editors, but possibly for print media itself. When Caro published The Path to Power back in 1982, it would have been impossible to anticipate the changes in publishing that were looming on the horizon, and perhaps the arrival of another doorstopper about Lyndon Johnson every decade or so provides us with a sentimental connection to an earlier era of books. Yet the multivolume life seems more popular than ever, at least among major publishers. In the latest issue of The New Yorker, Adam Gopnik issues a mild protest against “the multivolume biography of the single-volume life”:
In the nineteenth century, the big sets were usually reserved for the big politicians. Disraeli got seven volumes and Gladstone three, but the lives of the poets or the artists or even the scientists tended to be enfolded within the limits of a single volume. John Forster’s life of Dickens did take its time, and tomes, but Elizabeth Gaskell kept Charlotte Brontë within one set of covers, and Darwin got his life and letters presented in one compact volume, by his son. The modern mania for the multivolume biography of figures who seem in most ways “minor” may have begun with Michael Holroyd’s two volumes devoted to Lytton Strachey, who was wonderful and influential but a miniaturist perhaps best treated as such. Strachey, at least, talked a lot and had a vivid sex life. But we are now headed toward a third volume of the life of Bing Crosby, and already have two volumes on Dai Vernon, the master card magician (a master, yes, but of card magic). This season, the life of Alexander Calder, toymaker to the modernist muses, arrives in the first volume of what promises to be two.
Gopnik seems bemused by the contrast between the size of Jed Perl’s Calder: The Conquest of Time: The Early Years: 1898-1940, which is seven hundred pages long, and the delicacy of the mobiles on which its subject’s reputation rests. And although he asks why we seem to be seeing more such efforts, which come off as oddly anachronistic at a time when publishing as a whole is struggling, he doesn’t really answer his own question. I can think of a few possible reasons. The most plausible explanation, I suspect, is that there’s an economic incentive to extending a life over multiple volumes, as long as the publisher is reasonably confident that an audience for it exists. If you’re the sort of person who would buy a huge biography of Alexander Calder at all, you’re probably going to buy two, and the relationship between the number of volumes and the rate of return—even after you account for time, production costs, and the loss of readers turned off by its size or lack of completion—might be narrowly positive. (You might think that these gains would be offset by the need to pay the author more money, but that probably isn’t the case. Looking at the acknowledgments for Richardson’s A Life of Picasso, it seems clear that his years of work were largely underwritten by outside sources, including nothing less than the John Richardson Fund for Picasso Research, set up by Sid and Mercedes Bass.) There’s a psychological side to this. As our online reading habits become divided into ever smaller particles of attention, perhaps we’re more drawn to these huge tomes as a sort of counterbalance, whether or not we have any intention of reading them. Publishing is as subject to the blockbuster mentality as any other art form, and it may well be that a book of fourteen hundred pages on Calder has a greater chance of reaching readers than one of three hundred pages would.
This kind of logic isn’t altogether unfamiliar in the art world, and Gopnik identifies a similar trend in Calder’s career, in which “the early sense of play gave way to dulled-down, chunk-of-metal-in-a-plaza heaviness.” Bigger can seem better for certain books as well, and biographers fill pages in the only way that they can. As Gopnik writes:
Calder’s is not a particularly dramatic life—he was neither much of a talker nor a prolific lover. In broad strokes, the career follows the customary arc of a modern artist, going from small, animated Parisian experiments, in the twenties, and ending with big, dull American commissions fifty years later—and though we are hungry to get him, we are not perhaps hungry to get him at quite this length. A dubious density of detailing—“In Paris, Calder had to wait an hour for his luggage, which he had checked through in London”—of the kind inevitable to such multivolume investigations may daunt even the reader who was eager at the start.
And that image of Calder waiting an hour for his luggage is one that every biographer should regard with dread. (It belongs on the same shelf as the line from Allan Folsom’s The Day After Tomorrow that Anthony Lane quoted to illustrate the accretion of procedural detail that deadens so many thrillers: “Two hundred European cities have bus links with Frankfurt.”) Not every big book suffers from this tendency—I don’t think that many readers wish that The Power Broker were shorter, even if its size discourages others from starting in the first place. And some lives do benefit from multiple books delivered over the course of many years. But they can also put readers in the position of waiting for more baggage—and when it comes at last, they’re the ones who get to decide whether or not it was worth it.
Zen in America
In the latest issue of The New Yorker, Adam Gopnik discusses the recent books Why Buddhism Is True by Robert Wright and Stephen Batchelor’s After Buddhism, the latter of which he calls “in many ways the most intellectually stimulating book on Buddhism of the past few years.” As with most of the articles under the magazine’s departmental heading A Critic at Large, it’s less a review than an excuse to explore the subject in general, and Gopnik ends up delivering a gentle pitch for Buddhism as a secular philosophy of life. He writes:
As for the mind’s modules [Batchelor writes], “Gotama [Buddha] is interested in what people can do, not with what they are…We may have no control over the rush of fear prompted by finding a snake under our bed, but we do have the ability to respond to the situation in a way that is not determined by that fear.” Where Wright insists that the Buddhist doctrine of not-self precludes the possibility of freely chosen agency, Batchelor insists of Buddhism that “as soon as we consider it a task-based ethics…such objections vanish. The only thing that matters is whether or not you can perform a task.”
This idea was enormously appealing to certain Americans of the nineteenth century, as Gopnik notes earlier on: “These American Buddhists, drawn East in part by a rejection of Gilded Age ostentation, recognized a set of preoccupations like those they knew already—Whitman’s vision of a self that could shift and contain multitudes, or Thoreau’s secular withdrawal from the race of life…The quietist impulse in New England spirituality and the pantheistic impulse in American poetry both seemed met, and made picturesque, by the Buddhist tradition.”
I find something especially revealing in the way that such adherents hoped to combat certain stereotypically American tendencies, such as material striving and competition, with the equally American notion of a “task-based ethics.” The promise of turning one’s preference for concrete action—or rules of behavior—from a weakness into a strength is very attractive to someone like me, and I’ve always liked R.H. Blyth’s memorable description of the training of a Zen novitiate:
I remember when I began to attend lectures, at a Zen temple…I was surprised to find that there were no lofty spiritual truths enunciated at all. Two things stuck in my head, because they were repeated so often, and with such gusto. One of them, emphasized with extreme vigor, was that you must not smoke a cigarette while making water. The other was that when somebody calls you (in Japanese, “Oi!”) you must answer (“Hai!”) at once, without hesitation. When we compare this to the usual Christian exhortatory sermon, we cannot help being struck by the difference.
But I’ve also learned to be cautious about appropriating it for myself. I’ve been interested in Zen for a long time, particularly since discovering Blyth’s wonderful books Zen in English Literature and Oriental Classics and Haiku, but I don’t feel comfortable identifying with it. For one thing, it’s a complicated subject that I’ve never entirely gotten my head around, and I don’t follow its practice in fundamental ways. (I don’t meditate, for instance, although reading Gopnik’s article makes me think that I probably should.) My understanding of it is mediated through such Western interpreters as Blyth, and I feel less than qualified to talk about it for much the same reason that Robert Pirsig gives in his disclaimer to Zen and the Art of Motorcycle Maintenance: “What follows…should in no way be associated with that great body of factual information relating to orthodox Zen Buddhist practice. It’s not very factual on motorcycles, either.”
And my understanding of Zen can best be described as being not very factual on motorcycles. What I like about it is what Stewart Brand, speaking on the related issue of voluntary poverty, once called “the sheer practicality of the exercise,” and I’ve taken as much of its advice to heart as I can. It feels like common sense. The trouble, obviously, is that this extracts a tiny sliver of meaning from a vast spiritual tradition that most Westerners haven’t studied firsthand, and its cultural distance makes it easy for us to abstract whatever we want from it. As Gopnik points out:
[Batchelor’s] project is unashamedly to secularize Buddhism. But, since it’s Buddhism that he wants to secularize, he has to be able to show that its traditions are not hopelessly polluted with superstition…Batchelor, like every intelligent believer caught in an unsustainable belief, engages in a familiar set of moves. He attempts to italicize his way out of absurdity by, in effect, shifting the stresses in the simple sentence “We don’t believe that.” First, there’s “We don’t believe that”…Next comes “We don’t believe that”…In the end, we resort to “We don’t believe that”: we just accept it as an embedded metaphor of the culture that made the religion.
And Buddhism is probably more conducive to this “set of moves” by Americans than, say, Christianity, simply because it feels exotic and unfamiliar. If you look at the picture of Jesus that emerges from a study like The Five Gospels, you end up with a religious ethic that has important differences from Buddhism, but which also shares deep affinities in how it positions the self against the world. Yet it’s so tangled up with its history in America that secular types are unlikely to embrace it as a label.
Of course, this scavenging of traditions for spare parts is quintessentially American as well, and it comes out of an understandable impulse to correct our worst tendencies. In all honesty, I’m one of the least “Zen” people I know—I’m ambitious, competitive, and deeply invested in measuring myself against worldly standards of success, all of which I intend to renounce as soon as I’ve proven that I can win in all the conventional ways. It’s all very American of me. Yet it would be equally true to say that I’m drawn to the doctrine of nonattachment because I need it, and because it serves as a corrective to ingrained personality traits that would otherwise make me miserable. I’m not alone, either. Gopnik refers briefly to the San Francisco Zen Center and “its attempted marriage of spiritual elevation with wild entrepreneurial activity,” and the one thing that the most famous exponents of Zen had in common is that they were all hugely ambitious, as well as highly systematic in the way that they pursued their goals. You don’t become the spokesman for an entire religious tradition by accident, and I suspect that their ambition usually came first, and their lifelong effort to come to terms with it was channeled, not into withdrawal, but into a more active engagement with the world. This might seem contradictory, but we’re also simply more likely to talk about Blyth, Pirsig, D.T. Suzuki, Alan Watts, and the rest than the nameless monks who did the sensible thing and entered a life of quiet meditation. It skews our picture of Zen a bit, in particular by presenting it in intellectual terms that have more to do with the needs of writing a book than the inner experience of a principled adept, but not to an extent that seems impossible to overcome. 
It may well be, as Gopnik notes, that “secular Buddhism ends up being…secularism.” But even if we arrive there in a roundabout way, or call it by different names, it still amounts to the best set of tools that we have for survival in America, or just about anywhere else.
The Bollingen Library and the future of media
About a year ago, I began to notice that many of the books in my home library came from the same place. It all started when I realized that Kenneth Clark’s The Nude and E.H. Gombrich’s Art and Illusion—two of the most striking art books of the century—had originally been delivered as part of the A.W. Mellon Lectures in Fine Art and published by the Bollingen Library. Looking more closely, I found that the Bollingen Foundation, whatever that was, had been responsible for countless other titles that have played important roles in my life and those of other readers: Vladimir Nabokov’s massive translation of and commentary on Eugene Onegin, the Richard Wilhelm edition of the I Ching, D.T. Suzuki’s Zen and Japanese Culture, Jacques Maritain’s Creative Intuition in Art and Poetry, Huntington Cairns’s extraordinary anthology The Limits of Art, and, perhaps most famously, Joseph Campbell’s The Hero With a Thousand Faces. Intrigued, I sought out more books from the Bollingen imprint, looking for used copies online and purchasing them sight unseen. So far, I’ve acquired tomes like The Survival of the Pagan Gods, The Eternal Present, The Gothic Cathedral, and The Demands of Art. Along with a shared concern with the humanities and their role in modern life, they’re all physically beautiful volumes, a delight to hold and browse through, and I hope to acquire more for as long as I can.
Which, when you think about it, is highly unusual. Most of us don’t pay much attention to the publishers of the books we buy: we may subconsciously sense that, say, the Knopf imprint is a mark of quality, but we don’t pick up a novel solely because of the borzoi logo on the spine. (The one big exception may be Taschen, which has built up a reputation for large, indecently attractive coffee table books.) Publishers would love it if we did, of course, just as television networks and movie studios would be happy if we automatically took their brands as a seal of approval. That’s rare in any medium: HBO and Disney have managed it, but not many more. So it’s worth taking a closer look at Bollingen to see how, exactly, it caught my eye. And what we discover is that Bollingen was a philanthropic enterprise, essentially an academic press without the university. It was founded in 1945 by Paul Mellon, heir to the Andrew W. Mellon fortune, as a tribute to his late wife, a devotee of Carl Jung, and while it initially focused on Jungian studies—it was named after Jung’s famous tower and country home in Switzerland—it gradually expanded into a grander project centered on the interconnectedness and relevance of art, history, literature, and psychology. As names like Gombrich and Clark indicate, it arose out of much the same circle as the Warburg Institute in London, which was recently the subject of a loving profile by Adam Gopnik in The New Yorker, but with far greater resources, patronage, and financial support.
In the end, after publishing hundreds of books, sponsoring lectures, and awarding generous stipends to the likes of Marianne Moore and Alexis Leger, the foundation discontinued operations in 1968, noting that the generation it had served was yielding to another set of concerns. And while it may not seem to have much relevance to the problem of media brands today, it offers some surprising lessons. Bollingen started as an act of philanthropy, without any expectation of profit, and arose out of a highly focused, idiosyncratic vision: these were simply books that Mellon and his editors wanted to see, and they trusted that they would find an appreciative audience over time. Which, in many respects, is still how meaningful brands are created or sustained. Matthew Yglesias once referred to Amazon as “a charitable organization being run by elements of the investment community for the benefit of consumers,” and although he was being facetious, he had a point. It’s easy to make fun of startup companies that are obsessed with eyeballs, rather than sustainable profits, as venture capitalist Chris Sacca put it on Alex Blumberg’s Startup podcast:
That’s usually a bad move for an early-stage company—to get cash-flow positive. I have strong opinions about that. Everyone I know who pushes for cash-flow positivity that early stops growing at the rate they should be growing, and gets so anchored by this idea that “we need to keep making money.”
Sacca concludes that you don’t want a “lifestyle business”—that is, a business growing at a pace where you get to take vacations—and that growth for its own sake should be pursued at all costs. And it’s a philosophy that has resulted, infamously, in countless “hot” tech companies that are years, if not a lifetime, away from profitability.
But I think Sacca is half right, and despite the obvious disparity in ideals, he all but circles back around to the impulse behind Bollingen. Venture investors don’t have any desire to run a charitable enterprise, but they end up doing so anyway, at least for the years in which a company is growing, because that’s how brands are made. Someone’s money has to be sacrificed to lay the foundations for anything lasting, both because of the timelines involved and because it’s the only way to avoid the kind of premature compromise that can turn off potential users or readers. We’re living in an age when such investments are more likely to take the form of startup capital than charitable largess, but the principle is fundamentally the same. It’s the kind of approach that can’t survive a short-term obsession with advertisers or page views, and it requires patrons with deep pockets, a tolerance for idiosyncrasy, an eye for quality, and a modicum of patience. (In journalism, the result might look a lot like The Distance, a publication in whose success I have a considerable personal stake.) More realistically, it may take the form of a prestigious but money-losing division within a larger company, like Buzzfeed’s investigative pieces or most of the major movie studios. The reward, as Yglesias puts it, is a claim on “a mighty engine of consumer surplus”—and if we replace “consumer” with “cultural,” we get something very much like the Bollingen Foundation. Bollingen wasn’t interested in growth in itself, but in influencing the entire culture, and in at least one book, The Hero With a Thousand Faces, it went viral to an extent that makes even the most widely shared article seem lame. Like Jung’s tower, it was made for its own sake. And its legacy still endures.