Posts Tagged ‘Steve Jobs’
A Fuller Life
I’m pleased to announce that I’ve finally figured out the subject of my next book, which will be a biography of the architect and futurist Buckminster Fuller. If you’re a regular reader of this blog, you probably know how much Fuller means to me, and I’m looking forward to giving him the comprehensive portrait that he deserves. (Honestly, that’s putting it mildly. I’ve known for over a week that I’ll have a chance to tackle this project, and I still can’t quite believe that it’s really happening. And I’m especially happy that my current publisher has agreed to give me a shot at it.) At first glance, this might seem like a departure from my previous work, but it presents an opportunity to explore some of the same themes from a different angle, and to see how they might play out in the real world. The timelines of the two projects largely coincide, with a group of subjects who were affected by the Great Depression, World War II, the Cold War, and the social upheavals of the sixties. All of them had highly personal notions about the fate of America, and Fuller used physical artifacts much as Campbell, Asimov, and Heinlein employed science fiction—to prepare their audiences for survival in an era of perpetual change. Fuller’s wife, Anne, played an unsung role in his career that recalls many of the women in Astounding. Like Campbell, he approached psychology as a category of physics, and he hoped to turn the prediction of future trends into a science in itself. His skepticism of governments led him to conclude that society should be changed through design, not political institutions, and like many science fiction writers, he acted as if all disciplines could be reduced to subsets of engineering. And for most of his life, he insisted that complicated social problems could be solved through technology.
Most of his ideas were expressed through the geodesic dome, the iconic work of structural design that made him famous—and I hope that this book will be as much about the dome as about Fuller himself. It became a universal symbol of the space age, and his reputation as a futurist may have been founded largely on the fact that his most recognizable achievement instantly evoked the landscape of science fiction. From the beginning, the dome was both an elegant architectural conceit and a potent metaphor. The concept of a hemispherical shelter that used triangular elements to enclose the maximum amount of space had been explored by others, but Fuller was the first to see it as a vehicle for social change. With design principles that could be scaled up or down without limitation, it could function as a massive commercial pavilion or as a house for hippies. (Ken Kesey dreamed of building a geodesic dome to hold one of his acid tests.) It could be made out of plywood, steel, or cardboard. A dome could be cheaply assembled by hand by amateur builders, which encouraged experimentation, and its specifications could be laid out in a few pages and shared for free, like the modern blueprints for printable houses. It was a hackable, open-source machine for living that reflected a set of values that spoke to the same men and women who were teaching themselves how to code. As I noted here recently, a teenager named Jaron Lanier, who was living in a tent with his father on an acre of desert in New Mexico, used nothing but the formulas in Lloyd Kahn’s Domebook to design and build a house that he called “Earth Station Lanier.” Lanier, who became renowned years later as a pioneer of virtual reality, never got over the experience. He recalled decades later: “I loved the place; dreamt about it while sleeping inside it.”
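The underlying math, for what it’s worth, is modest enough to fit in a few lines, which is part of why a book like the Domebook could work at all. Here’s a minimal sketch in Python (a toy of my own, not anything taken from Kahn’s tables) of the “chord factor” calculation behind a simple two-frequency icosahedral dome: subdivide each triangular face of an icosahedron, push the new points out to the surface of the sphere, and measure the resulting struts, which scale linearly with whatever radius you choose.

```python
import numpy as np

phi = (1 + 5 ** 0.5) / 2  # the golden ratio

# One triangular face of an icosahedron inscribed in a sphere
# (edge length 2, circumradius sqrt(1 + phi**2)).
A = np.array([0.0, 1.0, phi])
B = np.array([0.0, -1.0, phi])
C = np.array([phi, 0.0, 1.0])

def on_sphere(p):
    """Project a point radially onto the unit sphere."""
    return p / np.linalg.norm(p)

# Two-frequency ("2V") subdivision: split each edge at its midpoint,
# then push every node out to the sphere's surface.
a = on_sphere(A)
m_ab = on_sphere((A + B) / 2)
m_bc = on_sphere((B + C) / 2)

# Chord factors: strut length per unit of dome radius.
strut_a = np.linalg.norm(a - m_ab)     # corner-to-midpoint strut, ~0.54653
strut_b = np.linalg.norm(m_ab - m_bc)  # midpoint-to-midpoint strut, ~0.61803

radius_ft = 20.0  # a hypothetical dome radius; any value works
print(f"A struts: {strut_a:.5f} x radius = {strut_a * radius_ft:.2f} ft")
print(f"B struts: {strut_b:.5f} x radius = {strut_b * radius_ft:.2f} ft")
```

A real build needs the full table of factors for higher frequencies, which is essentially what the Domebook tabulated, but the principle really is that simple: two lengths of strut, multiplied by the radius of your choice.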
During his lifetime, Fuller was one of the most famous men in America, and he managed to become an idol to both the establishment and the counterculture. In the three decades since his death, his reputation has faded, but his legacy is visible everywhere. The influence of his geodesic structures can be seen in the Houston Astrodome, at Epcot Center, on thousands of playgrounds, in the dome tents favored by backpackers, and in the emergency shelters used after Hurricane Katrina. Fuller had a lasting impact on environmentalism and design, and his interest in unconventional forms of architecture laid the foundation for the alternative housing movement. His homegrown system of geometry led to insights into the biological structure of viruses and the logic of communications networks, and after he died, he was honored by the discoverers of a revolutionary form of carbon that resembled a geodesic sphere, which became known as buckminsterfullerene, or the buckyball. And I’m particularly intrigued by his parallels to the later generation of startup founders. During the seventies, he was a hero to the likes of Steve Wozniak and Steve Jobs, who later featured him prominently in the first “Think Different” commercial, and he was the prototype of the Silicon Valley types who followed. He was a Harvard dropout who had been passed over by the college’s exclusive social clubs, and despite his lack of formal training, he turned himself into an entrepreneur who believed in changing society through innovative products and environmental design. Fuller wore the same outfit to all his public appearances, and his personal habits amounted to an early form of biohacking. (Fuller slept each day for just a few hours, taking a nap whenever he felt tired, and survived mostly on steak and tea.) His closest equivalent today may well be Elon Musk, which tells us a lot about both men.
And this project is personally significant to me. I first encountered Fuller through The Whole Earth Catalog, which opened its first edition with two pages dedicated to his work, preceded by a statement from editor Stewart Brand: “The insights of Buckminster Fuller initiated this catalog.” I was three years old when he died, and I grew up in the shadow of his influence in the Bay Area. The week before my freshman year in high school, I bought a used copy of his book Critical Path, and I tried unsuccessfully to plow through Synergetics. (At the time, this all felt kind of normal, and it’s only when I look back that it seems strange—which tells you a lot about me, too.) Above all else, I was drawn to his reputation as the ultimate generalist, which reflected my idea of what my life should be, and I’m hugely excited by the prospect of returning to him now. Fuller has been the subject of countless other works, but never a truly authoritative biography, which is a project that meets both Susan Sontag’s admonition that a writer should try to be useful and the test that I stole from Lin-Manuel Miranda: “What’s the thing that’s not in the world that should be in the world?” Best of all, the process looks to be tremendously interesting for its own sake—I think it’s going to rewire my brain. It also requires an unbelievable amount of research. To apply the same balanced, fully sourced, narrative approach to his life that I tried to take for Campbell, I’ll need to work through all of Fuller’s published work, a mountain of primary sources, and what might literally be the largest single archive for any private individual in history. I know from experience that I can’t do it alone, and I’m looking forward to seeking help from the same kind of brain trust that I was lucky to have for Astounding. Those of you who have stuck with this blog should be prepared to hear a lot more about Fuller over the next three years, but I wouldn’t be doing this at all if I didn’t think that you might find it interesting. And who knows? He might change your life, too.
Famous monsters of filmland
For his new book The Big Picture: The Fight for the Future of the Movies, the journalist Ben Fritz reviewed the emails from the hack of Sony Pictures, which are still available online. Whatever you might think about the ethics of using such material, it’s a gold mine of information about how Hollywood has done business over the last decade, and Fritz has come up with some fascinating nuggets. One of the most memorable finds is an exchange between studio head Amy Pascal and the producer Scott Rudin, who was trying to convince her to take a chance on Danny Boyle’s adaptation of Steve Jobs. Pascal had expressed doubts about the project, particularly over the casting of Michael Fassbender in the lead, and after arguing that it was less risky than The Social Network, Rudin delivered a remarkable pep talk:
You ought to be doing this movie—period—and you and I both know that the cold feet you are feeling is costing you this movie that you want. Once you have cold feet, you’re done. You’re making this decision in the anticipation of how you will be looked at in failure. That’s how you fail. So you’re feeling wobbly in the job right now. Here’s the fact: nothing conventional you could do is going to change that, and there is no life-changing hit that is going to fall into your lap that is not a nervous decision, because the big obvious movies are going to go elsewhere and you don’t have the IP right now to create them from standard material. You have this. Face it…Force yourself to muster some confidence about it and do the exact thing right now for which your career will be known in movie history: be the person who makes the tough decisions and sticks with them and makes the unlikely things succeed. Fall on your sword—when you’ve lost that, it’s finished. You’re the person who does these movies. That’s—for better or worse—who you are and who you will remain. To lose that is to lose yourself.
Steve Jobs turned out to be a financial disappointment, and its failure—despite the prestige of its subject, director, and cast—feels emblematic of the move away from films driven by stars to those that depend on “intellectual property” of the kind that Sony lacked. In particular, the movie industry seems to have shifted to a model perfected by Marvel Studios, which builds a cinematic universe that can drum up excitement for future installments and generate huge grosses overseas. Yet this isn’t exactly new. In the groundbreaking book The Genius of the System, which was published three decades ago, Thomas Schatz notes that Universal did much the same in the thirties, when it pioneered the genre of cinematic horror under founder Carl Laemmle and his son:
The horror picture scarcely emerged full-blown from the Universal machinery, however. In fact, the studio had been cultivating the genre for years, precisely because it played to Universal’s strengths and maximized its resources…Over the years Carl Laemmle built a strong international distribution system, particularly in Europe…[European filmmakers] brought a fascination for the cinema’s distinctly unrealistic qualities, its capacity to depict a surreal landscape of darkness, nightmare logic, and death. This style sold well in Europe.
After noting that the aesthetics of horror lent itself to movies built out of little more than shadows and fog, which were the visual effects of its time, Schatz continues: “This rather odd form of narrative economy was vitally important to a studio with limited financial resources and no top stars to carry its pictures. And in casting, too, the studio turned a limitation into an asset, since the horror film did not require romantic leads or name stars.”
The turning point was Tod Browning’s Dracula, a movie “based on a presold property” that could serve as an entry point for other films along the same lines. It didn’t require a star, but “an offbeat character actor,” and Universal’s expectations for it eerily foreshadow the way in which studio executives still talk today. Schatz writes:
Laemmle was sure it would [succeed]—so sure, in fact, that he closed the Frankenstein deal several weeks before Dracula’s February 1931 release. The Lugosi picture promptly took off at the box office, and Laemmle was more convinced than ever that the horror film was an ideal formula for Universal, given its resources and the prevailing market conditions. He was convinced, too, that he had made the right decision with Frankenstein, which had little presold appeal but now had the success of Dracula to generate audience anticipation.
Frankenstein, in short, was sort of like the Ant-Man of the thirties, a niche property that leveraged the success of its predecessors into something like real excitement. It worked, and Universal’s approach to its monsters anticipates what Marvel would later do on a vaster scale, with “ambitious crossover events” like House of Frankenstein and House of Dracula that combined the studio’s big franchises with lesser names that seemed unable to carry a film on their own. (If Universal’s more recent attempt to do the same with The Mummy fell flat, it was partially because it was unable to distinguish between the horror genre, the star picture, and the comic book movie, resulting in a film that turned out to be none of the above. The real equivalent today would be Blumhouse Productions, which has done a much better job of building its brand—and which distributes its movies through Universal.)
And the inability of such movies to provide narrative closure isn’t a new development, either. After seeing James Whale’s Frankenstein, Carl Laemmle, Jr. reacted in much the same way that executives presumably do now:
Junior Laemmle was equally pleased with Whale’s work, but after seeing the rough cut he was certain that the end of the picture needed to be changed. His concerns were twofold. The finale, in which both Frankenstein and his monster are killed, seemed vaguely dissatisfying; Laemmle suspected that audiences might want a glimmer of hope or redemption. He also had a more pragmatic concern about killing off the characters—and thus any possibility of sequels. Laemmle now regretted letting Professor Van Helsing drive that stake through Count Dracula’s heart, since it consigned the original character to the grave…Laemmle was not about to make the same mistake by letting that angry mob do away with the mad doctor and his monster.
Whale disagreed, but he was persuaded to change the ending after a preview screening, leaving open the possibility that the monster might have survived. Over eight decades later, Joss Whedon offered a similar explanation in an interview with Mental Floss: “It’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant…My feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.” For now, we’re living in a world made by the Universal monsters—and with only a handful of viable properties, half of which are owned by Disney. Without them, it might seem impossible, as Rudin said, “to create them from standard material.” But we’re also still waiting to be blindsided by the next great franchise. As another famous monster once put it: “A lot of times, people don’t know what they want until you show it to them.” And when it came to the movies, at least, Steve Jobs was right.
The closed circle
In his wonderful book The Nature of Order, the architect Christopher Alexander lists fifteen properties that characterize places and buildings that feel alive. (“Life” itself is a difficult concept to define, but we can come close to understanding it by comparing any two objects and asking the one question that Alexander identifies as essential: “Which of the two is a better picture of my self?”) These properties include such fundamentals of design as “Levels of Scale,” “Local Symmetries,” and “Positive Space,” and elements that are a bit trickier to pin down, including “Echoes,” “The Void,” and “Simplicity and Inner Calm.” But the final property, and the one that Alexander suggests is the most important, bears the slightly clunky name of “Not-Separateness.” He points to the Tower of the Wild Goose in China as an example of this quality at its best, and he says of its absence:
When a thing lacks life, is not whole, we experience it as being separate from the world and from itself…In my experiments with shapes and buildings, I have discovered that the other fourteen ways in which centers come to life will make a center which is compact, beautiful, determined, subtle—but which, without this fifteenth property, can still often somehow be strangely separate, cut off from what lies around it, lonely, awkward in its loneliness, too brittle, too sharp, perhaps too well delineated—above all, too egocentric, because it shouts, “Look at me, look at me, look how beautiful I am.”
The fact that he refers to this property as “Not-Separateness,” rather than the more obvious “Connectedness,” indicates that he sees it as a reaction against the marked tendency of architects and planners to strive for distinctiveness and separation. “Those unusual things which have the power to heal…are never like this,” Alexander explains. “With them, usually, you cannot really tell where one thing breaks off and the next begins, because the thing is smokily drawn into the world around it, and softly draws this world into itself.” It’s a characteristic that has little to do with the outsized personalities who tend to be drawn to huge architectural projects, and Alexander firmly skewers the motivations behind it:
This property comes about, above all, from an attitude. If you believe that the thing you are making is self-sufficient, if you are trying to show how clever you are, to make something that asserts its beauty, you will fall into the error of losing, failing, not-separateness. The correct connection to the world will only be made if you are conscious, willing, that the thing you make be indistinguishable from its surroundings; that, truly, you cannot tell where one ends and the next begins, and you do not even want to be able to do so.
This doesn’t happen by accident, particularly when millions of dollars and correspondingly inflated egos are involved. (The most blatant way of separating a building from its surroundings is to put your name on it.) And because it explicitly asks the designer to leave his or her cleverness behind, it amounts to the ultimate test of the subordination of the self to the whole. You can do great work and still falter at the end, precisely because of the strengths that allowed you to get that far in the first place.
It’s hard for me to read these words without thinking of Apple’s new headquarters in Cupertino, variously known as the Ring and the Mothership, which is scheduled to open later this year. A cover story in Wired by Steven Levy describes it in enraptured terms, in which you can practically hear Also Sprach Zarathustra:
As we emerge into the light, the Ring comes into view. As the Jeep orbits it, the sun glistens off the building’s curved glass surface. The “canopies”—white fins that protrude from the glass at every floor—give it an exotic, retro-future feel, evoking illustrations from science fiction pulp magazines of the 1950s. Along the inner border of the Ring, there is a walkway where one can stroll the three-quarter-mile perimeter of the building unimpeded. It’s a statement of openness, of free movement, that one might not have associated with Apple. And that’s part of the point.
There’s a lot to unpack here, from the reference to pulp science fiction to the notion of “orbiting” the building to the claim that the result is “a statement of openness.” As for the contrary view, here’s what another article in Wired, this one by Adam Rogers, had to say about it a month later:
You can’t understand a building without looking at what’s around it—its site, as the architects say. From that angle, Apple’s new [headquarters] is a retrograde, literally inward-looking building with contempt for the city where it lives and cities in general. People rightly credit Apple for defining the look and feel of the future; its computers and phones seem like science fiction. But by building a mega-headquarters straight out of the middle of the last century, Apple has exacerbated the already serious problems endemic to twenty-first-century suburbs like Cupertino—transportation, housing, and economics. Apple Park is an anachronism wrapped in glass, tucked into a neighborhood.
Without delving into the economic and social context, which a recent article in the New York Times explores from another perspective, I think it’s fair to say that Apple Park is an utter failure from the point of view of “Not-Separateness.” But this isn’t surprising. Employees may just be moving in now, but its public debut dates back to June 7, 2011, when Steve Jobs himself pitched it to the Cupertino City Council. Jobs was obsessed by edges and boundaries, both physical and virtual, insisting that the NeXT computer be a perfect cube and introducing millions of consumers to the word “bezel.” Compare this to what Alexander writes of boundaries in architecture:
In things which have not-separateness, there is often a fragmented boundary, an incomplete edge, which destroys the hard line…Often, too, there is a gradient of the boundary, a soft edge caused by a gradient in which scale decreases…so that at the edge it seems to melt indiscernibly into the next thing…Finally, the actual boundary is sometimes rather careless, deliberately placed to avoid any simple complete sharp cutting off of the thing from its surroundings—a randomness in the actual boundary line which allows the thing to be connected to the world.
The italics are mine, because it’s hard to imagine anything less like Jobs or the company he created. Apple Park is being positioned as Jobs’s posthumous masterpiece, which reminds me of the alternate wording to Alexander’s one question: “Which one of these two things would I prefer to become by the day of my death?” (If the building is a monument to Jobs, it’s also a memorial to the ways in which he shaded imperceptibly into Trump, who also has a fixation with borders.) It’s the architectural equivalent of the design philosophy that led Apple to glue in its batteries and made it impossible to upgrade the perfectly cylindrical Mac Pro. Apple has always loved the idea of a closed system, and now its employees get to work in one.
On not knowing what you’re doing
A few days ago, I stumbled across the little item that The Onion ran shortly after the death of Steve Jobs: “Last American Who Knew What The Fuck He Was Doing Dies.” It’s especially amusing to read it now, at a time when the cult of adulation that surrounded Jobs seems to be in partial retreat. These days, it’s impossible to find an article about, say, the upcoming biopic written by Aaron Sorkin without a commenter bringing up all the usual counterarguments: Jobs was fundamentally a repackager and popularizer of other people’s ideas, he was a bully and a bad boss, he hated to share credit, he benefited enormously from luck and good timing, and he pushed a vision of simplicity and elegance that only reduces the user’s freedom of choice. There’s a lot of truth to these points. Yet the fact remains that Jobs did know what he was doing, or at least that he carefully cultivated the illusion that he did, and he left a void in the public imagination that none of his successors have managed to fill. He was fundamentally right about a lot of things for a very long time, and the legacy he left continues to shape our lives, in ways both big and small, one minute after another.
And that Onion headline has been rattling around in my head for most of the week, because I often get the sense I don’t really know what I’m doing, as a writer, as a dad, or as a human being. I do my best to stick to the channel, as Stanislavski would say: I follow the rules I know, maintain good habits, make my lists, and seek out helpful advice wherever I can find it. I have what I think is a realistic sense of my own strengths and weaknesses; I’m a pretty good writer and a pretty good father. But there’s no denying that writing a novel and raising a child are tasks of irreducible complexity, particularly when you’re trying to do both at the same time. Writing, like parenting, imposes a state of constant creative uncertainty: just because you had one good idea or wrote a few decent pages yesterday is no guarantee that you’ll be able to do the same today. If I weren’t fundamentally okay with that, I wouldn’t be here. But there always comes a time when I find myself repeating that line from Calvin and Hobbes I never tire of quoting: “I don’t think I’d have been in such a hurry to reach adulthood if I’d known the whole thing was going to be ad-libbed.”
My only consolation is that I’m not alone. Recently, I’ve been rereading The Magus by John Fowles, a novel that made a huge impression on me when I first encountered it over twenty years ago. In places, it feels uncomfortably like the first work of a young man writing for other young men, but it still comes off as spectacularly assured, which is why it’s all the more striking to read what Fowles has to say about it in his preface:
My strongest memory is of constantly having to abandon drafts because of an inability to describe what I wanted…The Magus remains essentially where a tyro taught himself to write novels—beneath its narrative, a notebook of an exploration, often erring and misconceived, into an unknown land. Even in its final published form it was a far more haphazard and naïvely instinctive work than the more intellectual reader can easily imagine; the hardest blows I had to bear from critics were those that condemned the book as a coldly calculated exercise in fantasy, a cerebral game. But then one of the (incurable) faults of the book was the attempt to conceal the real state of endless flux in which it was written.
Fowles is being consciously self-deprecating, but he hits on a crucial point, which is that most novels are designed to make a story that emerged from countless wrong turns and shots in the dark seem inevitable. In fact, it’s a little like being a parent, or a politician, or the CEO of a major corporation: you need to project an air of authority even if you don’t have the slightest idea if you’re doing the right thing. (And just as you can’t fully appreciate your own parents until you’ve had a kid of your own, you can’t understand the network of uncertainties underlying even the most accomplished novel until you’ve written a few for yourself.) I’d like to believe that the uncertainties, doubts, and fears that persist throughout are a necessary corrective, a way of keeping us humble in the face of challenges that can’t be reduced to a few clear rules. The real danger isn’t being unsure about what comes next; it’s turning into a hedgehog in a world of foxes, convinced that we know the one inarguable truth that applies to every situation. In fiction, that kind of dogmatic certainty leads to formula or propaganda, and we’ve all seen its effects in business, politics, and parenting. It’s better, perhaps, to admit that we’re all faking it until we make it, and that we should be satisfied if we’re right ever so slightly more often than we’re wrong.
The perfect bookstore
I’ve written before about the end of browsing, but for me, it took place a little sooner than I expected. In the old days, I’d spend hours roaming through used bookstores like The Strand in New York and Open Books here in Chicago, keeping an eye out both for books I was hoping to find and for a few unexpected discoveries, but now, with a baby in tow, it’s hard to get into the timeless, transcendent state required for a deep dive into the perfect bookstore. My definition of the perfect used bookstore is a simple one: it needs to have an enormous inventory of interesting books, low prices, and the possibility of exciting serendipity. You shouldn’t know precisely what to expect going in, and if you do find the book you’re looking for, you feel a surge of delight in the same region of the brain that responds to varied, unpredictable pleasures. I thought I’d left this kind of browsing behind, but recently, I discovered a way to do it from the comfort of my own home. And although it really only works for the kind of idiosyncratic, obsessive browsing that I prefer, I’m sharing it here, in hopes that someone else will find it useful.
The first step is to get your hands on a copy of The Whole Earth Catalog. I’ve sung the praises of the Catalog here more than once, but even more than “Google in paperback form,” as Steve Jobs memorably called it, it’s a portable simulation of the perfect bookstore. It’s usually associated with the 1970s hippie culture of Berkeley and the rest of the East Bay, and not without reason: the older editions include several pages of resources on how to build your own geodesic dome. Really, though, it’s a book for curious readers of every persuasion. Every page is bursting with fascinating, often unfairly neglected or forgotten books on every subject imaginable: literature, art, science, history, philosophy, religion, design, and much more, along with the more famous sections on homesteading, environmentalism, and sustainable living. If you’re the kind of browser I have in mind, it’s the ultimate book of daydreams. (Any edition will work for our purposes, but if you can only get one, I’d recommend The Next Whole Earth Catalog, which gives you the greatest poundage per dollar and breathes the right air of intelligent funkiness.)
Next, you need to head over to Better World Books, my favorite online used bookstore. More specifically, you want to check out their Bargain Bin, which allows you to buy four or more used books at a discount, usually translating to something like four books for $12. (You’ll also want to get on their email list for flash sales and special events, which can lead to even better deals.) Then you settle down in a comfortable chair—or maybe a bed—with The Whole Earth Catalog and start to browse, looking for a book or subject that catches your eye. Maybe it’s Form, Function and Design by Paul Jacques Grillo, or The Natural Way to Draw by Kimon Nicolaides, or Soil and Civilization by Edward S. Hyams, or the works of R. Buckminster Fuller. Then you check the Bargain Bin to see if the book you want is there. In my experience, four times out of ten, you’ll find it, which may not seem like a great percentage if you absolutely need a copy, but it’s ideal for browsing. Even better, since you need four or more books to qualify for the deal, you’ve got to keep going, and it’s often when you’re looking for one last book to fill out your order—and end up exploring unexpected nooks of the Catalog or your own imagination—that you make the most serendipitous discoveries.
Best of all, the Catalog is only a starting point. When you’re leafing through it, you may end up on the page devoted to computers and remember, as I did recently, that you’d been meaning to pick up a copy of the legendary handbook The C Programming Language—and bam, there it is for four dollars. Each page is likely to remind you of other books that you’ve long wanted to explore, and if you follow that train of thought wherever it leads, you’ll find yourself in some unexpected places. And it’s the peculiar constraint of the Bargain Bin, in which you may or may not find the book you want at a wonderful price, that makes the exercise so rewarding, and so much like the classic used bookstore experience. (If you don’t have a copy of the Catalog, you can also use other books with big annotated bibliographies to spark your search: if you’re interested in the sciences, for instance, the one in Gödel, Escher, Bach is particularly good.) When you’re done, you’ll have a package on the way, and part of the fun of Better World Books, as opposed to Amazon Prime, is that you’re never quite sure when it will arrive, which gives each mail delivery an extra frisson of interest. I find myself doing this every month or two, whenever Better World Books has a sale, and I love it: it’s a sustaining shot of happiness for only ten dollars a pop. And my only problem is that I’m running out of shelf space.
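Out of curiosity, you can even put a number on the hunt. If each title you look up really does surface in the Bargain Bin about four times out of ten, then filling the four-book minimum takes about ten lookups on average, which is precisely the dose of browsing I’m after. Here’s a throwaway simulation in Python, using nothing but my own rough estimates from above:

```python
import random

HIT_RATE = 0.4   # my rough guess: four lookups out of ten find the book
ORDER_SIZE = 4   # the Bargain Bin minimum

def lookups_per_order(trials=100_000):
    """Average number of titles checked before four turn up in stock."""
    total = 0
    for _ in range(trials):
        hits = 0
        while hits < ORDER_SIZE:
            total += 1
            if random.random() < HIT_RATE:
                hits += 1
    return total / trials

print(f"{lookups_per_order():.1f} lookups per order, on average")
# Prints roughly 10.0, i.e. ORDER_SIZE / HIT_RATE.
```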
Thinking in groups, thinking alone
Where do good ideas come from? A recent issue of the New Yorker offers up a few answers, in a fascinating article on the science of groupthink by Jonah Lehrer, who debunks some widely cherished notions about creative collaboration. Lehrer suggests that brainstorming—narrowly defined as a group activity in which a roomful of people generates as many ideas as possible without pausing to evaluate or criticize—is essentially useless, or at least less effective than spirited group debate or working alone. The best kind of collaboration, he says, occurs when people from diverse backgrounds are thrown together in an environment where they can argue, share ideas, or simply meet by chance, and he backs this up with an impressive array of data, ranging from studies of the genesis of Broadway musicals to the legendary Building 20 at MIT, where individuals as different as Amar Bose and Noam Chomsky thrived in an environment in which the walls between disciplines could literally be torn down.
What I love about Lehrer’s article is that its vision of productive group thinking isn’t that far removed from my sense of what writers and other creative artists need to do on their own. The notion of subjecting the ideas generated in brainstorming sessions to a rigorous winnowing process has close parallels to Dean Simonton’s Darwinian model of creativity: quality, he notes, is a probabilistic function of quantity, so the more ideas you have, the better—but only if they’re subjected to the discipline of natural selection. This selection can occur in the writer’s mind, in a group, or in the larger marketplace, but the crucial thing is that it take place at all. Free association or productivity isn’t enough without that extra step of revision, or rendering, which in most cases requires a strong external point of view. Hence the importance of outside readers and editors to every writer, no matter how successful.
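Simonton’s point lends itself to a toy model, and I should stress that this is only a cartoon of the idea, not his actual mathematics: assign each idea a random quality, and watch how the best survivor improves with the size of the pool, provided that something is doing the selecting.

```python
import random

def best_survivor(n_ideas, trials=10_000):
    """Average quality of the single best idea kept from a pool of n.

    Quality is drawn uniformly from [0, 1), a deliberately crude
    stand-in for Simonton's Darwinian model, not the model itself.
    """
    return sum(max(random.random() for _ in range(n_ideas))
               for _ in range(trials)) / trials

for n in (1, 5, 20, 100):
    print(f"{n:3d} ideas -> best survivor averages {best_survivor(n):.3f}")

# Quantity pays off only through selection: the average idea in the
# pool stays at 0.5 no matter how many of them you generate.
```

The winnowing is the whole game, which is why a pile of pages means nothing until someone, whether the writer or an outside reader, throws most of them away.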
The premise that creativity flowers most readily from interactions between people from different backgrounds has parallels in one’s inner life as well. In The Act of Creation, Arthur Koestler concludes that bisociation, or the intersection of two unrelated areas of knowledge in unexpected ways, is the ultimate source of creativity. On the highest plane, the most profound innovations in science and the arts often occur when an individual of genius changes fields. On a more personal level, nearly every good story idea I’ve ever had came from the juxtaposition of two previously unrelated concepts, either done on purpose—as in my focused daydreaming with science magazines, which led to stories like “Kawataro,” “The Boneless One,” and “Ernesto”—or by accident. Even accidents, however, can benefit from careful planning, as in the design of the Pixar campus, as conceived by Steve Jobs, in which members of different departments have no choice but to cross paths on their way to the bathroom or cafeteria.
Every creative artist needs to find ways of maximizing this sort of serendipity in his or her own life. My favorite personal example is my own home library: partially out of laziness, my bookshelves have always been a wild jumble of volumes in no particular order, an arrangement that sometimes makes it hard to find a specific book when I need it, but also leads to serendipitous arrangements of ideas. I’ll often be looking for one book when another catches my eye, even if I haven’t read it in years, which takes me, in turn, in unexpected directions. Even more relevant to Lehrer’s article is the importance of talking to people from different fields: writers benefit enormously from working around people who aren’t writers, which is why college tends to be a more creatively fertile period than graduate school. “It is the human friction,” Lehrer concludes, “that makes the sparks.” And we should all arrange our lives accordingly.
A year’s worth of reading
These days, I’m fortunate enough to have more work than I can handle, which also means that I no longer have much time to read for my own pleasure. The past year, in particular, was all business: I had just over nine months to take City of Exiles from conception to final draft, along with a number of other projects, which meant that nearly all my free time was devoted to either writing or research. All the same, I managed to make time to read a number of books that didn’t have anything to do with my work, either in my spare moments, on vacation, or in parallel with writing the novel itself. (Like many writers, I like to read a few pages of an author I admire before starting work for the day, which means that I tend to read books piecemeal over the course of many weeks or months.) And while I doubt I’ll ever return to being the sort of omnivorous reader I was growing up, it’s still important to me to read as much as possible, both for professional reasons and for the sake of my own sanity.
Much of this year was spent catching up on books that I’d been meaning to read for a long time. The best book I read this year, by far, was The Magic Mountain by Thomas Mann, which seems likely to stand as one of my ten favorite novels; close behind was Catch-22, which really does deserve its reputation as the most inventive comic novel of the twentieth century. Turning to slightly more recent books, I was able to catch up on such disparate works as The English Patient, Cloud Atlas, and The Time Traveler’s Wife, all of which I admired. Of these, the two that retain the strongest hold on my imagination are John Crowley’s Little, Big, despite my mixed feelings on reading it for the first time, and J.M. Coetzee’s Disgrace, which strikes me as one of the most perfect of all recent novels. More disappointing were London Fields, Updike’s Terrorist, and, somewhat to my surprise, A Confederacy of Dunces, which I found clumsy and only intermittently engaging, despite its reputation as a classic.
Of books published in the last few years, my reading consisted mostly of nonfiction, despite my nagging resolve to read more contemporary novels. I greatly enjoyed The Immortal Life of Henrietta Lacks by Rebecca Skloot, which is a model of both popular science and investigative journalism. Like everybody else, I bought and read Steve Jobs by Walter Isaacson, which is short on analysis but long on fascination—more a gold mine of material than a real portrait, but still an essential document. I read The New York Regional Mormon Singles Halloween Dance by Elna Baker partly as background material for my novel, but was ultimately won over by Baker’s genuine wit and candor—it’s one of the funniest books I’ve read in a long time. And although The Possessed by Elif Batuman was a little thin, like a selection of essays in search of a theme, it made me curious to see what she’ll do next, given a more substantial project.
As for the coming year, as before, I expect that most of my time will be spent on background reading and research. Still, I have a few other authors I’ve been meaning to try. I’m going to read DeLillo for the first time, probably starting with Underworld, and then the later Philip Roth, beginning with American Pastoral. If I’m feeling really ambitious, I’ll tackle Faulkner, Morrison, and Perec’s Life: A User’s Manual as well. Above all else, I’m going to make a concerted effort to read more contemporary fiction. A glance at the bookshelves in the next room—the property of my wife, who is a much better reader than I am—reveals such titles as A Visit From the Goon Squad, Swamplandia!, and The Magicians, all of which have been beckoning to me for some time now. These days, of course, even my leisure reading has something mercenary about it, as I look for tricks and techniques to borrow or steal. As the year goes on, then, I hope to have a chance to talk more about these books, and if all goes well, I’ll have a few useful things to share, too.
The Oxford English Addictionary
As I mentioned yesterday, the day after Thanksgiving, I found myself the proud new owner of the Compact Edition of the Oxford English Dictionary. I was all set to sit down with my prize, but unfortunately, fate had other plans: the following day, I was on a plane to Hong Kong, and spent the next two weeks traveling there and in China. And while it was a wonderful trip, I have to admit that my thoughts occasionally strayed back home, where my dictionary was patiently waiting. Upon my return, then, I threw down my suitcase and all but tore into the dining room—the only place in the house with a table large enough to comfortably accommodate this kind of work—for some quality time with the OED. (My sister-in-law says that this serves as ample proof that I’m a huge nerd, which will surely come as a surprise to this blog’s regular readers.)
A few words about the Compact OED itself. As many of you probably know, the Compact Edition contains all the material of the twelve-volume Oxford English Dictionary, with the text photographically reduced so that four pages fit on each regular page of the two-volume version, to be read with an included magnifying glass. The copy I purchased is the first compact edition, not the second, which means that it lacks the supplements and updates that the dictionary has acquired since 1971. All the same, these updates comprise maybe five percent of the dictionary’s total length, and the older edition may actually be more useful for my purposes: with the four-up format, I can just about read the text even without the magnifying glass, while the latest edition is nine-up, making the type too small to browse conveniently.
And you need to be able to browse in this dictionary, which is a browser’s paradise. Opening it now at random, I’m at a loss as to where to begin: there’s Duumvirate, Dwale, and close to a whole page devoted to variations on Dwarf, with citations ranging from the year 1450 (“that wretchit dorche”) to 1846 (“If a dwarf on the shoulders of a giant can see further than the giant, he is no less a dwarf in comparison to the giant”). Turning to another page, we have Mithridatic, Mitraille (“Small missiles, as fragments of iron, heads of nails, etc. shot in masses from a cannon”), and Mitre (“A headband or fillet worn by ancient Greek women; also, a kind of head-dress common among Asiatics, the wearing of which by men was regarded by the Romans as a mark of effeminacy”). And these are just a few pages chosen randomly out of more than sixteen thousand. The result isn’t just a dictionary, but an entire world, at least the part described in English, and it offers a lifetime’s worth of exploration.
Clearly, I’m addicted: reading this dictionary makes me feel as if I’ll never need to do anything else. Steve Jobs once called the Whole Earth Catalog “Google in paperback form,” and the Compact OED, in this wonderfully browsable edition, gives me something of the same sensation: each page opens up onto new horizons, with one word leading to another, and to unexpected byways of etymology, history, and literature—as big as the Web, but richer and more nourishing. The result seems less like a book than a living being, vibrating with possibility even as it sits reassuringly on the shelf. Whether or not it will enhance my vocabulary remains to be seen, but it’s already had a fertilizing effect on my imagination. Perhaps it will for you as well, or for someone you love. After all, Christmas is coming. What better gift could there be?
“To be truly simple, you have to go really deep”
Simplicity isn’t just a visual style. It’s not just minimalism or the absence of clutter. It involves digging through the depth of the complexity. To be truly simple, you have to go really deep. For example, to have no screws on something, you can end up having a product that is so convoluted and complex. The better way is to go deeper with simplicity, to understand everything about it and how it’s manufactured. You have to deeply understand the essence of a product in order to be able to get rid of the parts that are not essential.
—Apple designer Jonathan Ive, quoted by Walter Isaacson in Steve Jobs
Steve Jobs and “the hippie Wikipedia”
With the unexpected resignation of Steve Jobs as chief executive of Apple, many of us, including me, have probably been inspired to revisit the legendary commencement address he gave at Stanford in 2005, which has deservedly become one of the most famous speeches of its kind. The entire address is worth reading, of course, but in particular, I’ve always loved its closing appreciation of The Whole Earth Catalog, which Jobs describes as “sort of like Google in paperback form.” More recently, a New York Times article on Jobs referred to it as “a kind of hippie Wikipedia.” Both characterizations are fairly accurate, but The Whole Earth Catalog is much more. For as long as I can remember, I’ve found it to be an invaluable guide and source of inspiration, and I can sincerely say that it deserves to be a part of every thinking person’s life.
Of course, I’m somewhat biased, because The Whole Earth Catalog is a product of a time and place that is close to my heart: the Bay Area of the 1970s, centered in particular on Berkeley, Sausalito, and Menlo Park. Stewart Brand, another singular visionary, founded the Catalog to provide access to tools for those interested in exploring a wide range of issues that remain important today, notably sustainable living, simplicity, and ecology in its original sense, which spans everything from environmentalism to the most straightforward kind of home economics. Above all, the Catalog was the expression of the same restless curiosity that informed the early years of Apple. It gave you the tools to investigate space exploration, personal computing, art, literature, anthropology, architecture, health, backpacking, mysticism, and much more, almost without end. And the most useful tools were books.
As a lifelong obsessive reader, I’m always looking for new things to read, and the classic editions of the Catalog have pointed me toward more great books, many neglected or out of print, than any other source. First and foremost is Christopher Alexander’s A Pattern Language, the best nonfiction book of the past fifty years, which gets a page of its own in the Catalog, with R.H. Blyth’s great, eccentric Zen in English Literature and Oriental Classics close behind. There’s The Plan of St. Gall in Brief; D’Arcy Wentworth Thompson’s classic On Growth and Form; and such odd, essential books as Soil and Civilization; Form, Function, and Design; Structures; The Prodigious Builders; The Natural Way to Draw; Poker: A Guaranteed Income for Life; Japanese Homes and Their Surroundings; and the works of Lewis Mumford and Buckminster Fuller. All these I owe to the Catalog.
And the Catalog itself is full of wisdom that doesn’t date: original essays, tidbits of advice in the writeups of individual books, ideas and inspirations all but tucked into the margin. I own three editions, but my favorite is The Next Whole Earth Catalog, which, at five pounds and fifteen by eleven inches, is as big as a paperback book can get. Opening it to any page reminds me at once of what really matters, a world of books, ideas, and simple living, and it has always steered me back on track whenever I’ve been tempted to stray. And Steve Jobs can probably say the same thing. At the end of his address at Stanford, he quotes four words from the back cover of the 1974 edition of the Catalog, which many have since misattributed to Jobs himself: “Stay hungry. Stay foolish.” And if the career of Steve Jobs is merely the most striking illustration of what these words can do, we can thank the Catalog for this as well.
Quote of the Day
The only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it.
—Steve Jobs, in a commencement speech at Stanford University