Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Steve Jobs’

A Fuller Life

I’m pleased to announce that I’ve finally figured out the subject of my next book, which will be a biography of the architect and futurist Buckminster Fuller. If you’re a regular reader of this blog, you probably know how much Fuller means to me, and I’m looking forward to giving him the comprehensive portrait that he deserves. (Honestly, that’s putting it mildly. I’ve known for over a week that I’ll have a chance to tackle this project, and I still can’t quite believe that it’s really happening. And I’m especially happy that my current publisher has agreed to give me a shot at it.) At first glance, this might seem like a departure from my previous work, but it presents an opportunity to explore some of the same themes from a different angle, and to see how they might play out in the real world. The timelines of the two projects largely coincide, with a group of subjects who were affected by the Great Depression, World War II, the Cold War, and the social upheavals of the sixties. All of them had highly personal notions about the fate of America, and Fuller used physical artifacts much as Campbell, Asimov, and Heinlein employed science fiction—to prepare their audiences for survival in an era of perpetual change. Fuller’s wife, Anne, played an unsung role in his career that recalls many of the women in Astounding. Like Campbell, he approached psychology as a category of physics, and he hoped to turn the prediction of future trends into a science in itself. His skepticism of governments led him to conclude that society should be changed through design, not political institutions, and like many science fiction writers, he acted as if all disciplines could be reduced to subsets of engineering. And for most of his life, he insisted that complicated social problems could be solved through technology.

Most of his ideas were expressed through the geodesic dome, the iconic work of structural design that made him famous—and I hope that this book will be as much about the dome as about Fuller himself. It became a universal symbol of the space age, and his reputation as a futurist may have been founded largely on the fact that his most recognizable achievement instantly evoked the landscape of science fiction. From the beginning, the dome was both an elegant architectural conceit and a potent metaphor. The concept of a hemispherical shelter that used triangular elements to enclose the maximum amount of space had been explored by others, but Fuller was the first to see it as a vehicle for social change. With design principles that could be scaled up or down without limitation, it could function as a massive commercial pavilion or as a house for hippies. (Ken Kesey dreamed of building a geodesic dome to hold one of his acid tests.) It could be made out of plywood, steel, or cardboard. A dome could be cheaply assembled by hand by amateur builders, which encouraged experimentation, and its specifications could be laid out in a few pages and shared for free, like the modern blueprints for printable houses. It was a hackable, open-source machine for living that spoke to the same men and women who were teaching themselves how to code. As I noted here recently, a teenager named Jaron Lanier, who was living in a tent with his father on an acre of desert in New Mexico, used nothing but the formulas in Lloyd Kahn’s Domebook to design and build a house that he called “Earth Station Lanier.” Lanier, who became renowned years later as a pioneer of virtual reality, never got over the experience. He recalled decades later: “I loved the place; dreamt about it while sleeping inside it.”
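The geometry behind those few pages of specifications is simple enough to sketch in code. As an illustration of my own (not drawn from Fuller’s actual plans), the Python snippet below subdivides one face of an icosahedron at frequency 2 and projects the new points onto the enclosing sphere—the scheme behind a basic “2V” geodesic dome—and recovers the fact that such a dome can be built from struts of only two distinct lengths:

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio

def normalize(p):
    """Project a point onto the unit sphere."""
    r = math.sqrt(sum(c * c for c in p))
    return tuple(c / r for c in p)

def sphere_midpoint(p, q):
    """Midpoint of a chord, pushed back out to the sphere's surface."""
    return normalize(tuple((a + b) / 2 for a, b in zip(p, q)))

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Three mutually adjacent vertices of an icosahedron inscribed in the
# unit sphere, taken from the standard (0, ±1, ±φ) construction.
A = normalize((0, 1, PHI))
B = normalize((0, -1, PHI))
C = normalize((PHI, 0, 1))

# Frequency-2 ("2V") subdivision: split each edge at its midpoint and
# project the new points onto the sphere, yielding four smaller triangles.
ab, bc, ca = sphere_midpoint(A, B), sphere_midpoint(B, C), sphere_midpoint(C, A)
triangles = [(A, ab, ca), (ab, B, bc), (ca, bc, C), (ab, bc, ca)]

# Collect the distinct strut lengths, rounding to group equal ones.
struts = sorted({round(dist(t[i], t[(i + 1) % 3]), 5)
                 for t in triangles for i in range(3)})
print(struts)  # only two distinct lengths: the "chord factors" of a 2V dome
```

Higher frequencies add more strut lengths, but the count grows slowly, which is part of why the design could be handed to amateur builders as a short table of chord factors rather than a full set of drawings.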

During his lifetime, Fuller was one of the most famous men in America, and he managed to become an idol to both the establishment and the counterculture. In the three decades since his death, his reputation has faded, but his legacy is visible everywhere. The influence of his geodesic structures can be seen in the Houston Astrodome, at Epcot Center, on thousands of playgrounds, in the dome tents favored by backpackers, and in the emergency shelters used after Hurricane Katrina. Fuller had a lasting impact on environmentalism and design, and his interest in unconventional forms of architecture laid the foundation for the alternative housing movement. His homegrown system of geometry led to insights into the biological structure of viruses and the logic of communications networks, and after he died, he was honored by the discoverers of a revolutionary form of carbon that resembled a geodesic sphere, which became known as fullerene, or the buckyball. And I’m particularly intrigued by his parallels to the later generation of startup founders. During the seventies, he was a hero to the likes of Steve Wozniak and Steve Jobs, who later featured him prominently in the first “Think Different” commercial, and he was the prototype of the Silicon Valley types who followed. He was a Harvard dropout who had been passed over by the college’s exclusive social clubs, and despite his lack of formal training, he turned himself into an entrepreneur who believed in changing society through innovative products and environmental design. Fuller wore the same outfit to all his public appearances, and his personal habits amounted to an early form of biohacking. (Fuller slept each day for just a few hours, taking a nap whenever he felt tired, and survived mostly on steak and tea.) His closest equivalent today may well be Elon Musk, which tells us a lot about both men.

And this project is personally significant to me. I first encountered Fuller through The Whole Earth Catalog, which opened its first edition with two pages dedicated to his work, preceded by a statement from editor Stewart Brand: “The insights of Buckminster Fuller initiated this catalog.” I was three years old when he died, and I grew up in the shadow of his influence in the Bay Area. The week before my freshman year in high school, I bought a used copy of his book Critical Path, and I tried unsuccessfully to plow through Synergetics. (At the time, this all felt kind of normal, and it’s only when I look back that it seems strange—which tells you a lot about me, too.) Above all else, I was drawn to his reputation as the ultimate generalist, which reflected my idea of what my life should be, and I’m hugely excited by the prospect of returning to him now. Fuller has been the subject of countless other works, but never a truly authoritative biography, which is a project that meets both Susan Sontag’s admonition that a writer should try to be useful and the test that I stole from Lin-Manuel Miranda: “What’s the thing that’s not in the world that should be in the world?” Best of all, the process looks to be tremendously interesting for its own sake—I think it’s going to rewire my brain. It also requires an unbelievable amount of research. To apply the same balanced, fully sourced, narrative approach to his life that I tried to take for Campbell, I’ll need to work through all of Fuller’s published work, a mountain of primary sources, and what might literally be the largest single archive for any private individual in history. I know from experience that I can’t do it alone, and I’m looking forward to seeking help from the same kind of brain trust that I was lucky to have for Astounding. 

Those of you who have stuck with this blog should be prepared to hear a lot more about Fuller over the next three years, but I wouldn’t be doing this at all if I didn’t think that you might find it interesting. And who knows? He might change your life, too.

Written by nevalalee

November 16, 2018 at 8:50 am

Famous monsters of filmland

For his new book The Big Picture: The Fight for the Future of the Movies, the journalist Ben Fritz reviewed the emails from the hack of Sony Pictures, which are still available online. Whatever you might think about the ethics of using such material, it’s a gold mine of information about how Hollywood has done business over the last decade, and Fritz has come up with some fascinating nuggets. One of the most memorable finds is an exchange between studio head Amy Pascal and the producer Scott Rudin, who was trying to convince her to take a chance on Danny Boyle’s adaptation of Steve Jobs. Pascal had expressed doubts about the project, particularly over the casting of Michael Fassbender in the lead, and after arguing that it was less risky than The Social Network, Rudin delivered a remarkable pep talk:

You ought to be doing this movie—period—and you and I both know that the cold feet you are feeling is costing you this movie that you want. Once you have cold feet, you’re done. You’re making this decision in the anticipation of how you will be looked at in failure. That’s how you fail. So you’re feeling wobbly in the job right now. Here’s the fact: nothing conventional you could do is going to change that, and there is no life-changing hit that is going to fall into your lap that is not a nervous decision, because the big obvious movies are going to go elsewhere and you don’t have the IP right now to create them from standard material. You have this. Face it…Force yourself to muster some confidence about it and do the exact thing right now for which your career will be known in movie history: be the person who makes the tough decisions and sticks with them and makes the unlikely things succeed. Fall on your sword—when you’ve lost that, it’s finished. You’re the person who does these movies. That’s—for better or worse—who you are and who you will remain. To lose that is to lose yourself.

Steve Jobs turned out to be a financial disappointment, and its failure—despite the prestige of its subject, director, and cast—feels emblematic of the move away from films driven by stars to those that depend on “intellectual property” of the kind that Sony lacked. In particular, the movie industry seems to have shifted to a model perfected by Marvel Studios, which builds a cinematic universe that can drum up excitement for future installments and generate huge grosses overseas. Yet this isn’t exactly new. In the groundbreaking book The Genius of the System, which was published three decades ago, Thomas Schatz notes that Universal did much the same in the thirties, when it pioneered the genre of cinematic horror under founder Carl Laemmle and his son:

The horror picture scarcely emerged full-blown from the Universal machinery, however. In fact, the studio had been cultivating the genre for years, precisely because it played to Universal’s strengths and maximized its resources…Over the years Carl Laemmle built a strong international distribution system, particularly in Europe…[European filmmakers] brought a fascination for the cinema’s distinctly unrealistic qualities, its capacity to depict a surreal landscape of darkness, nightmare logic, and death. This style sold well in Europe.

After noting that the aesthetics of horror lent itself to movies built out of little more than shadows and fog, which were the visual effects of its time, Schatz continues: “This rather odd form of narrative economy was vitally important to a studio with limited financial resources and no top stars to carry its pictures. And in casting, too, the studio turned a limitation into an asset, since the horror film did not require romantic leads or name stars.”

The turning point was Tod Browning’s Dracula, a movie “based on a presold property” that could serve as an entry point for other films along the same lines. It didn’t require a star, but “an offbeat character actor,” and Universal’s expectations for it eerily foreshadow the way in which studio executives still talk today. Schatz writes:

Laemmle was sure it would [succeed]—so sure, in fact, that he closed the Frankenstein deal several weeks before Dracula’s February 1931 release. The Lugosi picture promptly took off at the box office, and Laemmle was more convinced than ever that the horror film was an ideal formula for Universal, given its resources and the prevailing market conditions. He was convinced, too, that he had made the right decision with Frankenstein, which had little presold appeal but now had the success of Dracula to generate audience anticipation.

Frankenstein, in short, was sort of like the Ant-Man of the thirties, a niche property that leveraged the success of its predecessors into something like real excitement. It worked, and Universal’s approach to its monsters anticipates what Marvel would later do on a vaster scale, with “ambitious crossover events” like House of Frankenstein and House of Dracula that combined the studio’s big franchises with lesser names that seemed unable to carry a film on their own. (If Universal’s more recent attempt to do the same with The Mummy fell flat, it was partially because it was unable to distinguish between the horror genre, the star picture, and the comic book movie, resulting in a film that turned out to be none of the above. The real equivalent today would be Blumhouse Productions, which has done a much better job of building its brand—and which distributes its movies through Universal.)

And the inability of such movies to provide narrative closure isn’t a new development, either. After seeing James Whale’s Frankenstein, Carl Laemmle, Jr. reacted in much the same way that executives presumably do now:

Junior Laemmle was equally pleased with Whale’s work, but after seeing the rough cut he was certain that the end of the picture needed to be changed. His concerns were twofold. The finale, in which both Frankenstein and his monster are killed, seemed vaguely dissatisfying; Laemmle suspected that audiences might want a glimmer of hope or redemption. He also had a more pragmatic concern about killing off the characters—and thus any possibility of sequels. Laemmle now regretted letting Professor Van Helsing drive that stake through Count Dracula’s heart, since it consigned the original character to the grave…Laemmle was not about to make the same mistake by letting that angry mob do away with the mad doctor and his monster.

Whale disagreed, but he was persuaded to change the ending after a preview screening, leaving open the possibility that the monster might have survived. Over eight decades later, Joss Whedon offered a similar explanation in an interview with Mental Floss: “It’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant…My feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.” For now, we’re living in a world made by the Universal monsters—and with only a handful of viable properties, half of which are owned by Disney. Without them, it might seem impossible, as Rudin said, “to create them from standard material.” But we’re also still waiting to be blindsided by the next great franchise. As another famous monster once put it: “A lot of times, people don’t know what they want until you show it to them.” And when it came to the movies, at least, Steve Jobs was right.

The closed circle

In his wonderful book The Nature of Order, the architect Christopher Alexander lists fifteen properties that characterize places and buildings that feel alive. (“Life” itself is a difficult concept to define, but we can come close to understanding it by comparing any two objects and asking the one question that Alexander identifies as essential: “Which of the two is a better picture of my self?”) These properties include such fundamentals of design as “Levels of Scale,” “Local Symmetries,” and “Positive Space,” and elements that are a bit trickier to pin down, including “Echoes,” “The Void,” and “Simplicity and Inner Calm.” But the final property, and the one that Alexander suggests is the most important, bears the slightly clunky name of “Not-Separateness.” He points to the Tower of the Wild Goose in China as an example of this quality at its best, and he says of its absence:

When a thing lacks life, is not whole, we experience it as being separate from the world and from itself…In my experiments with shapes and buildings, I have discovered that the other fourteen ways in which centers come to life will make a center which is compact, beautiful, determined, subtle—but which, without this fifteenth property, can still often somehow be strangely separate, cut off from what lies around it, lonely, awkward in its loneliness, too brittle, too sharp, perhaps too well delineated—above all, too egocentric, because it shouts, “Look at me, look at me, look how beautiful I am.”

The fact that he refers to this property as “Not-Separateness,” rather than the more obvious “Connectedness,” indicates that he sees it as a reaction against the marked tendency of architects and planners to strive for distinctiveness and separation. “Those unusual things which have the power to heal…are never like this,” Alexander explains. “With them, usually, you cannot really tell where one thing breaks off and the next begins, because the thing is smokily drawn into the world around it, and softly draws this world into itself.” It’s a characteristic that has little to do with the outsized personalities who tend to be drawn to huge architectural projects, and Alexander firmly skewers the motivations behind it:

This property comes about, above all, from an attitude. If you believe that the thing you are making is self-sufficient, if you are trying to show how clever you are, to make something that asserts its beauty, you will fall into the error of losing, failing, not-separateness. The correct connection to the world will only be made if you are conscious, willing, that the thing you make be indistinguishable from its surroundings; that, truly, you cannot tell where one ends and the next begins, and you do not even want to be able to do so.

This doesn’t happen by accident, particularly when millions of dollars and correspondingly inflated egos are involved. (The most blatant way of separating a building from its surroundings is to put your name on it.) And because it explicitly asks the designer to leave his or her cleverness behind, it amounts to the ultimate test of the subordination of the self to the whole. You can do great work and still falter at the end, precisely because of the strengths that allowed you to get that far in the first place.

It’s hard for me to read these words without thinking of Apple’s new headquarters in Cupertino, variously known as the Ring and the Mothership, which is scheduled to open later this year. A cover story in Wired by Steven Levy describes it in enraptured terms, in which you can practically hear Also Sprach Zarathustra:

As we emerge into the light, the Ring comes into view. As the Jeep orbits it, the sun glistens off the building’s curved glass surface. The “canopies”—white fins that protrude from the glass at every floor—give it an exotic, retro-­future feel, evoking illustrations from science fiction pulp magazines of the 1950s. Along the inner border of the Ring, there is a walkway where one can stroll the three-quarter-mile perimeter of the building unimpeded. It’s a statement of openness, of free movement, that one might not have associated with Apple. And that’s part of the point.

There’s a lot to unpack here, from the reference to pulp science fiction to the notion of “orbiting” the building to the claim that the result is “a statement of openness.” As for the contrary view, here’s what another article in Wired, this one by Adam Rogers, had to say about it a month later:

You can’t understand a building without looking at what’s around it—its site, as the architects say. From that angle, Apple’s new [headquarters] is a retrograde, literally inward-looking building with contempt for the city where it lives and cities in general. People rightly credit Apple for defining the look and feel of the future; its computers and phones seem like science fiction. But by building a mega-headquarters straight out of the middle of the last century, Apple has exacerbated the already serious problems endemic to twenty-first-century suburbs like Cupertino—transportation, housing, and economics. Apple Park is an anachronism wrapped in glass, tucked into a neighborhood.

Without delving into the economic and social context, which a recent article in the New York Times explores from another perspective, I think it’s fair to say that Apple Park is an utter failure from the point of view of “Not-Separateness.” But this isn’t surprising. Employees may just be moving in now, but its public debut dates back to June 7, 2011, when Steve Jobs himself pitched it to the Cupertino City Council. Jobs was obsessed by edges and boundaries, both physical and virtual, insisting that the NeXT computer be a perfect cube and introducing millions of consumers to the word “bezel.” Compare this to what Alexander writes of boundaries in architecture:

In things which have not-separateness, there is often a fragmented boundary, an incomplete edge, which destroys the hard line…Often, too, there is a gradient of the boundary, a soft edge caused by a gradient in which scale decreases…so that at the edge it seems to melt indiscernibly into the next thing…Finally, the actual boundary is sometimes rather careless, deliberately placed to avoid any simple complete sharp cutting off of the thing from its surroundings—a randomness in the actual boundary line which allows the thing to be connected to the world.

The italics are mine, because it’s hard to imagine anything less like Jobs or the company he created. Apple Park is being positioned as Jobs’s posthumous masterpiece, which reminds me of the alternate wording to Alexander’s one question: “Which one of these two things would I prefer to become by the day of my death?” (If the building is a monument to Jobs, it’s also a memorial to the ways in which he shaded imperceptibly into Trump, who also has a fixation with borders.) It’s the architectural equivalent of the design philosophy that led Apple to glue in its batteries and made it impossible to upgrade the perfectly cylindrical Mac Pro. Apple has always loved the idea of a closed system, and now its employees get to work in one.

Written by nevalalee

July 5, 2017 at 8:59 am

The Ian Malcolm rule

A man is rich in proportion to the number of things he can afford to leave alone.

—Henry David Thoreau, Walden

Last week, at the inaugural town hall meeting at Facebook headquarters, one brave questioner managed to cut through the noise and press Mark Zuckerberg on the one issue that really matters: what’s the deal with that gray shirt he always wears? Zuckerberg replied:

I really want to clear my life to make it so I have to make as few decisions as possible about anything except best how to serve this community…I’m in this really lucky position where I get to wake up every day and help serve more than a billion people. And I feel like I’m not doing my job if I spend any of my energy on things that are silly or frivolous about my life…So even though it kind of sounds silly—that that’s my reason for wearing a gray t-shirt every day—it also is true.

There’s a surprising amount to unpack here, starting with the fact, as Allison P. Davis of New York Magazine points out, that it’s considerably easier for a young white male to wear the same clothes every day than it is for a woman in the same situation. It’s also worth noting that wearing the exact same shirt each day turns simplicity into a kind of ostentation: there are ways of minimizing the amount of time you spend thinking about your wardrobe without calling attention to it so insistently.

Of course, Zuckerberg is only the latest in a long line of high-achieving nerds who insist, rightly or wrongly, that they have more important things to think about than what they’re going to wear. There’s more than an echo here of the dozens of black Issey Miyake turtlenecks that were stacked in Steve Jobs’s closet, and in the article linked above, Vanessa Friedman of The New York Times also notes that Zuckerberg sounds a little like Obama, who told Michael Lewis in Vanity Fair: “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.” Even Christopher Nolan gets into the act, as we learn in the recent New York Times Magazine profile by Gideon Lewis-Kraus:

Nolan’s own look accords with his strict regimen of optimal resource allocation and flexibility: He long ago decided it was a waste of energy to choose anew what to wear each day, and the clubbable but muted uniform on which he settled splits the difference between the demands of an executive suite and a tundra. The ensemble is smart with a hint of frowzy, a dark, narrow-lapeled jacket over a blue dress shirt with a lightly fraying collar, plus durable black trousers over scuffed, sensible shoes.

If you were to draw a family tree connecting all these monochromatic Vulcans, you’d find that, consciously or not, they’re all echoing their common patron saint, Ian Malcolm in Jurassic Park, who says:

In any case, I wear only two colors, black and gray…These colors are appropriate for any occasion…and they go well together, should I mistakenly put on a pair of gray socks with my black trousers…I find it liberating. I believe my life has value, and I don’t want to waste it thinking about clothing.

As Malcolm speaks, Crichton writes, “Ellie was staring at him, her mouth open”—apparently stunned into silence, as all women would be, at this display of superhuman rationality. And while it’s easy to make fun of it, I’m basically one of those guys. I eat the same breakfast and lunch every day; my daily uniform of polo shirt, jeans, and New Balance sneakers rarely, if ever, changes; and I’ve had the same haircut for the last eighteen years. If pressed, I’d probably offer a rationale more or less identical to the ones given above. As a writer, I’m called upon to solve a series of agonizingly specific problems each time I sit down at my desk, so the less headspace I devote to everything else, the better.

Which is all well and good. But it’s also easy to confuse the externals with their underlying intention. The world, or at least the Bay Area, is full of young guys with the Zuckerberg look, but it doesn’t matter how little time you spend getting dressed if you aren’t mindfully reallocating the time you save, or extending the principle beyond the closet. The most eloquent defense of minimizing extraneous thinking was mounted by the philosopher Alfred North Whitehead, who writes:

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

Whitehead isn’t talking about his shirts here; he’s talking about the Arabic number system, a form of “good notation” that frees the mind to think about more complicated problems. Which only reminds us that the shirts you wear won’t make you more effective if you aren’t being equally thoughtful about the decisions that really count. Otherwise, they’re only an excuse for laziness or indifference, which is just as contagious as efficiency. And it often comes to us as a wolf in nerd’s clothing.
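Whitehead’s example rewards being made concrete. As a toy illustration of my own (not anything from Whitehead), here’s what arithmetic looks like when the notation refuses to do the work for you: adding two Roman numerals in Python means detouring through the positional system, which is exactly the “good notation” that makes the sum mechanical.

```python
# Value/symbol pairs in descending order, including the subtractive forms.
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def int_to_roman(n):
    """Greedy conversion from positional notation into Roman numerals."""
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

def roman_to_int(s):
    """Parse subtractive notation: a symbol smaller than its successor counts as negative."""
    values = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    total = 0
    for i, ch in enumerate(s):
        v = values[ch]
        total += -v if i + 1 < len(s) and v < values[s[i + 1]] else v
    return total

# There is no simple symbol-by-symbol rule for "XLVII + XIV"; the easy path
# runs through the positional system and back.
print(int_to_roman(roman_to_int("XLVII") + roman_to_int("XIV")))  # prints LXI
```

The operations themselves haven’t changed; the notation has simply absorbed the thinking, which is Whitehead’s cavalry charge in miniature.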

On not knowing what you’re doing

A few days ago, I stumbled across the little item that The Onion ran shortly after the death of Steve Jobs: “Last American Who Knew What The Fuck He Was Doing Dies.” It’s especially amusing to read it now, at a time when the cult of adulation that surrounded Jobs seems to be in partial retreat. These days, it’s impossible to find an article about, say, the upcoming biopic written by Aaron Sorkin without a commenter bringing up all the usual counterarguments: Jobs was fundamentally a repackager and popularizer of other people’s ideas, he was a bully and a bad boss, he hated to share credit, he benefited enormously from luck and good timing, and he pushed a vision of simplicity and elegance that only reduces the user’s freedom of choice. There’s a lot of truth to these points. Yet the fact remains that Jobs did know what he was doing, or at least that he carefully cultivated the illusion that he did, and he left a void in the public imagination that none of his successors have managed to fill. He was fundamentally right about a lot of things for a very long time, and the legacy he left continues to shape our lives, in ways both big and small, one minute after another.

And that Onion headline has been rattling around in my head for most of the week, because I often get the sense I don’t really know what I’m doing, as a writer, as a dad, or as a human being. I do my best to stick to the channel, as Stanislavski would say: I follow the rules I know, maintain good habits, make my lists, and seek out helpful advice wherever I can find it. I have what I think is a realistic sense of my own strengths and weaknesses; I’m a pretty good writer and a pretty good father. But there’s no denying that writing a novel and raising a child are tasks of irreducible complexity, particularly when you’re trying to do both at the same time. Writing, like parenting, imposes a state of constant creative uncertainty: just because you had one good idea or wrote a few decent pages yesterday is no guarantee that you’ll be able to do the same today. If I weren’t fundamentally okay with that, I wouldn’t be here. But there always comes a time when I find myself repeating that line from Calvin and Hobbes I never tire of quoting: “I don’t think I’d have been in such a hurry to reach adulthood if I’d known the whole thing was going to be ad-libbed.”

My only consolation is that I’m not alone. Recently, I’ve been rereading The Magus by John Fowles, a novel that made a huge impression on me when I first encountered it over twenty years ago. In places, it feels uncomfortably like the first work of a young man writing for other young men, but it still comes off as spectacularly assured, which is why it’s all the more striking to read what Fowles has to say about it in his preface:

My strongest memory is of constantly having to abandon drafts because of an inability to describe what I wanted…The Magus remains essentially where a tyro taught himself to write novels—beneath its narrative, a notebook of an exploration, often erring and misconceived, into an unknown land. Even in its final published form it was a far more haphazard and naïvely instinctive work than the more intellectual reader can easily imagine; the hardest blows I had to bear from critics were those that condemned the book as a coldly calculated exercise in fantasy, a cerebral game. But then one of the (incurable) faults of the book was the attempt to conceal the real state of endless flux in which it was written.

Fowles is being consciously self-deprecating, but he hits on a crucial point, which is that most novels are designed to make a story that emerged from countless wrong turns and shots in the dark seem inevitable. In fact, it’s a little like being a parent, or a politician, or the CEO of a major corporation: you need to project an air of authority even if you don’t have the slightest idea if you’re doing the right thing. (And just as you can’t fully appreciate your own parents until you’ve had a kid of your own, you can’t understand the network of uncertainties underlying even the most accomplished novel until you’ve written a few for yourself.) I’d like to believe that the uncertainties, doubts, and fears that persist throughout are a necessary corrective, a way of keeping us humble in the face of challenges that can’t be reduced to a few clear rules. The real danger isn’t being unsure about what comes next; it’s turning into a hedgehog in a world of foxes, convinced that we know the one inarguable truth that applies to every situation. In fiction, that kind of dogmatic certainty leads to formula or propaganda, and we’ve all seen its effects in business, politics, and parenting. It’s better, perhaps, to admit that we’re all faking it until we make it, and that we should be satisfied if we’re right ever so slightly more often than we’re wrong.

Written by nevalalee

October 20, 2014 at 8:59 am

The perfect bookstore

with 2 comments

The Next Whole Earth Catalog

I’ve written before about the end of browsing, but for me, it took place a little sooner than I expected. In the old days, I’d spend hours roaming through used bookstores like The Strand in New York and Open Books here in Chicago, keeping an eye out both for books I was hoping to find and for a few unexpected discoveries, but now, with a baby in tow, it’s hard to get into the timeless, transcendent state required for a deep dive into the perfect bookstore. My definition of the perfect used bookstore is a simple one: it needs to have an enormous inventory of interesting books, low prices, and the possibility of exciting serendipity. You shouldn’t know precisely what to expect going in, and if you do find the book you’re looking for, you feel a surge of delight in the same region of the brain that responds to varied, unpredictable pleasures. I thought I’d left this kind of browsing behind, but recently, I discovered a way to do it from the comfort of my own home. And although it really only works for the kind of idiosyncratic, obsessive browsing that I prefer, I’m sharing it here, in hopes that someone else will find it useful.

The first step is to get your hands on a copy of The Whole Earth Catalog. I’ve sung the praises of the Catalog here more than once, but even more than “Google in paperback form,” as Steve Jobs memorably called it, it’s a portable simulation of the perfect bookstore. It’s usually associated with the 1970s hippie culture of Berkeley and the rest of the East Bay, and not without reason: the older editions include several pages of resources on how to build your own geodesic dome. Really, though, it’s a book for curious readers of every persuasion. Every page is bursting with fascinating, often unfairly neglected or forgotten books on every subject imaginable: literature, art, science, history, philosophy, religion, design, and much more, along with the more famous sections on homesteading, environmentalism, and sustainable living. If you’re the kind of browser I have in mind, it’s the ultimate book of daydreams. (Any edition will work for our purposes, but if you can only get one, I’d recommend The Next Whole Earth Catalog, which gives you the greatest poundage per dollar and breathes the right air of intelligent funkiness.)

Better World Books

Next, you need to head over to Better World Books, my favorite online used bookstore. More specifically, you want to check out their Bargain Bin, which allows you to buy four or more used books at a discount, usually translating to something like four books for $12. (You’ll also want to get on their email list for flash sales and special events, which can lead to even better deals.) Then you settle down in a comfortable chair—or maybe a bed—with The Whole Earth Catalog and start to browse, looking for a book or subject that catches your eye. Maybe it’s Form, Function and Design by Paul Jacques Grillo, or The Natural Way to Draw by Kimon Nicolaides, or Soil and Civilization by Edward S. Hyams, or the works of R. Buckminster Fuller. Then you check the Bargain Bin to see if the book you want is there. In my experience, four times out of ten, you’ll find it, which may not seem like a great percentage if you absolutely need a copy, but it’s ideal for browsing. Even better, since you need four or more books to qualify for the deal, you’ve got to keep going, and it’s often when you’re looking for one last book to fill out your order—and end up exploring unexpected nooks of the Catalog or your own imagination—that you make the most serendipitous discoveries.
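Out of curiosity, the arithmetic of this method can be sketched with a quick back-of-the-envelope simulation: if roughly four lookups in ten turn up a hit, filling a four-book order should take about ten titles’ worth of browsing on average. (The hit rate and order size below are just the informal figures from my own experience, not anything rigorous.)

```python
import random

def lookups_to_fill_order(hit_rate=0.4, books_needed=4, rng=None):
    """Count Bargain Bin searches until a four-book order is full."""
    rng = rng or random.Random()
    lookups = found = 0
    while found < books_needed:
        lookups += 1
        # Each Catalog find has roughly a 40% chance of being in the Bargain Bin.
        if rng.random() < hit_rate:
            found += 1
    return lookups

# Average over many simulated browsing sessions.
rng = random.Random(0)
trials = [lookups_to_fill_order(rng=rng) for _ in range(10_000)]
print(sum(trials) / len(trials))  # hovers around 10: about ten titles per order
```

Ten titles per order, in other words, is a feature rather than a bug: the misses are exactly what keep you leafing through the Catalog.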

Best of all, the Catalog is only a starting point. When you’re leafing through it, you may end up on the page devoted to computers and remember, as I did recently, that you’d been meaning to pick up a copy of the legendary handbook The C Programming Language—and bam, there it is for four dollars. Each page is likely to remind you of other books that you’ve long wanted to explore, and if you follow that train of thought wherever it leads, you’ll find yourself in some unexpected places. And it’s the peculiar constraint of the Bargain Bin, in which you might find the book you want for a wonderful price, that makes the exercise so rewarding, and so much like the classic used bookstore experience. (If you don’t have a copy of the Catalog, you can also use other books with big annotated bibliographies to spark your search: if you’re interested in the sciences, for instance, the one in Gödel, Escher, Bach is particularly good.) When you’re done, you’ll have a package on the way, and part of the fun of Better World Books, as opposed to Amazon Prime, is that you’re never quite sure when it will arrive, which gives each mail delivery an extra frisson of interest. I find myself doing this every month or two, whenever Better World Books has a sale, and I love it: it’s a sustaining shot of happiness for only ten dollars a pop. And my only problem is that I’m running out of shelf space.

Written by nevalalee

July 18, 2013 at 8:39 am

Thinking in groups, thinking alone

leave a comment »

Where do good ideas come from? A recent issue of the New Yorker offers up a few answers, in a fascinating article on the science of groupthink by Jonah Lehrer, who debunks some widely cherished notions about creative collaboration. Lehrer suggests that brainstorming—narrowly defined as a group activity in which a roomful of people generates as many ideas as possible without pausing to evaluate or criticize—is essentially useless, or at least less effective than spirited group debate or working alone. The best kind of collaboration, he says, occurs when people from diverse backgrounds are thrown together in an environment where they can argue, share ideas, or simply meet by chance, and he backs this up with an impressive array of data, ranging from studies of the genesis of Broadway musicals to the legendary Building 20 at MIT, where individuals as different as Amar Bose and Noam Chomsky thrived in an environment in which the walls between disciplines could literally be torn down.

What I love about Lehrer’s article is that its vision of productive group thinking isn’t that far removed from my sense of what writers and other creative artists need to do on their own. The idea of subjecting the ideas in brainstorming sessions to a rigorous winnowing process has close parallels to Dean Simonton’s Darwinian model of creativity: quality, he notes, is a probabilistic function of quantity, so the more ideas you have, the better—but only if they’re subjected to the discipline of natural selection. This selection can occur in the writer’s mind, in a group, or in the larger marketplace, but the crucial thing is that it take place at all. Free association or productivity isn’t enough without that extra step of revision, or rendering, which in most cases requires a strong external point of view. Hence the importance of outside readers and editors to every writer, no matter how successful.

The premise that creativity flowers most readily from interactions between people from different backgrounds has parallels in one’s inner life as well. In The Act of Creation, Arthur Koestler concludes that bisociation, or the intersection of two unrelated areas of knowledge in unexpected ways, is the ultimate source of creativity. On the highest plane, the most profound innovations in science and the arts often occur when an individual of genius changes fields. On a more personal level, nearly every good story idea I’ve ever had came from the juxtaposition of two previously unrelated concepts, either done on purpose—as in my focused daydreaming with science magazines, which led to stories like “Kawataro,” “The Boneless One,” and “Ernesto”—or by accident. Even accidents, however, can benefit from careful planning, as in the design of the Pixar campus, as conceived by Steve Jobs, in which members of different departments have no choice but to cross paths on their way to the bathroom or cafeteria.

Every creative artist needs to find ways of maximizing this sort of serendipity in his or her own life. My favorite personal example is my own home library: partially out of laziness, my bookshelves have always been a wild jumble of volumes in no particular order, an arrangement that sometimes makes it hard to find a specific book when I need it, but also leads to serendipitous arrangements of ideas. I’ll often be looking for one book when another catches my eye, even if I haven’t read it in years, which takes me, in turn, in unexpected directions. Even more relevant to Lehrer’s article is the importance of talking to people from different fields: writers benefit enormously from working around people who aren’t writers, which is why college tends to be a more creatively fertile period than graduate school. “It is the human friction,” Lehrer concludes, “that makes the sparks.” And we should all arrange our lives accordingly.

Written by nevalalee

February 1, 2012 at 10:26 am
