Posts Tagged ‘Apple’
The castle on the keyboard
In March, the graphic artist Susan Kare, who is best known for designing the fonts and icons for the original Apple Macintosh, was awarded a medal of recognition from the professional organization AIGA. It occurred to me to write a post about her work, but when I opened a gallery of her designs, I found myself sidetracked by an unexpected sensation. I felt happy. Looking at those familiar images—the Paintbrush, the Trash Can, even the Bomb—brought me as close as I’ve come in a long time to what Proust describes after taking a bite of the madeleine in the first volume of In Search of Lost Time:
Just as the Japanese amuse themselves by filling a porcelain bowl with water and steeping in it little crumbs of paper which until then are without character or form, but, the moment they become wet, stretch themselves and bend, take on color and distinctive shape, become flowers or houses or people, permanent and recognizable, so in that moment all the flowers in our garden…and the good folk of the village and their little dwellings and the parish church and the whole of Combray and of its surroundings, taking their proper shapes and growing solid, sprang into being, town and gardens alike, from my cup of tea.
In my case, it wasn’t a physical location that blossomed into existence, but a moment in my life that I’ve tried repeatedly to evoke here before. I was in my early teens, which isn’t a great period for anyone, and I can’t say that I was content. But for better or worse, I was becoming whatever I was supposed to be, and throughout much of that process, Kare’s icons provided the inescapable backdrop.
You could argue that nostalgia for computer hardware is a fairly recent phenomenon that will repeat itself in later generations, with children who are thirteen or younger today feeling equally sentimental toward devices that their parents regard with indifference—and you might be right. But I think that Kare’s work is genuinely special in at least two ways. One is that it’s a hallmark of perhaps the last time in history when a personal computer could feel like a beguiling toy, rather than an indispensable but utilitarian part of everyday life. The other is that her icons, with their handmade look and origins, bear the impression of another human being’s personality in ways that would all but disappear within a few years. As Alexandra Lange recounts in a recent profile of Kare:
In 1982, [Kare] was a sculptor and sometime curator when her high-school friend Andy Hertzfeld asked her to create graphics for a new computer that he was working on in California. Kare brought a Grid notebook to her job interview at Apple Computer. On its pages, she had sketched, in pink marker, a series of icons to represent the commands that Hertzfeld’s software would execute. Each square represented a pixel. A pointing finger meant “Paste.” A paintbrush symbolized “MacPaint.” Scissors said “Cut.” Kare told me about this origin moment: “As soon as I started work, Andy Hertzfeld wrote an icon editor and font editor so I could design images and letterforms using the Mac, not paper,” she said. “But I loved the puzzle-like nature of working in sixteen-by-sixteen and thirty-two-by-thirty-two pixel icon grids, and the marriage of craft and metaphor.”
That same icon editor, or one of its successors, was packaged with the Mac that I used, and I vividly remember clicking on that grid myself, shaping the building blocks of the interface in a way that seems hard to imagine now.
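For the curious, here is a rough sketch in Python of what that grid amounts to under the hood: an icon is just a small matrix of on-and-off pixels that can be stored in a handful of bytes and redrawn one square at a time. The eight-by-eight glyph below is an invented example, not one of Kare's actual designs, and the rendering is only meant to suggest how little data a hand-drawn icon really requires.

    # A hypothetical 8x8 bitmap, one string per row; '#' marks a lit pixel.
    # (Kare worked at 16x16 and 32x32, but the principle is identical.)
    GLYPH = [
        "..####..",
        ".#....#.",
        "#.#..#.#",
        "#......#",
        "#.#..#.#",
        "#..##..#",
        ".#....#.",
        "..####..",
    ]

    def render(glyph):
        """Print the bitmap, one character cell per pixel."""
        for row in glyph:
            print("".join("█" if c == "#" else "·" for c in row))

    render(GLYPH)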
And Kare seems to have valued these aspects of her work even at the time. There’s a famous series of photos of her in a cubicle at Apple in 1984, leaning back in her chair with one New Balance sneaker propped against her desk, looking impossibly cool. In one of the pictures, if you zoom in on the shelf of books behind her, it’s possible to make out a few titles, including the first edition of Symbol Sourcebook by Henry Dreyfuss, with an introduction by none other than R. Buckminster Fuller. Kare has spoken highly of this book elsewhere, most notably in an interview with Alex Pang of Stanford, to whom she explained:
One of my favorite parts of the book is its list of hobo signals, that hobos used to contact each other when they were on the road. They look like they’re in chalk on stones…When you’re desperate for an idea—some icons, like the piece of paper, are no problem; but others defy the visual, like “undo”—you look at things like hobo signs. Like this: “Man with a gun lives here.” Now, I can’t say that anything in this book is exactly transported into the Macintosh interface, but I think I got a lot of help from this, just thinking. This kind of symbol appeals to me because it had to be really simple, and clear to a group of people who were not going to be studying these for years in academia. I don’t understand a lot of them—“These people are rich” is a top hat and a triangle—but I always had that at Apple. I still use it, and I’m grateful for it.
And it seems likely that this was the “symbol dictionary” in which Kare discovered the Bowen Knot, a symbol once used to indicate “interesting features” at Swedish campgrounds, which lives on as the Command icon on the Mac.
According to Kare, the Bowen Knot originally represented a castle with four turrets, and if you’re imaginative enough, you can imagine it springing into being from the keys to either side of the space bar, like the village from Proust’s teacup. Like the hobo signs, Kare’s icons are a system of signals left to those who might pass by in the future, and the fact that they’ve managed to survive at Apple in even a limited way is something of a miracle in itself. (As the tech journalist Mike Murphy recently wrote: “For whatever reason, Apple looks and acts far more like a luxury brand than a consumer-technology brand in 2018.” And there isn’t much room in that business for castles or hobo signs.) When you click through the emulated versions of the earliest models of the Macintosh on the Internet Archive, it can feel like a temporary return to those values, or like a visit to a Zen garden. Yet if we only try to recapture it, we miss the point. Toward the end of In Search of Lost Time, Proust experiences a second moment of revelation, when he stumbles in a courtyard and catches himself “on a flagstone lower than the one next it,” which reminds him of a similar sensation that he had once felt at the Baptistry of St. Mark in Venice. And what he says of this flash of insight reminds me of how I feel when I look at the Happy Mac, and all the possibilities that it once seemed to express:
As at the moment when I tasted the madeleine, all my apprehensions about the future, all my intellectual doubts, were dissipated. Those doubts which had assailed me just before, regarding the reality of my literary gifts and even regarding the reality of literature itself were dispersed as though by magic…Merely repeating the movement was useless; but if…I succeeded in recapturing the sensation which accompanied the movement, again the intoxicating and elusive vision softly pervaded me, as though it said, “Grasp me as I float by you, if you can, and try to solve the enigma of happiness I offer you.”
The closed circle
In his wonderful book The Nature of Order, the architect Christopher Alexander lists fifteen properties that characterize places and buildings that feel alive. (“Life” itself is a difficult concept to define, but we can come close to understanding it by comparing any two objects and asking the one question that Alexander identifies as essential: “Which of the two is a better picture of my self?”) These properties include such fundamentals of design as “Levels of Scale,” “Local Symmetries,” and “Positive Space,” and elements that are a bit trickier to pin down, including “Echoes,” “The Void,” and “Simplicity and Inner Calm.” But the final property, and the one that Alexander suggests is the most important, bears the slightly clunky name of “Not-Separateness.” He points to the Tower of the Wild Goose in China as an example of this quality at its best, and he says of its absence:
When a thing lacks life, is not whole, we experience it as being separate from the world and from itself…In my experiments with shapes and buildings, I have discovered that the other fourteen ways in which centers come to life will make a center which is compact, beautiful, determined, subtle—but which, without this fifteenth property, can still often somehow be strangely separate, cut off from what lies around it, lonely, awkward in its loneliness, too brittle, too sharp, perhaps too well delineated—above all, too egocentric, because it shouts, “Look at me, look at me, look how beautiful I am.”
The fact that he refers to this property as “Not-Separateness,” rather than the more obvious “Connectedness,” indicates that he sees it as a reaction against the marked tendency of architects and planners to strive for distinctiveness and separation. “Those unusual things which have the power to heal…are never like this,” Alexander explains. “With them, usually, you cannot really tell where one thing breaks off and the next begins, because the thing is smokily drawn into the world around it, and softly draws this world into itself.” It’s a characteristic that has little to do with the outsized personalities who tend to be drawn to huge architectural projects, and Alexander firmly skewers the motivations behind it:
This property comes about, above all, from an attitude. If you believe that the thing you are making is self-sufficient, if you are trying to show how clever you are, to make something that asserts its beauty, you will fall into the error of losing, failing, not-separateness. The correct connection to the world will only be made if you are conscious, willing, that the thing you make be indistinguishable from its surroundings; that, truly, you cannot tell where one ends and the next begins, and you do not even want to be able to do so.
This doesn’t happen by accident, particularly when millions of dollars and correspondingly inflated egos are involved. (The most blatant way of separating a building from its surroundings is to put your name on it.) And because it explicitly asks the designer to leave his or her cleverness behind, it amounts to the ultimate test of the subordination of the self to the whole. You can do great work and still falter at the end, precisely because of the strengths that allowed you to get that far in the first place.
It’s hard for me to read these words without thinking of Apple’s new headquarters in Cupertino, variously known as the Ring and the Mothership, which is scheduled to open later this year. A cover story in Wired by Steven Levy describes it in enraptured terms, in which you can practically hear Also Sprach Zarathustra:
As we emerge into the light, the Ring comes into view. As the Jeep orbits it, the sun glistens off the building’s curved glass surface. The “canopies”—white fins that protrude from the glass at every floor—give it an exotic, retro-future feel, evoking illustrations from science fiction pulp magazines of the 1950s. Along the inner border of the Ring, there is a walkway where one can stroll the three-quarter-mile perimeter of the building unimpeded. It’s a statement of openness, of free movement, that one might not have associated with Apple. And that’s part of the point.
There’s a lot to unpack here, from the reference to pulp science fiction to the notion of “orbiting” the building to the claim that the result is “a statement of openness.” As for the contrary view, here’s what another article in Wired, this one by Adam Rogers, had to say about it a month later:
You can’t understand a building without looking at what’s around it—its site, as the architects say. From that angle, Apple’s new [headquarters] is a retrograde, literally inward-looking building with contempt for the city where it lives and cities in general. People rightly credit Apple for defining the look and feel of the future; its computers and phones seem like science fiction. But by building a mega-headquarters straight out of the middle of the last century, Apple has exacerbated the already serious problems endemic to twenty-first-century suburbs like Cupertino—transportation, housing, and economics. Apple Park is an anachronism wrapped in glass, tucked into a neighborhood.
Without delving into the economic and social context, which a recent article in the New York Times explores from another perspective, I think it’s fair to say that Apple Park is an utter failure from the point of view of “Not-Separateness.” But this isn’t surprising. Employees may just be moving in now, but its public debut dates back to June 7, 2011, when Steve Jobs himself pitched it to the Cupertino City Council. Jobs was obsessed by edges and boundaries, both physical and virtual, insisting that the NeXT computer be a perfect cube and introducing millions of consumers to the word “bezel.” Compare this to what Alexander writes of boundaries in architecture:
In things which have not-separateness, there is often a fragmented boundary, an incomplete edge, which destroys the hard line…Often, too, there is a gradient of the boundary, a soft edge caused by a gradient in which scale decreases…so that at the edge it seems to melt indiscernibly into the next thing…Finally, the actual boundary is sometimes rather careless, deliberately placed to avoid any simple complete sharp cutting off of the thing from its surroundings—a randomness in the actual boundary line which allows the thing to be connected to the world.
The italics are mine, because it’s hard to imagine anything less like Jobs or the company he created. Apple Park is being positioned as Jobs’s posthumous masterpiece, which reminds me of the alternate wording to Alexander’s one question: “Which one of these two things would I prefer to become by the day of my death?” (If the building is a monument to Jobs, it’s also a memorial to the ways in which he shaded imperceptibly into Trump, who also has a fixation with borders.) It’s the architectural equivalent of the design philosophy that led Apple to glue in its batteries and made it impossible to upgrade the perfectly cylindrical Mac Pro. Apple has always loved the idea of a closed system, and now its employees get to work in one.
The frigid juicemaker
By now, many of you have probably heard of the sad case of Juicero, the technology startup that developed the world’s most advanced juicer, which retails for hundreds of dollars, only to be rocked by a Bloomberg report that revealed that its juice packs could just as easily be squeezed by hand. At first glance, this seems like another cautionary tale of Silicon Valley design gone wrong, along the lines of the $1,500 toaster oven, but its lessons are slightly more profound. A few days ago, Ben Einstein, a general partner at the venture capital firm Bolt, conducted a teardown of the Juicero Press to figure out why it was so costly, and he came away impressed by its design and construction: his writeup is filled with such phrases as “beautifully molded,” “a complex assembly with great attention to detail,” “painstakingly textured,” and “incredibly complex and beautifully engineered.” At one point, Einstein marvels: “The number, size, complexity and accuracy of these parts is somewhat mind-blowing for a young hardware startup.” The trouble, he points out, is that the cost of such components makes the juicer far more expensive than most consumers are willing to pay, and it could have delivered comparable performance at a lower price by rethinking its design. A Juicero Press uniformly compresses the entire surface of the juice pack, requiring thousands of pounds of force, while a human hand gets much the same result simply by squeezing it unevenly. Einstein concludes:
I have to believe the engineers that built this product looked at other ways of pressing the juice, but if the primary mechanism could apply force in a more focused way it could easily save hundreds of dollars off the shelf price of the product.
As it stands, the engineers at Juicero evidently “went wild,” building a beautifully made and confoundingly expensive product in the hopes that a market for it would somehow materialize. It’s like a juicer designed by Damien Hirst. In a peculiar way, it makes for a refreshing contrast to the usual hardware startup horror story, in which a company’s plans to build the world’s greatest espresso machine run aground on the inconvenient realities of manufacturing and supply chain management. Juicero’s engineers obviously knew what they were doing, at least on a technical level, but their pursuit of great design for its own sake appears to have blinded them to more practical realities. The market for juicers isn’t the same as that for fine watches, and its buyers have different motivations. In the absence of feedback from customers, the engineers went ahead and built a juicer for themselves, loading it with features that even the most discerning of users would either never notice or wouldn’t feel like factoring into the purchase price. In real estate terms, they overimproved it. When my wife and I bought our house six years ago, our realtor warned us against overspending on renovations—you don’t want to invest so much in the property that, if you sell it, you’re forced to list it at a point that doesn’t make sense for your block. The Juicero’s lovingly machined parts and moldings are the equivalent of granite countertops and a master bathroom in a neighborhood where homeowners are more interested in paying a reasonable price for a short walk to the train.
There are two big takeaways here. One is the fact that there’s no such thing as good design or engineering in isolation—you always have to keep the intended user in mind. The other is that aesthetic considerations or technical specifications aren’t sufficient guidelines in themselves, and that they have to be channeled in productive directions by other constraints. Elsewhere, I’ve noted that Apple’s cult of thinness seems to stem from the search for quantifiable benchmarks that can drive innovation. Lowering the price of its products would be an even better goal, although it isn’t one that Apple seems inclined to pursue. Juicero, to its detriment, doesn’t appear to have been overly concerned by either factor. A juicer that sits on your kitchen counter doesn’t need to be particularly light, and there’s little incentive to pare down the ounces. There clearly wasn’t much of an effort to keep down the price. A third potential source of constraints, and probably the best of all, is careful attention to the consumer, which didn’t happen here, either. As Einstein notes:
Our usual advice to hardware founders is to focus on getting a product to market to test the core assumptions on actual target customers, and then iterate. Instead, Juicero spent $120 million over two years to build a complex supply chain and perfectly engineered product that is too expensive for their target demographic.
Imagine a world where Juicero raised only $10 million and built a product subject to significant constraints. Maybe the Press wouldn’t be so perfectly engineered but it might have fewer features and cost a fraction of the original $699…Suddenly Juicero is incredibly compelling as a product offering, at least to this consumer.
And you don’t need to look hard to find equivalents in other fields. A writer who endlessly revises the same manuscript without seeking comments from readers—or sending it to agents or publishers—is engaging in the same cycle of destructive behavior. In The Art of Fiction, John Gardner talks about artistic frigidity, which he defines as a moral failing that confuses side issues with what really matters. The symptoms are much the same in literature as they are in engineering: “It is sometimes frigidity that leads writers to tinker, more and more obsessively, with form.” Juicero suffered from a kind of technological frigidity, as does its obvious role model, Apple, which seems increasingly obsessed with aesthetic considerations that either have a minimal impact on the user experience or actively undermine it. (We saw this most recently with the Mac Pro, which had a striking cylindrical design that was hard to configure and suffered from heating issues. As engineering chief Craig Federighi admitted: “I think we designed ourselves into a bit of a thermal corner.” And it seems only fitting that Apple’s frigidity led to a problem with heat.) Ordinary companies, or writers, have no choice but to adjust to reality. Deadlines, length limits, and the demands of the market all work together to enforce pragmatic compromises, and if you remain frigid, you die. As the world’s largest tech company, Apple has to actively seek out constraints that will rein in its worst impulses, much as successful writers need to find ways of imposing the same restrictions that existed when they were struggling to break in. As Juicero’s example demonstrates, a company that tries to ignore such considerations from the beginning may never get a chance to prove itself at all. Whether you’re a writer or an engineer, it’s easy to convince yourself that you’re selling juicers, but you’re not. You’re selling the juice.
The case against convenience
Last week, I finally bought a MacBook Pro. It’s a slightly older model, since I wanted the optical drive and the ports that Apple is busy prying away from its current generation of devices, and though it isn’t as powerful under the hood as most of its younger cousins, it’s by any measure the nicest laptop I’ve ever owned. (For the last few years, I’ve been muddling through with a refurbished MacBook that literally disintegrated beneath my fingers as I used it: the screws came out of the case, the plastic buckled and warped, and I ended up keeping it together with packing tape and prayer. If this new computer self-destructs, I assume that it won’t be in such a dramatic fashion.) And while it might seem strange that I sprang for a relatively expensive art object from Apple shortly after my conversion to an Android phone, my favorite thing about this new arrangement is that I don’t need to worry about syncing a damned thing. For years, keeping my laptop and my phone synced up was a minor but real annoyance, particularly on a computer that seemed to audibly gasp for air whenever I connected it with my iPhone. Now that I don’t have that option, it feels weirdly liberating. My smartphone is off in its own little world, interacting happily with my personal data through Google Photos and other apps, while my laptop has access to the same information without any need to connect to my phone, physically or otherwise. Each has its own separate umbilicus linking it with the cloud—and never the twain shall meet.
And there’s something oddly comforting about relegating these devices to two separate spheres, as defined by their incompatible operating systems. I’ve spoken here before about Metcalfe’s Law, which is a way of thinking about the links between nodes in a telecommunications network: in theory, the value of a network grows roughly with the square of the number of connected nodes. And while this may well be true of systems, like social media, in which each user occupies a single node, it’s a little different when you apply it to all the devices you own, since the complexity of overseeing those gadgets and their connections—which are entities in themselves—can quickly become overwhelming. Let’s say you have a laptop, a tablet, and a smartphone. If each connects separately with the cloud, you’ve only got three connections to worry about, and you can allocate separate headspace to each one. But if they’re connected with each other as well as the cloud, the number of potential connections increases to six. This may not sound like much, although even three extra connections can grow burdensome if you’re dealing with them every day. But it’s even worse than that: the connections don’t run in parallel, but form a web, so that any modification you make to one invisibly affects all the others. If you’re anything like me, you’ve experienced the frustration of trying to customize the way you interact with one device, only to find that you’ve inadvertently changed the settings on another. The result is a mare’s nest of incompatible preferences that generate unpredictable interference patterns.
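For anyone who wants to check the arithmetic, here is a rough sketch in Python. The device names are only placeholders; the point is that a star topology, in which each device talks to nothing but the cloud, grows linearly, while a full mesh of devices and cloud grows with the square of the number of nodes.

    from itertools import combinations

    def cloud_only_links(devices):
        """Star topology: each device syncs with the cloud and nothing else."""
        return len(devices)

    def fully_meshed_links(devices):
        """Every pair of nodes (devices plus the cloud) is connected."""
        nodes = list(devices) + ["cloud"]
        return len(list(combinations(nodes, 2)))  # n*(n-1)/2 for n nodes

    devices = ["laptop", "tablet", "smartphone"]
    print(cloud_only_links(devices))    # 3 connections to keep track of
    print(fully_meshed_links(devices))  # 6 once the devices also sync with each other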
Segregating all the parts of your digital life from one another takes away much of that confusion: you don’t have to think about any of it if your computer and your phone don’t speak a common language. (They can each talk to the cloud, but not to each other, which provides all the connectivity you need while keeping the nodes at arm’s length.) But Apple and other tech companies seem determined to combine all of our devices into one terrifying hydra of information. One of the big selling points of the last few Mac OS X updates has been a feature ominously known as Continuity: you can start writing an email or editing a document on one device and pick it up on another, or use your laptop or tablet to make calls through your phone. This sounds like a nice feature in theory, but on closer scrutiny, it falls apart. The whole point of owning multiple devices is that each one is best suited for a certain kind of activity: I don’t want to edit a text document on my phone or make a call on my laptop if I can possibly avoid it. It might be nice to have the option of resuming on one device where you left off somewhere else, but in practice, most of us structure our routines so that we don’t have to worry about that: we can always save something and come back to it, and if we can’t, it implies that we’re enslaved to our work in a way that makes a mockery of any discussion of convenience. And retaining that option, in the rare cases when it’s really useful, involves tethering ourselves to a whole other system of logins, notifications, and switching stations that clutter up the ordinary tasks that don’t require that kind of connectivity.
Is the result “convenient”? Maybe for a user assembling such a system from scratch, like Adam naming the animals. But if you’re at all intelligent or thoughtful about how you work, you’ve naturally built up existing routines that work for you alone, using the tools that you have available. No solution designed for everybody is going to be perfect for any particular person, and in practice, the “continuity” that it promises is really a series of discontinuous interruptions, as you struggle to reconcile your work habits with the prepackaged solution that Apple provides. That search for idiosyncratic, practical, and provisional solutions for managing information and switching between different activities is central to all forms of work, creative and otherwise, and an imperfect solution that belongs to you—even if it involves rearranging your plans, heaven forbid, to suit whatever device happens to be accessible at the time—is likely to be more useful than whatever Apple has in mind. And treating the different parts of your digital life as essentially separate seems like a good first step. When we keep each device in its own little silo, we have a decent shot at figuring out an arrangement that suits each one individually, rather than wrestling with the octopus of connectivity. In the long run, any version of convenience that has been imposed from the outside isn’t convenient at all. And that’s the inconvenient truth.
The droid you’re looking for
A few weeks ago, I replaced my iPhone 5 with a cheap Android device. Yesterday, Apple reported that its phone sales had slowed to their lowest growth rate ever. Clearly, the two facts are connected—and I’m not entirely kidding about this. My phone had been giving me problems for about six months, ever since the display was cracked in an accident last year, but I managed to soldier on with a replacement screen until shortly after Christmas. A gentle fall from my sofa to the carpeted floor was enough to finish it off, and dead lines appeared on the touchscreen, forcing me to rotate the phone repeatedly to perform even the simplest of tasks. (For a while, I seriously considered trying to write all of my texts without using the letter “P,” which was the only one that was permanently disabled.) Finally, I gave in. I was still several months away from my next upgrade, so I went to what my daughter called “the boring cell phone store” to evaluate my options. After about five minutes of consideration, I ended up shelling out forty bucks for a Chinese-made Android model, reasoning that it would make a serviceable interim phone, if nothing else, until I could shop around for a more lasting replacement. But as it turns out, I love this goddamned little phone. I like it so much, in fact, that it’s caused me to question my lifelong devotion to Apple, which has begun to seem like an artifact of a set of assumptions about consumer technology that no longer apply.
My new phone is a ZTE Maven that runs Android 5.1. Its specs aren’t particularly impressive: eight gigs of storage, a 4.5-inch screen, and a 1.2 GHz quad-core processor. Online reviews give it an average of three stars out of five. But I’ve been consistently delighted by it. The camera is significantly worse than the one on my old phone—a real issue for the parent of a three-year-old, since nearly every shot I take of my daughter, who refuses to sit still, ends up smeared into an indecipherable blur. But I’ve learned to live with it. And in every other respect, it either matches or exceeds my iPhone’s performance. Google apps, which I use for email, maps, and web browsing, load much faster and more smoothly than before. Notifications are so seamless that they take much of the fun out of checking my email: I simply know at a glance whether or not I’ve got a new message, while my old phone kept me in delightful suspense as it slowly spun its wheels. Eight gigabytes doesn’t leave me with any room for media, but between Google Photos, which I now use to store all of my old pictures in the cloud, and streaming music from Amazon and other sources, I don’t need to keep much of anything on the phone itself. And while it might seem unfair to compare a newish Android device to an iPhone that is several product cycles behind, the fact that the latter cost ten times as much is a big part of the reason that I held onto it for so long.
And to repeat: this phone cost forty dollars. It doesn’t matter if I drop it in the toilet or lose it or break it, or if my eye is caught by something new. There’s nothing stored on it that can’t be replaced. I have close to zero connection to it as a consumer fetish item, which, paradoxically, makes me regard it with even more affection than my old iPhone. If an iPhone lags, it feels like a betrayal; if this phone stalls for a few seconds, which happens rarely, I want to give it an encouraging pat on the head and say that I know it’s doing its best. And it constantly surprises me on the upside. Part of this is due to the relentless march of technology, in which a phone that would have seemed miraculous ten years ago is now being all but given away, but it more subtly reflects the fact that the actual phone is no longer a significant object in itself. Instead, it’s a tiny aperture that provides a window between two huge collections of information to either side. You could think of our online existence as an hourglass, with one large bulb encompassing the data and activities of our personal lives and the other embracing everything in the cloud. The slender neck connecting those two gigantic spheres is your phone—it’s the channel through which one passes into the other. And we’re at a stage where it doesn’t really matter what physical device provides the interface where those larger masses press together, like the film that forms between two soap bubbles. You could even argue that the cheaper the device, the more it fulfills its role as an intermediary, rather than as a focal point.
As a result, my crummy little Android phone—which, once again, isn’t even all that crummy—feels more like the future than whatever Apple is currently marketing. There’s something slightly disingenuous in the way Apple keeps pushing users to the cloud, to the extent of removing all of its old ports and optical drives, while still insisting that this incorporeal universe of information can best be accessed through a sleek, expensive machine. Elsewhere, I’ve written that Apple seems to use thinness or lightness as a quantifiable proxy for good design, and it could theoretically do the same with price, although the odds of this actually happening seem close to zero. Apple’s business model depends so much on charging a premium that it doesn’t feel much pressure to innovate below the line, at least not in ways that are visible to consumers. But this isn’t just about cheap phones: it’s about the question of whether we need any particular phone, rather than a series of convenient, provisional, and ultimately disposable lenses through which to see the content on the other side. The truth, I think, is that we don’t need much of a phone at all, or that it only has to be the phone we need for the moment—and it should be priced at a point at which we have no qualms about casually replacing it when necessary, any more than we’d think twice about buying a new light bulb when the old one burns out. If the Internet, as people never tire of telling us, is a utility like heat or running water, the phone isn’t even the fuse box: it’s the fuse. And it took a forty-dollar phone to get me to realize this.
Apple and the cult of thinness
Recently, I’ve been in the market for a new computer. After some thought, I’ve settled on an older model of the MacBook Pro, both because of its price and because it’s the last remaining Apple laptop with an optical drive, which I still occasionally use. The experience put me in mind of a cartoon posted yesterday on Reddit, which shows a conversation between an Apple user and a helpful technician: “So what’s this update you’re installing?” “I’m just removing your USB ports.” “Great!” Apple’s obsession with eliminating unsightly ports, as well as any other features that might interfere with a device’s slim profile, has long been derided, and the recent news that the headphone jack might disappear from the next iPhone has struck many users as a bridge too far. Over the last decade or so, Apple has seemed fixated on pursuing thinness and lightness above all else, even though consumers don’t appear to be clamoring for thinner laptops or phones, and devices that are pared down past a certain point suffer in terms of strength and usability. (Apple isn’t alone in this, of course. Last week, I purchased a Sony Blu-ray player to replace the aging hulk I’d been using for the last five years, and although I like the new one, the lack of a built-in display that provides information on what the player is actually doing is a minor but real inconvenience, and it’s so light that I often end up pushing it backward on the television stand when I press the power button. As far as I can tell, there’s no reason why any device that spends its entire life on the same shelf needs to be so small.)
Obviously, I’m not the first person to say this, and in particular, the design gurus Don Norman and Bruce Tognazzini wrote a long, devastating piece for Fast Company last month on Apple’s pursuit of beauty over functionality. But I’d like to venture an alternative explanation for why it has taken this approach. Apple is a huge corporation, and like all large businesses, it needs quantifiable benchmarks to drive innovation. Once any enterprise becomes big enough, qualitative metrics alone don’t cut it: you need something to which you can assign a number. And while you can’t quantify usability, or even beauty, you can quantify thinness and weight. Apple seems to be using the physical size of a device as a proxy for innovative thought about design, which isn’t so different from the strategy that many writers use during the revision process. I’ve written here before about how I sometimes set length limits for stories or individual chapters, and how this kind of writing by numbers forces me to be smarter and more rigorous about my choices. John McPhee says much the same thing in a recent piece in The New Yorker about the exercise of “greening,” as once practiced by Time, which involved cutting an arbitrary number of lines. As Calvin Trillin writes elsewhere: “I was surprised that what I had thought of as a tightly constructed seventy-line story—a story so tightly constructed that it had resisted the inclusion of that maddening leftover fact—was unharmed, or even improved, by greening ten percent of it. The greening I did in Time Edit convinced me that just about any piece I write could be improved if, when it was supposedly ready to hand in, I looked in the mirror and said sternly to myself ‘Green fourteen’ or ‘Green eight.’ And one of these days I’m going to begin doing that.”
Apple appears to have come to a similar conclusion about its devices, which is that by greening away weight and thickness, you end up with other desirable qualities. And it works—but only up to a point. As McPhee observes, greening is supposed to be invisible: “The idea is to remove words in such a manner that no one would notice that anything has been removed.” And once you pass beyond a certain limit, you risk omitting essential elements, as expressed in the book Behind the Seen by Charles Koppelman, which describes the process of the legendary film editor Walter Murch:
Murch also has his eye on what he calls the “thirty percent factor”—a rule of thumb he developed that deals with the relationship between the length of the film and the “core content” of the story. In general, thirty percent of a first assembly can be trimmed away without affecting the essential features of the script: all characters, action, story beats will be preserved and probably, like a good stew, enhanced by the reduction in bulk. But passing beyond the thirty percent barrier can usually be accomplished only by major structural alterations: the reduction or elimination of a character, or whole sequences—removing vital organs rather than trimming fat. “It can be done,” says Murch, “and I have done it on a number of films that turned out well in the end. But it is tricky, and the outcome is not guaranteed—like open-heart surgery. The patient is put at risk, and the further beyond thirty percent you go, the greater the risk.”
And Apple—which has had a long and productive relationship with Murch, a vocal champion of its Final Cut Pro software—should pay attention. In the past, the emphasis on miniaturization was undoubtedly a force for innovative solutions, but we’ve reached the point where the patient is being endangered by removing features of genuine utility. Murch’s thirty percent factor turns out to describe the situation at Apple eerily well: the earliest models of the MacBook Pro weighed about five and a half pounds, implying that once the weight was reduced below four pounds or so, vital organs would be threatened, which is exactly what happened. (Even more insidiously, the trend has spread into realms where the notion of thinness is entirely abstract, like the fonts that Apple uses for its mobile devices, which, as Norman and Tognazzini point out, are so slender that they’ve become difficult to read.) These changes aren’t driven by consumer demand, but by a corporate culture that has failed to recognize that its old ways of quantifying innovation no longer serve their intended purpose. The means have been confused with the end. Ultimately, I’m still a fan of Apple, and I’m still going to buy that MacBook. I don’t fault it for wanting to quantify its processes: it’s a necessary part of managing creativity on a large scale. But it has to focus on a number other than thickness or weight. What Apple needs is a new internal metric, similarly quantifiable, that reflects something that consumers actually want. There’s one obvious candidate: price. Instead of making everything smaller, Apple could focus on providing the same functionality, beauty, and reliability at lower cost. It would drive innovation just as well as size once did. But given Apple’s history, the chances of that happening seem very slim indeed.
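(As an aside, the back-of-the-envelope arithmetic behind that four-pound figure is easy to check; the five-and-a-half-pound weight is approximate, and the thirty percent cutoff is Murch's heuristic rather than a law.)

    original_weight_lbs = 5.5    # approximate weight of the earliest MacBook Pro
    trim_factor = 0.30           # Murch's "thirty percent factor"

    safe_floor = original_weight_lbs * (1 - trim_factor)
    print(f"Cutting past about {safe_floor:.2f} lbs means removing vital organs")
    # -> roughly 3.85 lbs, i.e. "below four pounds or so"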
The Ive Mind
Like many readers, I spent much of yesterday working my way through Ian Parker’s massive New Yorker profile of Apple designer Jonathan Ive. Over the years, we’ve seen plenty of extended feature pieces on Ive, who somehow manages to preserve his reputation as an intensely private man, but it feels like Parker set out to write the one to end them all: it’s well over fifteen thousand words long, and there were times, as I watched my progress creeping slowly by in the scroll bar, when I felt like I was navigating an infinite loop of my own. (It also closes in that abrupt New Yorker way that takes apparent pride in ending articles at the most arbitrary place possible, as if the writer had suffered a stroke before finishing the last paragraph.) Still, it’s a fine piece, crammed with insights, and I expect that I’ll read it again. I’ve become slightly less enamored of Apple ever since my latest MacBook started to disintegrate a few months after I bought it—by which I mean its screws popped out one by one and its plastic casing began to bubble alarmingly outward—but there’s no doubting Ive’s vision, intelligence, and ability to articulate his ideas.
Like a lot of Apple coverage, Parker’s article builds on the company’s mythology while making occasional stabs at deflating it, with paragraphs of almost pornographic praise alternating with a skeptical sentence or two. (“I noticed that, at this moment in the history of personal technology, Cook still uses notifications in the form of a young woman appearing silently from nowhere to hold a sheet of paper in his line of sight.”) And he’s best not so much at talking about Apple’s culture as at talking about how they talk about it. Here’s my favorite part:
[Ive] linked the studio’s work to NASA’s: like the Apollo program, the creation of Apple products required “invention after invention after invention that you would never be conscious of, but that was necessary to do something that was new.” It was a tic that I came to recognize: self-promotion driven by fear that one’s self-effacement might be taken too literally. Even as Apple objects strive for effortlessness, there’s clearly a hope that the effort required—the “huge degree of care,” the years of investigations into new materials, the months spent enforcing cutting paths in Asian factories—will be acknowledged.
I love this because it neatly encapsulates the neurosis at the heart of so much creative work, from fiction to industrial design. We’re constantly told that we ought to strive for simplicity, and that the finished product, to use one of Ive’s favorite terms, should seem “inevitable.” Yet we’re also anxious that the purity of the result not be confused with the ease of its creation. Writers want readers to accept a novel as a window onto reality while simultaneously noticing the thousands of individual choices and acts of will that went into fashioning it, which is inherently impossible. And it kills us. Writing a novel is a backbreaking process that wants to look as simple as life, and that contradiction goes a long way toward explaining why authors never feel as if they’ve received enough love: the terms of the game that they’ve chosen ensure that most of their work remains invisible. Novels, even mediocre ones, consist of “invention after invention after invention,” a daunting series, as T.H. White noted, of “nouns, verbs, prepositions, adjectives, reeling across the page.” And even when a story all but begs us to admire the brilliance of its construction, we’ll never see more than a fraction of the labor it required.
So what’s a creative artist to do? Well, we can talk endlessly about process, as Ive does, and dream of having a profile in The Paris Review, complete with images of our discarded drafts. Or we can push complexity to the forefront, knowing at least that it will be acknowledged, even if it goes against what we secretly believe about the inevitability of great art. (“The artist, like the God of creation, remains within or behind or beyond or above his handiwork, invisible, refined out of existence, indifferent, paring his fingernails,” James Joyce writes, and yet few other authors have been so insistent that we recognize his choices, even within individual words.) Or, if all else fails, we can rail against critics who seem insufficiently appreciative of how much work is required to make something feel obvious, or who focus on some trivial point while ignoring the agonies that went into a story’s foundations. None of which, of course, prevents us from taking the exact same attitude toward works of art made by others. Ultimately, the only solution is to learn to live with your private store of effort, uncertainty, and compromise, never advertising it or pointing to all your hard work as an excuse when it falls short. Because in the end, the result has to stand on its own, even if it’s the apple of your eye.
A lifetime’s work in one gigabyte
My laptop died yesterday. Shortly after I finished my usual rounds of The A.V. Club and Reddit over breakfast, it began making a pronounced clicking noise, and when I restarted it, I saw the image above, which is never a good sign. None of the recommended fixes seemed to work, and although I’m going to try reinstalling my system software at some point, it seems likely that this computer has given up the ghost. This isn’t entirely unexpected: it’s a refurbished MacBook, now discontinued, that I bought more than five years ago, and I’ve since replaced it with a newer model. The one that expired this week currently lives on my kitchen counter, where I don’t use it for much besides browsing the web and playing DVDs. (As an aside, I want to let Apple know that I’m never going to buy a laptop without an optical drive until I find an alternate way of playing my Simpsons commentary tracks, which means that any replacement I get will probably be a basic model running Linux or, sigh, Windows.)
Fortunately, everything on that particular hard drive had already been copied to my newer laptop, so I’m not worried about losing anything meaningful. Still, in a burst of belated backup paranoia, I spent a few minutes copying my most important files to a flash drive. These include drafts of all my novels and Analog stories, messages from earlier email accounts, various boring personal documents, like tax returns, and most of the written ephemera of my life since before I started college. Taken together, all these files amounted to a ridiculously small amount of disk space—maybe a gigabyte all told. And for a moment, I found myself somewhat discouraged by how little there was. Fifteen years of nonstop writing had resulted in significantly less information than exists in, say, a single episode of Top Chef on iTunes. And maybe ten percent of this reflects material that I’d regret losing, with the rest consisting of duplicate drafts or college term papers that I don’t remember writing in the first place.
Yet there’s something weirdly encouraging about the compactness of the written word. The drafts of my novels, which are about 100,000 words long, each amount to roughly 800 kilobytes. You couldn’t save a single rendered frame of animation in a file that size, but somehow each of those Word versions contains an entire story, with characters and events that mean a lot to me, and hopefully mean something to a few other readers. Text, after all, is the most efficient medium of communication we have. A single line—”I know a bank where the wild thyme blows”—can evoke endless associations, and it can be easily stored, transmitted, or memorized. And although most of us compose our stories these days on an absurdly expensive piece of technology, the fact that the product is so modest in size results directly from the fact that its underlying form, a string of alphabetic characters strung together in a particular order, can easily be generated with a pencil and paper.
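If you want to check that figure, the arithmetic is simple enough to sketch, assuming an average English word of five or six characters plus a space and one byte per character of plain text; a Word file adds some formatting overhead on top of the raw prose.

    words = 100_000
    avg_chars_per_word = 6    # rough average for English, counting the trailing space
    bytes_per_char = 1        # plain ASCII text; a Word file adds overhead on top

    plain_text_bytes = words * avg_chars_per_word * bytes_per_char
    print(f"{plain_text_bytes / 1024:.0f} KB of raw text")   # about 586 KB
    # With format overhead, 800 KB for a 100,000-word draft is entirely plausible.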
And this is something that deserves to be celebrated. The fact that a story written using a universally available medium can be distributed and experienced online, in print, or on someone’s phone or Kindle is pretty amazing, and it’s a big reason why I wanted to become a novelist, in preference to any other artistic career. Roughly speaking, the more disk space a work of art needs to be stored or transmitted, the greater the expense required to create it—although “expense” isn’t the same thing as effort or time. Good writing is just as cheap to produce, at least on a material level, as its laughable storage needs would imply. It allows a writer to work alone, making plenty of false starts and wrong turns, but always reassured by the fact that even a major revision costs nothing except time and sanity. The emotional and spiritual costs are beyond calculation, but the startup costs will always be zero. That gigabyte of content I’ve produced doesn’t seem like much, but if you look at it from another angle, it starts to look like freedom.
“To be truly simple, you have to go really deep”
Simplicity isn’t just a visual style. It’s not just minimalism or the absence of clutter. It involves digging through the depth of the complexity. To be truly simple, you have to go really deep. For example, to have no screws on something, you can end up having a product that is so convoluted and complex. The better way is to go deeper with simplicity, to understand everything about it and how it’s manufactured. You have to deeply understand the essence of a product in order to be able to get rid of the parts that are not essential.
—Apple designer Jonathan Ive, quoted by Walter Isaacson in Steve Jobs
Steve Jobs and “the hippie Wikipedia”
With the unexpected resignation of Steve Jobs as chief executive of Apple, many of us, including me, have probably been inspired to revisit the legendary commencement address he gave at Stanford in 2005, which has deservedly become one of the most famous speeches of its kind. The entire address is worth reading, of course, but in particular, I’ve always loved its closing appreciation of The Whole Earth Catalog, which Jobs describes as “sort of like Google in paperback form.” More recently, a New York Times article on Jobs referred to it as “a kind of hippie Wikipedia.” Both characterizations are fairly accurate, but The Whole Earth Catalog is much more. For as long as I can remember, I’ve found it to be an invaluable guide and source of inspiration, and I can sincerely say that it deserves to be a part of every thinking person’s life.
Of course, I’m somewhat biased, because The Whole Earth Catalog is a product of a time and place that is close to my heart: the Bay Area of the 1970s, centered in particular on Berkeley, Sausalito, and Menlo Park. Stewart Brand, another singular visionary, founded the Catalog to provide access to tools for those interested in exploring a wide range of issues that remain important today, notably sustainable living, simplicity, and ecology in its original sense, which spans everything from environmentalism to the most straightforward kind of home economics. Above all, the Catalog was the expression of the same restless curiosity that informed the early years of Apple. It gave you the tools to investigate space exploration, personal computing, art, literature, anthropology, architecture, health, backpacking, mysticism, and much more, almost without end. And the most useful tools were books.
As a lifelong obsessive reader, I’m always looking for new things to read, and the classic editions of the Catalog have pointed me toward more great books, many neglected or out of print, than any other source. First and foremost is Christopher Alexander’s A Pattern Language, the best nonfiction book of the past fifty years, which gets a page of its own in the Catalog, with R.H. Blyth’s great, eccentric Zen in English Literature and Oriental Classics close behind. There’s The Plan of St. Gall in Brief; D’Arcy Wentworth Thompson’s classic On Growth and Form; and such odd, essential books as Soil and Civilization; Form, Function, and Design; Structures; The Prodigious Builders; The Natural Way to Draw; Poker: A Guaranteed Income for Life; Japanese Homes and Their Surroundings; and the works of Lewis Mumford and Buckminster Fuller. All these I owe to the Catalog.
And the Catalog itself is full of wisdom that doesn’t date: original essays, tidbits of advice in the writeups of individual books, ideas and inspirations all but tucked into the margin. I own three editions, but my favorite is The Next Whole Earth Catalog, which, at five pounds and fifteen by eleven inches, is as big as a paperback book can get. Opening it to any page reminds me at once of what really matters, a world of books, ideas, and simple living, and it has always steered me back on track whenever I’ve been tempted to stray. And Steve Jobs can probably say the same thing. At the end of his address at Stanford, he quotes four words from the back cover of the 1974 edition of the Catalog, which many have since misattributed to Jobs himself: “Stay hungry. Stay foolish.” And if the career of Steve Jobs is merely the most striking illustration of what these words can do, we can thank the Catalog for this as well.