Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


The Uber Achievers


In 1997, the computer scientist Niklaus Wirth, best known as the creator of Pascal, gave a fascinating interview to the magazine Software Development, which I’ve quoted here before. When asked if it would be better to design programming languages with “human issues” in mind, Wirth replied:

Software development is technical activity conducted by human beings. It is no secret that human beings suffer from imperfection, limited reliability, and impatience—among other things. Add to it that they have become demanding, which leads to the request for rapid, high performance in return for the requested high salaries. Work under constant time pressure, however, results in unsatisfactory, faulty products.

When I read this quotation now, I think of Uber. As a recent story by Caroline O’Donovan and Priya Anand of Buzzfeed makes clear, the company that seems to have alienated just about everyone in the world didn’t draw the line at its own staff: “Working seven days a week, sometimes until 1 or 2 a.m., was considered normal, said one employee. Another recalled her manager telling her that spending seventy to eighty hours a week in the office was simply ‘how Uber works.’ Someone else recalled working eighty to one hundred hours a week.” One engineer, who is now in therapy, recalled: “It’s pretty clear that giving that much of yourself to any one thing is not healthy. There were days where I’d wake up, shower, go to work, work until midnight or so, get a free ride home, sleep six hours, and go back to work. And I’d do that for a whole week.”

“I feel so broken and dead,” one employee concluded. But while Uber’s internal culture was undoubtedly bad for morale, it might seem hard at first to make the case that the result was an “unsatisfactory, faulty” product. As a source quoted in the article notes, stress at the company led to occasional errors: “If you’ve been woken up at 3 a.m. for the last five days, and you’re only sleeping three to four hours a day, and you make a mistake, how much at fault are you, really?” Yet the Uber app itself is undeniably elegant and reliable, and the service that it provides is astonishingly useful—if it weren’t, we probably wouldn’t even be talking about it now. When we look at what else Wirth says, though, the picture becomes more complicated. All italics in the following are mine:

Generally, the hope is that corrections will not only be easy, because software is immaterial, but that the customers will be willing to share the cost. We know of much better ways to design software than is common practice, but they are rarely followed. I know of a particular, very large software producer that explicitly assumes that design takes twenty percent of developers’ time, and debugging takes eighty percent. Although internal advocates of an eighty percent design time versus twenty percent debugging time have not only proven that their ratio is realistic, but also that it would improve the company’s tarnished image. Why, then, is the twenty-percent design time approach preferred? Because with twenty-percent design time your product is on the market earlier than that of a competitor consuming eighty-percent design time. And surveys show that the customer at large considers a shaky but early product as more attractive than a later product, even if it is stable and mature.

This description applies perfectly to Uber, as long as we remember that its “product” isn’t bounded by its app alone, but extends to its impact on drivers, employees, competitors, and the larger community in which it exists—or what an economist would call its externalities. Taken as a closed system, the Uber experience is perfect, but only because it pushes its problems outside the bounds of the ride itself. When you look at the long list of individuals and groups that its policies have harmed, you discern the outlines of its true product, which can be described as the system of interactions between the Uber app and the world. You could say this of most kinds of software, but it’s particularly stark for a service that is tied to the problem of physically moving its customers from one point to another on the earth’s surface. By that standard, “shaky but early” describes Uber beautifully. It certainly isn’t “stable and mature.” The company expanded to monstrous proportions before basic logistical, political, and legal matters had been resolved, and it acted as if it could simply bull its way through any obstacles. (Its core values, let’s not forget, included “stepping on toes” and “principled confrontation.”) Up to a point, it worked, but something had to give, and economic logic dictated that the stress fall on the human factor, which was presumably resilient enough to absorb punishment from the design and technology sides. One of the most striking quotes in the Buzzfeed article comes from Uber’s chief human resources officer: “Many employees are very tired from working very, very hard as the company grew. Resources were tight and the growth was such that we could never hire sufficiently, quickly enough, in order to keep up with the growth.” To assert that “resources were tight” at the most valuable startup on the planet seems like a contradiction in terms, and it would be more accurate to say that Uber decided to channel massive amounts of capital in certain directions while neglecting those that it cynically thought could take it.

But it was also right, until it wasn’t. Human beings are extraordinarily resilient, as long as you can convince them to push themselves past the limits of their ability, or at least to do work at rates that you can afford. In the end, they burn out, but there are ways to postpone that moment or render it irrelevant. When it came to its drivers, Uber benefited from a huge pool of potential contractors, which made turnover a statistical, rather than an individual, problem. With its corporate staff and engineers, there was always the power of money, in the form of equity in the company, to persuade people to stay long past the point where they would have otherwise quit. The firm gambled that it would lure in plenty of qualified hires willing to trade away their twenties for the possibility of future wealth, and it did. (As the Buzzfeed article reveals, Uber seems to have approached compensation for its contractors and employees in basically the same way: “Uber acknowledges that it pays less than some of its top competitors for talent…The Information reported that Uber uses an algorithm to estimate the lowest possible compensation employees will take in order to keep labor costs down.”) When the whole system finally failed, it collapsed spectacularly, and it might help to think of Uber’s implosion, which unfolded over less than six months, as a software crash, with bugs that had been ignored or hastily patched over cascading in a chain reaction that brings down the entire program. And the underlying factor wasn’t just a poisonous corporate culture or the personality of its founder, but the sensibility that Wirth identified two decades ago, as a company rushed to get a flawed idea to market on the assumption that consumers—or society as a whole—would bear the costs of correcting it. As Wirth asks: “Who is to blame for this state of affairs? The programmer turned hacker; the manager under time pressure; the business man compelled to extol profit wherever possible; or the customer believing in promised miracles?”

Written by nevalalee

July 20, 2017 at 8:29 am

Children of the Lens


During World War II, as the use of radar became widespread in battle, the U.S. Navy introduced the Combat Information Center, a shipboard tactical room with maps, consoles, and screens of the kind that we’ve all seen on television and in the movies. At the time, though, it was like something out of science fiction, and in fact, back in 1939, E.E. “Doc” Smith had described a very similar display in the serial Gray Lensman:

Red lights are fleets already in motion…Greens are fleets still at their bases. Ambers are the planets the greens took off from…The white star is us, the Directrix. That violet cross way over there is Jalte’s planet, our first objective. The pink comets are our free planets, their tails showing their intrinsic velocities.

After the war, in a letter dated June 11, 1947, the editor John W. Campbell told Smith that the similarity was more than just a coincidence. Claiming to have been approached “unofficially, and in confidence” by a naval officer who played an important role in developing the C.I.C., Campbell said:

The entire setup was taken specifically, directly, and consciously from the Directrix. In your story, you reached the situation the Navy was in—more communications channels than integration techniques to handle it. You proposed such an integrating technique, and proved how advantageous it could be…Sitting in Michigan, some years before Pearl Harbor, you played a large share in the greatest and most decisive naval action of the recent war!

Unfortunately, this wasn’t true. The naval officer in question, Cal Laning, was indeed a science fiction fan—he was close friends with Robert A. Heinlein—but any resemblance to the Directrix was coincidental, or, at best, an instance of convergence as fiction and reality addressed the same set of problems. (An excellent analysis of the situation can be found in Ed Wysocki’s very useful book An Astounding War.)

If Campbell was tempted to overstate Smith’s influence, this isn’t surprising—the editor was disappointed that science fiction hadn’t played the role that he had envisioned for it in the war, and this wasn’t the first or last time that he would gently exaggerate it. Fifteen years later, however, Smith’s fiction had a profound impact on a very different field. In 1962, Steve Russell of M.I.T. developed Spacewar, the first video game to be played on more than one computer, in which two spaceships duel with torpedoes in the gravity well of a star. In an article for Rolling Stone written by my hero Stewart Brand, Russell recalled:

We had this brand new PDP-1…It was the first minicomputer, ridiculously inexpensive for its time. And it was just sitting there. It had a console typewriter that worked right, which was rare, and a paper tape reader and a cathode ray tube display…Somebody had built some little pattern-generating programs which made interesting patterns like a kaleidoscope. Not a very good demonstration. Here was this display that could do all sorts of good things! So we started talking about it, figuring what would be interesting displays. We decided that probably you could make a two-dimensional maneuvering sort of thing, and decided that naturally the obvious thing to do was spaceships…

I had just finished reading Doc Smith’s Lensman series. He was some sort of scientist but he wrote this really dashing brand of science fiction. The details were very good and it had an excellent pace. His heroes had a strong tendency to get pursued by the villain across the galaxy and have to invent their way out of their problem while they were being pursued. That sort of action was the thing that suggested Spacewar. He had some very glowing descriptions of spaceship encounters and space fleet maneuvers.

The “somebody” whom he mentions was Marvin Minsky, another science fiction fan, and Russell’s collaborator Martin Graetz elsewhere cited Smith’s earlier Skylark series as an influence on the game.
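The game that Russell describes was, at bottom, an exercise in applied physics: each ship is a position and a velocity, pulled toward the central star every frame and nudged by its own thrust. Here is a minimal sketch of that kind of update loop, offered purely as an illustration—it is not the original PDP-1 code, and the constants, units, and names are all invented:

```python
# A minimal sketch of the kind of physics Spacewar had to solve: a ship coasting
# in the gravity well of a central star, advanced in discrete time steps.
# This is NOT the original PDP-1 code; constants and units are made up for illustration.
import math

G_M = 1000.0   # gravitational parameter of the star (assumed)
DT = 0.1       # length of one simulation step (assumed)

def step(x, y, vx, vy, thrust=0.0, heading=0.0):
    """Advance one frame: gravity pulls toward the origin, thrust pushes along heading."""
    r = math.hypot(x, y)
    ax = -G_M * x / r**3 + thrust * math.cos(heading)
    ay = -G_M * y / r**3 + thrust * math.sin(heading)
    vx, vy = vx + ax * DT, vy + ay * DT
    return x + vx * DT, y + vy * DT, vx, vy

# Put a ship on a roughly circular orbit at radius 50 and let it coast.
x, y = 50.0, 0.0
vx, vy = 0.0, math.sqrt(G_M / 50.0)   # circular-orbit speed at that radius
for _ in range(300):
    x, y, vx, vy = step(x, y, vx, vy)
print(f"radius after 300 frames: {math.hypot(x, y):.1f}")   # stays close to 50
```

The torpedoes and the second ship are just more state vectors run through the same update each frame, which is why Campbell could later frame the game as a matter of “solving velocity and angular relation problems.”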

But the really strange thing is that Campbell, who had been eager to claim credit for Smith when it came to the C.I.C., never made this connection in print, at least not as far as I know, although he was hugely interested in Spacewar. In the July 1971 issue of Analog, he published an article on the game by Albert W. Kuhfeld, who had developed a variation of it at the University of Minnesota. Campbell wrote in his introductory note:

For nearly a dozen years I’ve been trying to get an article on the remarkable educational game invented at M.I.T. It’s a great game, involving genuine skill in solving velocity and angular relation problems—but I’m afraid it will never be widely popular. The playing “board” costs about a quarter of a megabuck!

Taken literally, the statement “nearly a dozen years” implies that the editor heard about Spacewar before it existed, but the evidence strongly suggests that he learned of it almost at once. Kuhfeld writes: “Although it uses a computer to handle orbital mechanics, physicists and mathematicians have no great playing advantage—John Campbell’s seventeen-year-old daughter beat her M.I.T. student-instructor on her third try—and thereafter.” Campbell’s daughter was born in 1945, which squares nicely with a visit around the time of the game’s first appearance. It isn’t implausible that Campbell would have seen and heard about it immediately—he had been close to the computer labs at Harvard and M.I.T. since the early fifties, and he made a point of dropping by once a year. If the Lensman series, the last three installments of which he published, had really been an influence on Spacewar, it seems inconceivable that nobody would have told him. For some reason, however, Campbell, who cheerfully promoted the genre’s impact on everything from the atomic bomb to the moon landing, didn’t seize the opportunity to do the same for video games, in an article that he badly wanted to publish. (In a letter to the manufacturers of the PDP-1, whom he had approached unsuccessfully for a writeup, he wrote: “I’ve tried for years to get a story on Spacewar, and I’ve repeatedly had people promise one…and not deliver.”)

So why didn’t he talk about it? The obvious answer is that he didn’t realize that Spacewar, which he thought would “never be widely popular,” was anything more than a curiosity, and if he had lived for another decade—he died just a few months after the article came out—he would have pushed the genre’s connection to video games as insistently as he did anything else. But there might have been another factor at play. For clues, we can turn to the article in Rolling Stone, in which Brand visited the Stanford Artificial Intelligence Laboratory with Annie Leibovitz, which is something that I wish I could have seen. Brand opens with the statement that computers are coming to the people, and he adds: “That’s good news, maybe the best since psychedelics.” It’s a revealing comparison, and it indicates the extent to which the computing world was moving away from everything that Campbell represented. A description of the group’s offices at Stanford includes a detail that, if Campbell had read it, would only have added insult to injury:

Posters and announcements against the Vietnam War and Richard Nixon, computer printout photos of girlfriends…and signs on every door in Tolkien’s elvish Fëanorian script—the director’s office is Imladris, the coffee room The Prancing Pony, the computer room Mordor. There’s a lot of hair on those technicians, and nobody seems to be telling them where to scurry.

In the decade since the editor first encountered Spacewar, a lot had changed, and Campbell might have been reluctant to take much credit for it. The Analog article, which Brand mentions, saw the game as a way to teach people about orbital mechanics; Rolling Stone recognized it as a leading indicator of a development that was about to change the world. And even if he had lived, there might not have been room for Campbell. As Brand concludes:

Spacewar as a parable is almost too pat. It was the illegitimate child of the marrying of computers and graphic displays. It was part of no one’s grand scheme. It served no grand theory. It was the enthusiasm of irresponsible youngsters. It was disreputably competitive…It was an administrative headache. It was merely delightful.

The long con


In 1969, Jay W. Forrester, a computer scientist and professor of management at M.I.T., published the paper “Planning Under the Dynamic Influences of Complex Social Systems.” Despite its daunting title, it’s really about the limits of intuition, which is a subject that I’ve often discussed here before. Forrester, who died just last year, specialized in the analysis of complex systems, which he defined as “high-order, multiple-loop, nonlinear, feedback structures,” such as cities, governments, or economies. And he pointed out that when it comes to dealing with such structures, our usual cognitive tools aren’t just useless, but actively misleading:

These complex systems have characteristics which are commonly unknown. They are far different from the simple systems on which our intuitive responses have been sharpened. They are different from the behavior studied in science and mathematics where only simple systems have received orderly attention and analysis…As a first characteristic of complex systems, their behavior appears counterintuitive to the average person. Intuition and judgment, generated by a lifetime of experience with the simple systems that surround one’s every action, create a network of expectations and perceptions which could hardly be better designed to mislead the unwary when he moves into the realm of complex systems.

Forrester, who was deeply influenced by the cybernetics of Norbert Wiener, noted that we’re used to dealing with simple systems in which the connection between cause and effect is relatively clear. From such everyday activities as picking up an object from a table or driving a car, Forrester wrote, we learn “the obvious and ever-present fact that cause and effect are closely related in time and in space. A difficulty or failure of the simple system is observed immediately. The cause is obvious and immediately precedes the consequences.” This isn’t true of complex systems. And it isn’t just a matter of finding the correct answer, but of resisting the answers that are plausible but wrong:

When one goes to complex systems all of these facts become fallacies. Cause and effect are no longer closely related either in time or in space. Causes of a symptom may actually lie in some far distant sector of a social system. Furthermore, symptoms may appear long after the primary causes. But the complex system is far more devious and diabolical than merely being different from the simple systems with which we have had experience. Not only is it truly different, but it appears to be the same. Having been conditioned to look close by for the cause of the trouble, the complex system provides a plausible relationship and pattern for us to discover. When we look nearby in time and in location, we find what appears to be a cause, but actually it is only a coincident symptom. Variables in complex systems are highly correlated, but time correlation means little in distinguishing cause and effect.

He concluded: “With a high degree of confidence we can say that the intuitive solutions to the problems of complex social systems will be wrong most of the time.”

This seems true enough, if we go by the track record of politicians, economists, and social scientists. But the most striking thing about Forrester’s discussion, at least to my eyes, was how he personified the system itself. He described it as “devious and diabolical,” writing a little later: “Again the complex system is cunning in its ability to mislead.” It sounds less like an impersonal problem than an active adversary, foiling any attempt to control it by distracting us with red herrings and tantalizing false hypotheses. To put it another way, it’s a con artist who presents us with one obvious—and tempting—interpretation of events while concealing the truth, which is often removed in time and space. You could even turn this argument on its head, and say that the most effective liars and deceivers are the ones who take their cues from the way in which the world misleads us, doing consciously what the universe does on account of its own complexity. The successful con artist is the one who can trick us into using our intuitions about patterns of behavior to draw erroneous conclusions. And even if this is nothing but a metaphor, it reminds us to actively distrust our own inclinations. Con artists prey on hope and greed, assuming that our usual standards of conduct and rationality go out the window as soon as we see some benefit for ourselves. As a result, we should be particularly suspicious of solutions that we want to believe. If an answer seems too good to be true, it probably is. (One good example is the Laffer curve, a drawing on a napkin that has influenced tax policy for over forty years with minimal evidence, simply because it tells people what they want to hear.)

So how do we beat the long con of complexity? Forrester’s answer, which was innovative at the time, was to use computer models, which removed analysis from the fallible realm of human intuition. Even if we accept this in principle, it isn’t an option that most of us have—but we also engage in a kind of mental modeling at all times. Elsewhere, Forrester wrote that in generating models, you should start with one question: “Where is the boundary that encompasses the smallest number of components within which the dynamic behavior under study is generated?” Break it down, and you’re left with what we all do, consciously or otherwise, when we try to figure out the world around us. We draw a boundary around the problem that ideally includes all relevant information while excluding irrelevancies and noise. If we’ve done it properly, it contains just the right number of pieces: not so many that we become overwhelmed, and not so few that we miss something important. We do this whenever we make an outline or a prototype, and, less systematically, whenever we try to predict the future in any way. And the nice thing about a model, which basically consists of a set of rules, isn’t just that it can handle more variables than we can comfortably grasp, but that it keeps us at arm’s length from our own assumptions. Many of the most useful creative activities involve the manipulation of symbols according to fixed procedures, which is sometimes taken as a means of encouraging intuition, but is really a way to move past our initial hunches to ones that are less obvious. (Randomness, the most powerful creative tool of all, wrenches us out of familiar patterns of cause and effect.) It doesn’t always work, but sometimes it does. As George Box said a decade after Forrester: “All models are wrong, but some are useful.”
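Forrester’s own models were full-blown simulations of cities and firms, but the flavor of the thing comes through even in a toy. The sketch below is my own illustration, not anything from his paper—the scenario and every number are invented—but it shows how a locally sensible rule (hire in proportion to the backlog you can see) combined with a delay (new hires take weeks to become productive) behaves in ways that intuition doesn’t predict:

```python
# A toy stock-and-flow model in the spirit of Forrester's system dynamics.
# The scenario and all numbers are invented for illustration; nothing here is
# taken from his paper. A support team hires against the backlog it can see,
# but new hires take roughly eight weeks to become productive, so cause and
# effect are separated in time.

def simulate(weeks=200):
    backlog = 100.0    # open tickets (a stock)
    staff = 10.0       # productive workers (a stock)
    trainees = 0.0     # hired but not yet productive (the delay)
    history = []
    for week in range(weeks):
        arrivals = 50.0                      # new tickets per week
        capacity = 5.0 * staff               # tickets each worker closes per week
        backlog = max(backlog + arrivals - capacity, 0.0)

        hires = 0.02 * backlog               # management rule: hire against visible backlog
        promoted = trainees / 8.0            # ~8-week training delay
        trainees += hires - promoted
        staff += promoted - 0.01 * staff     # small weekly attrition

        history.append((week, backlog, staff))
    return history

if __name__ == "__main__":
    for week, backlog, staff in simulate()[::20]:
        print(f"week {week:3d}   backlog {backlog:7.1f}   staff {staff:5.1f}")
```

Run it and the hiring rule overshoots: management keeps responding to a backlog that the trainees already in the pipeline will eventually clear, so the team balloons past what it needs and then spends a long time unwinding the surplus. The cause (the training delay) sits far away in time from the symptom (the bloated payroll), which is exactly the trap Forrester describes.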

Written by nevalalee

July 12, 2017 at 8:45 am

The saucer people


Seventy years ago this week, a newspaper report in the Roswell Daily Record stated that the Air Force had captured a flying saucer on a ranch in New Mexico. For most of the next three decades, however, if you had mentioned this incident to your average science fiction fan, you probably would have gotten a blank stare. Roswell didn’t become a cultural touchstone until the late seventies, and at the time, it was overshadowed by a more famous sighting from just two weeks earlier by the pilot Kenneth Arnold, who claimed to have seen nine flying objects near Mount Rainier on June 24, 1947. Arnold’s account was so widely covered that the editor John W. Campbell felt obliged to write about it in Astounding Science Fiction. His editorial on the subject, which appeared in the October 1947 issue, ruled out the possibility of a secret government project—“They’d have been test-flown off some small Pacific island, where none but a few selected personnel, plus a few thousand fish, would have been around to report”—and speculated idly that flying saucers might be a form of surveillance. If we wanted to study another planet without being seen by the natives, he noted, we would take much the same approach:

For several months, our investigation would be conducted by non-contact observation; until we know much more about the people, we’ll do well to stay clear of them…A stealthy raid might kidnap a few inhabitants for general questioning and investigation…Investigation of local animals can give all the necessary basic biological science for the preliminary understanding of the local race…After several months of watching, listening, and picking up radio broadcasts, plus investigation of kidnapees, there would be a lot of material to digest. Captured books, particularly children’s books, would give adequate keys to the languages. At that point, we would be smart to clear out for at least a year of concentrated study of the material at hand…It might be a year or five years before any further steps were taken.

Years later, Isaac Asimov, who had a horror of unidentified flying objects, would list “flying saucers” among the causes on which Campbell became increasingly irrational, along with psionics and dianetics. In reality, Campbell wasn’t particularly interested in the subject, and he only covered it in the magazine when outside events obliged him to weigh in. Arnold’s sighting was one such incident, and the McMinnville photographs, which caused a sensation, prompted him to treat it again in the October 1950 issue, at a time when he was far more interested in other matters. Campbell said that flying saucers represented a problem of “no data,” and he compared such sightings to the “background count” picked up by a radiation detector—most of it was just noise. But there were some intriguing patterns:

Some type of real artifact, referred to as flying saucers, appears to exist; the incidence of reports far exceeds any reasonable level of “background count.” Too many observers—too many places—too many simultaneous observations of the same unknown. Something real exists; that we can file as real, valid data…But—this is an important datum—the planet-wide reports do not noticeably exceed the normal level of what we have called here the “background count”…It has been suggested that the flying saucers are interplanetary visitors. But the frequency of occurrence does not show an even planetary distribution; there is a background-count level around the world, with a high peak level in the United States.

He pointed out that most of the sightings took place in the Pacific Northwest, but not in adjacent regions of Canada or Mexico, and he concluded: “It’s a remarkable interplanetary visitor that shows such keen awareness of political boundaries.”
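Campbell’s “background count” is doing real statistical work in this argument, even if he never wrote it out. One way to make the idea concrete—my own illustration, with invented numbers, not anything Campbell computed—is to treat misidentifications as random noise arriving at a steady rate and ask how surprising an observed month of reports would be:

```python
# An illustration of the "background count" idea (numbers are invented):
# if misidentifications arrive like random noise at a steady rate, how
# surprising is the observed number of reports?
from math import exp, factorial

def poisson_tail(k, lam):
    """Probability of seeing k or more events when the background mean is lam."""
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

background_rate = 20    # expected "noise" reports per month (assumed)
observed = 45           # reports actually logged that month (assumed)
print(f"chance of {observed}+ reports from noise alone: "
      f"{poisson_tail(observed, background_rate):.2e}")
```

A count far above the noise floor is what Campbell means by reports that “far exceed any reasonable level of background count,” while a rate that never rises above it is just the detector ticking over.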

In a reversal of his earlier stance, Campbell hinted strongly that he thought the saucers were some kind of military program, which he confirmed in another editorial in January 1953: “Their marked tendency to confine their operations to the western United States was, in fact, a factor which made me feel, for a long time, that they were a United States military secret weapon. I was at a loss to explain why an alien, an extraterrestrial, would show such marked preference for that particular geographic area.” (As editor of the magazine Air Trails, he implied to a friend that he had seen a “flying disc,” although he wasn’t able to talk about it.) Campbell then playfully suggested another possible reason why such sightings tended to take place over “the square states,” proposing that aliens were telepathically sensitive, and that they would avoid big cities—where the psychic tension was high—in favor of less inhabited areas that showed a high degree of technological development. For the most part, however, his attitude remained cautious. At the end of the decade, in the April 1959 issue, he wrote:

To date, despite reams of argument and statements, the only sure, positive statement about UFOs that can be made is, “There is a phenomenon. Its nature and cause are totally indeterminable from the data and the technical understanding available to us at the time.” They might be scout ships of interstellar visitors…and they might be giant plasmoids of ionized gases of our own atmosphere. They are not the result of any phenomenon adequately known to modern science.

Campbell, who had featured a crashed spacecraft in “Who Goes There?”, was also reluctant to publish fiction on the subject, writing in a rejection letter in 1952: “I’m afraid I can’t touch this one; the flying saucers scare me. Essentially, the point is this: science fiction is speculation based on science. That’s sane, sensible, and helpful. The flying saucers aren’t science; they’re speculation. Now speculation built on speculation is not sane, sensible, and helpful—it’s wild-blue-yonder stuff. It’s insane, and confuses the issue.”

What’s funny, of course, is that Campbell was perfectly willing to publish “speculation built on speculation” about countless other subjects, such as psionics. In October 1953, in the editorial “Unwise Knowledge,” he even made the case that science fiction was the best possible place to talk about such strange matters, since speculation for the purposes of entertainment had a purely positive value. (It’s also worth noting that he didn’t avoid the topic entirely. In the March 1961 issue of Analog, he published a very odd piece by Arthur W. Orton titled “The Four-Faced Visitors of Ezekiel,” which made the case that the vision of the merkabah was really a visit from ancient astronauts. The article generated a lot of attention, and even Asimov liked it, writing a complimentary letter that was printed a few issues later.) Yet Campbell was mostly uninterested in flying saucers, despite the fact that he was naturally sympathetic to such accounts—as a college student at Duke, he had seen a display of ball lightning at his house in Durham, North Carolina, only to have it dismissed by one of his old professors, and he later suggested that a similar phenomenon might lie behind UFO sightings. So why did he steer clear? One reason is that the territory had already been aggressively claimed by Raymond A. Palmer, the former editor of Amazing, who wrote The Coming of the Saucers with Kenneth Arnold and even founded an entire magazine devoted to it. Campbell may have figured that there wasn’t room for two editors. But there was also a more important consideration. In 1954, he wrote to a correspondent:

The flying saucers aren’t facts; they’re Somethings. They may be optical illusions, interstellar travelers, St. Elmo’s Fire, weather balloons…who knows what. To speculate on something so vague and unsure itself is the essence of insane thinking. Quite literally, it would be dangerous for a mind to speculate on unknown-somethings…I don’t know what causes “flying saucer” reports. As soon as sound data as to what it is that causes the reports is available, I’ll be willing to discuss the implications!

The italics are mine. Campbell liked data, even if it came from questionable sources, which was why he was so enthusiastic about such devices as the Hieronymus Machine. He wanted something that he could control. And when it came to flying saucers, he just couldn’t get on board.

Written by nevalalee

July 7, 2017 at 9:18 am

Revenge of the nerds


“Those cops know who you are,” [Starling] said. “They look at you to see how to act.” She stood steady, shrugged her shoulders, opened her palms. There it was, it was true.

—Thomas Harris, The Silence of the Lambs

Over the last six months, a pattern of behavior within the technology world has been coming into focus. It arguably began with Susan J. Fowler, a software engineer who published a post on her personal blog with the pointedly neutral title “Reflecting on One Very, Very Strange Year at Uber,” which, with its account of sexism, harassment, and the dismissal of her concerns, set off a chain of events that culminated in the resignation of Uber founder Travis Kalanick. More recently, we’ve seen similar reports about the venture capital firm Binary Capital, the investment incubator 500 Startups, and now the electric car company Tesla. Even at a glance, we can draw a few obvious conclusions. The first is that most companies still have no idea how to deal with these accusations. By now, it should be abundantly clear that the only acceptable response to such allegations is to say that you’re taking them seriously. Instead, we get the likes of Binary’s original statement, which said that the partner in question “has in the past occasionally dated or flirted with women he met in a professional capacity.” (The firm quickly reversed itself, and it’s now being rewarded with the possibility that it may simply cease to exist.) Another inference is that the number of cases will only grow, as more women come forward to share their stories. And a third takeaway is that most of these companies have certain qualities in common. They’re founded and run by intelligent, ambitious men who may not have had a lot of romantic success early in life, but who now find themselves in a position of power over women. It’s a dynamic not unlike that of, say, a graduate department in philosophy. And it’s worth wondering if we’re fated to hear similar stories whenever male overachievers with poor social skills become gatekeepers in industries where women are at a numerical disadvantage.   

As it happens, an experiment along those lines has been ongoing for over ninety years, in a closed setting with ample documentation. It’s the science fiction fandom. Most of the evidence is anecdotal, but this doesn’t make it any less compelling. In the anthology The Hugo Winners, which was published in 1962, for instance, Isaac Asimov wrote of Anne McCaffrey: “She’s a woman in a man’s world and it doesn’t bother her a bit.” He explained:

Science fiction is far less a man’s world than it used to be as far as the readers are concerned. Walk into any convention these days and the number of shrill young girls fluttering before you (if you are Harlan Ellison) or backing cautiously away (if you are me) is either frightening or fascinating, depending on your point of view. (I am the fascinated type.)

The writers, however, are still masculine by a heavy majority. What’s more, they are a particularly sticky kind of male, used to dealing with males, and a little perturbed at having to accept a woman on an equal basis.

Asimov concluded: “It’s not so surprising. Science is a heavily masculine activity (in our society, anyway); so science fiction writing is, or should be. Isn’t that the way it goes?” But Anne McCaffrey, with her “Junoesque measurements and utter self-confidence,” was doing just fine. He added: “I have the most disarming way of goggling at Junoesque measurements which convinces any woman possessing them that I have good taste.” As an illustration, he told an amusing story of how McCaffrey beat him in a singing competition, prompting him to point at her chest and protest: “It’s not fair. She has spare lungs!” How could any woman possibly feel out of place?

You could excuse this by saying that Asimov is joking, using the bantering tone that he employs in all of his essays about the fandom, but that’s problematic in itself. Asimov consciously mastered an informal style that made readers feel as if he were confiding in them, telling his publisher, who had expressed doubts about his approach: “They will feel themselves inside the world of science fiction.” And they did. At a time when the genre was rapidly expanding into the mass culture, he made it seem as close and intimate as it had been in the thirties. But he also gave hints to fans about how they were supposed to talk about themselves, and sometimes it wasn’t particularly funny. (It also had a way of excluding anyone who wasn’t in on the joke, as in Asimov’s infamous quip to Samuel R. Delany.) This wasn’t a new development, either. A quarter of a century earlier, as an unknown fan in the letters column of Astounding, Asimov had written: “When we want science fiction, we don’t want swooning dames…Come on, men, make yourself heard in favor of less love mixed with our science.” Later, he doubled down on his position: “Let me point out that women never affected the world directly. They always grabbed hold of some poor, innocent man, worked their insidious wiles on him…and then affected history through him.” He concluded that he should probably stop before he inspired a “vendetta” of all the female fans in the country: “There must be at least twenty of them!” If this was a joke, it persisted for decades, and he wasn’t the only one. When you look back at those letters, their suspicion or bemusement toward women practically oozes off the page, and you get a sense of how hard it must have been for “a female woman”—as one identifies herself in 1931—to enter that world. There was a debate about whether women even belonged, and Asimov cheerfully participated: “The great philosophers and the great religious leaders of the world—the ones who taught truth and virtue, kindliness and justice—were all, all men.”

This doesn’t even get to Asimov’s own behavior with women, which deserves a full post in itself, although I’m frankly not ready to tackle that yet. And while I don’t mean to pick on Asimov in particular, maybe, in a way, I do. In The Hugo Winners, Asimov describes himself as “a ‘Women’s Lib’ from long before there was one,” and his political views were unimpeachably progressive. (I’m sure you could say much the same thing about the founders and employees of most of the firms mentioned above.) He was also the most visible ambassador of a subculture that continues to have a troubling track record with women and minorities, in ways both explicit and implicit, and he wasn’t just symptomatic of its attitudes, but one of its shapers. Fans looked to Asimov for cues about how to behave, because he was exactly what they wanted to become—a shy, lonely kid who grew up to be famous and beloved. And we don’t need to look far for parallels. In an internal email sent two days after the termination of the woman who says that she was fired in retaliation for her claims, Elon Musk wrote:

If you are part of a less represented group, you don’t get a free pass on being a jerk yourself. We have had a few cases at Tesla where someone in a less represented group was actually given a job or promoted over more qualified highly represented candidates and then decided to sue Tesla for millions of dollars because they felt they weren’t promoted enough. That is obviously not cool.

It certainly isn’t. And although Tesla has said that “this email in fact did not reference Ms. Vandermeyden or her case,” it doesn’t matter. The assumption that the presence of “jerks” among less represented groups—who allegedly benefit from special treatment “over more qualified highly represented candidates”—is pervasive enough to be singled out like this sends a message in itself. Musk is a hero to many young men inside and outside his company, just as Asimov, whose books he deeply admires, was to his fans. Many are bright but socially confused, and they’re looking to be told how to act. And as Clarice Starling once said under similar circumstances: “It matters, Mr. Crawford.”

The closed circle


In his wonderful book The Nature of Order, the architect Christopher Alexander lists fifteen properties that characterize places and buildings that feel alive. (“Life” itself is a difficult concept to define, but we can come close to understanding it by comparing any two objects and asking the one question that Alexander identifies as essential: “Which of the two is a better picture of my self?”) These properties include such fundamentals of design as “Levels of Scale,” “Local Symmetries,” and “Positive Space,” and elements that are a bit trickier to pin down, including “Echoes,” “The Void,” and “Simplicity and Inner Calm.” But the final property, and the one that Alexander suggests is the most important, bears the slightly clunky name of “Not-Separateness.” He points to the Tower of the Wild Goose in China as an example of this quality at its best, and he says of its absence:

When a thing lacks life, is not whole, we experience it as being separate from the world and from itself…In my experiments with shapes and buildings, I have discovered that the other fourteen ways in which centers come to life will make a center which is compact, beautiful, determined, subtle—but which, without this fifteenth property, can still often somehow be strangely separate, cut off from what lies around it, lonely, awkward in its loneliness, too brittle, too sharp, perhaps too well delineated—above all, too egocentric, because it shouts, “Look at me, look at me, look how beautiful I am.”

The fact that he refers to this property as “Not-Separateness,” rather than the more obvious “Connectedness,” indicates that he sees it as a reaction against the marked tendency of architects and planners to strive for distinctiveness and separation. “Those unusual things which have the power to heal…are never like this,” Alexander explains. “With them, usually, you cannot really tell where one thing breaks off and the next begins, because the thing is smokily drawn into the world around it, and softly draws this world into itself.” It’s a characteristic that has little to do with the outsized personalities who tend to be drawn to huge architectural projects, and Alexander firmly skewers the motivations behind it:

This property comes about, above all, from an attitude. If you believe that the thing you are making is self-sufficient, if you are trying to show how clever you are, to make something that asserts its beauty, you will fall into the error of losing, failing, not-separateness. The correct connection to the world will only be made if you are conscious, willing, that the thing you make be indistinguishable from its surroundings; that, truly, you cannot tell where one ends and the next begins, and you do not even want to be able to do so.

This doesn’t happen by accident, particularly when millions of dollars and correspondingly inflated egos are involved. (The most blatant way of separating a building from its surroundings is to put your name on it.) And because it explicitly asks the designer to leave his or her cleverness behind, it amounts to the ultimate test of the subordination of the self to the whole. You can do great work and still falter at the end, precisely because of the strengths that allowed you to get that far in the first place.

It’s hard for me to read these words without thinking of Apple’s new headquarters in Cupertino, variously known as the Ring and the Mothership, which is scheduled to open later this year. A cover story in Wired by Steven Levy describes it in enraptured terms, in which you can practically hear Also Sprach Zarathustra:

As we emerge into the light, the Ring comes into view. As the Jeep orbits it, the sun glistens off the building’s curved glass surface. The “canopies”—white fins that protrude from the glass at every floor—give it an exotic, retro-­future feel, evoking illustrations from science fiction pulp magazines of the 1950s. Along the inner border of the Ring, there is a walkway where one can stroll the three-quarter-mile perimeter of the building unimpeded. It’s a statement of openness, of free movement, that one might not have associated with Apple. And that’s part of the point.

There’s a lot to unpack here, from the reference to pulp science fiction to the notion of “orbiting” the building to the claim that the result is “a statement of openness.” As for the contrary view, here’s what another article in Wired, this one by Adam Rogers, had to say about it a month later:

You can’t understand a building without looking at what’s around it—its site, as the architects say. From that angle, Apple’s new [headquarters] is a retrograde, literally inward-looking building with contempt for the city where it lives and cities in general. People rightly credit Apple for defining the look and feel of the future; its computers and phones seem like science fiction. But by building a mega-headquarters straight out of the middle of the last century, Apple has exacerbated the already serious problems endemic to twenty-first-century suburbs like Cupertino—transportation, housing, and economics. Apple Park is an anachronism wrapped in glass, tucked into a neighborhood.

Without delving into the economic and social context, which a recent article in the New York Times explores from another perspective, I think it’s fair to say that Apple Park is an utter failure from the point of view of “Not-Separateness.” But this isn’t surprising. Employees may just be moving in now, but its public debut dates back to June 7, 2011, when Steve Jobs himself pitched it to the Cupertino City Council. Jobs was obsessed by edges and boundaries, both physical and virtual, insisting that the NeXT computer be a perfect cube and introducing millions of consumers to the word “bezel.” Compare this to what Alexander writes of boundaries in architecture:

In things which have not-separateness, there is often a fragmented boundary, an incomplete edge, which destroys the hard line…Often, too, there is a gradient of the boundary, a soft edge caused by a gradient in which scale decreases…so that at the edge it seems to melt indiscernibly into the next thing…Finally, the actual boundary is sometimes rather careless, deliberately placed to avoid any simple complete sharp cutting off of the thing from its surroundings—a randomness in the actual boundary line which allows the thing to be connected to the world.

The italics are mine, because it’s hard to imagine anything less like Jobs or the company he created. Apple Park is being positioned as Jobs’s posthumous masterpiece, which reminds me of the alternate wording to Alexander’s one question: “Which one of these two things would I prefer to become by the day of my death?” (If the building is a monument to Jobs, it’s also a memorial to the ways in which he shaded imperceptibly into Trump, who also has a fixation with borders.) It’s the architectural equivalent of the design philosophy that led Apple to glue in its batteries and made it impossible to upgrade the perfectly cylindrical Mac Pro. Apple has always loved the idea of a closed system, and now its employees get to work in one.

Written by nevalalee

July 5, 2017 at 8:59 am

The act of cutting


In a recent article in The New Yorker on Ernest Hemingway, Adam Gopnik evocatively writes: “The heart of his style was not abbreviation but amputation; not simplicity but mystery.” He explains:

Again and again, he creates his effects by striking out what would seem to be essential material. In “Big Two-Hearted River,” Nick’s complicated European experience—or the way that fishing is sanity-preserving for Nick, the damaged veteran—is conveyed clearly in the first version, and left apparent only as implication in the published second version. In a draft of the heartbreaking early story “Hills Like White Elephants,” about a man talking his girlfriend into having an abortion, Hemingway twice uses the words “three of us.” This is the woman’s essential desire, to become three rather than two. But Hemingway strikes both instances from the finished story, so the key image remains as ghostly subtext within the sentences. We feel the missing “three,” but we don’t read it.

Gopnik concludes: “The art comes from scissoring out his natural garrulousness, and the mystery is made by what was elided. Reading through draft and then finished story, one is repeatedly stunned by the meticulous rightness of his elisions.” Following Hemingway’s own lead, Gopnik compares his practice to that of Cézanne, but it’s also reminiscent of Shakespeare, who frequently omits key information from his source material while leaving the other elements intact. Ambiguity, as I’ve noted here before, emerges from a network of specifics with one crucial piece removed.

Over the last two weeks, I’ve been ruthlessly cutting the first draft of my book, leaving me highly conscious of the effects that can come out of compression. In his fascinating notebooks, which I quoted here yesterday, Samuel Butler writes: “I have always found compressing, cutting out, and tersifying a passage suggests more than anything else does. Things pruned off in this way are like the heads of the hydra, two grow for every one that is lopped off.” This squares with my experience, and it reflects how so much of good writing depends on juxtaposition. By cutting, you’re bringing the remaining pieces closer together, which allows them to resonate. Butler then makes a very interesting point:

If a writer will go on the principle of stopping everywhere and anywhere to put down his notes, as the true painter will stop anywhere and everywhere to sketch, he will be able to cut down his works liberally. He will become prodigal not of writing—any fool can be this—but of omission. You become brief because you have more things to say than time to say them in. One of the chief arts is that of knowing what to neglect and the more talk increases the more necessary does this art become.

I love this passage because it reveals how two of my favorite activities—taking notes and cutting—are secretly the same thing. On some level, writing is about keeping the good stuff and removing as much of the rest as possible. The best ideas are likely to occur spontaneously when you’re doing something unrelated, which is why you need to write them down as soon as they come to you. When you’re sitting at your desk, you have little choice but to write mechanically in hopes that something good will happen. And in the act of cutting, the two converge.

Cutting can be a creative act in itself, which is why you sometimes need to force yourself to do it, even when you’d rather not. You occasionally see a distinction drawn between the additive and subtractive arts, but most works partake of both at various stages, which confer different benefits. In Behind the Seen, Charles Koppelman says of editing a movie in postproduction:

The orientation over the last six months has been one of accumulation, a building-up of material. Now the engines are suddenly thrown into full reverse. The enterprise will head in the opposite direction, shedding material as expeditiously as possible.

We shouldn’t underestimate how challenging that mental switch can be. It’s why an editor like Walter Murch rarely visits the set, which allows him to maintain a kind of Apollonian detachment from the Dionysian process of filmmaking: he doesn’t want to be dissuaded from the need to cut a scene by the knowledge of how hard it was to make it. Writers and other artists working alone don’t have that luxury, and it can be difficult to work yourself up to the point where you’re ready to cut a section that took a long time to write. Time creates its own sort of psychological distance, which is why you’re often advised to put aside the draft for a few weeks, or even longer, before starting to revise it. (Zadie Smith writes deflatingly: “A year or more is ideal—but even three months will do.”) That isn’t always possible, and sometimes the best compromise is to work briefly on another project, like a short story. A change is as good as a rest, and in this case, you’re trying to transform into your future self as soon as possible, which will allow you to perform clinical surgery on the past.

The result is a lot like the old joke: you start with a block of marble, and you cut away everything that doesn’t look like an elephant. When I began to trim my manuscript, I set myself the slightly arbitrary goal of reducing it, at this stage, by thirty percent, guided by the editing rule that I mentioned here a month ago:

Murch also has his eye on what he calls the “thirty percent factor”—a rule of thumb he developed that deals with the relationship between the length of the film and the “core content” of the story. In general, thirty percent of a first assembly can be trimmed away without affecting the essential features of the script: all characters, action, story beats will be preserved and probably, like a good stew, enhanced by the reduction in bulk. But passing beyond the thirty percent barrier can usually be accomplished only by major structural alterations: the reduction or elimination of a character, or whole sequences—removing vital organs rather than trimming fat.

There’s no particular reason why the same percentage should hold for a book as well as a film, but I’ve found that it’s about right. (It also applies to other fields, like consumer electronics.) Really, though, it could have been just about any number, as long as it gave me a clear numerical goal at which to aim, and as long as it hurt a little. It’s sort of like physical exercise. If you want to lose weight, the best way is to eat less, and if you want to write a short book, ideally, you’d avoid writing too much in the first place. But the act of cutting, like exercise, has rewards of its own. As Elie Wiesel famously said: “There is a difference between a book of two hundred pages from the very beginning, and a book of two hundred pages which is the result of an original eight hundred pages. The six hundred pages are there. Only you don’t see them.” And the best indication that you’re on the right track is when it becomes physically painful. As Hemingway writes in A Farewell to Arms: “The world breaks everyone and afterward many are strong at the broken places.” That’s also true of books.

Written by nevalalee

June 29, 2017 at 8:38 am
