Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘The Atlantic’

The men who sold the moonshot

with 3 comments

When you ask Google whether we should build houses on the ocean, it gives you a bunch of results like these. If you ask Google X, the subsidiary within the company responsible for investigating “moonshot” projects like self-driving cars and space elevators, the answer that you get is rather different, as Derek Thompson reports in the cover story for this month’s issue of The Atlantic:

Like a think-tank panel with the instincts of an improv troupe, the group sprang into an interrogative frenzy. “What are the specific economic benefits of increasing housing supply?” the liquid-crystals guy asked. “Isn’t the real problem that transportation infrastructure is so expensive?” the balloon scientist said. “How sure are we that living in densely built cities makes us happier?” the extradimensional physicist wondered. Over the course of an hour, the conversation turned to the ergonomics of Tokyo’s high-speed trains and then to Americans’ cultural preference for suburbs. Members of the team discussed commonsense solutions to urban density, such as more money for transit, and eccentric ideas, such as acoustic technology to make apartments soundproof and self-driving housing units that could park on top of one another in a city center. At one point, teleportation enjoyed a brief hearing.

Thompson writes a little later: “I’d expected the team at X to sketch some floating houses on a whiteboard, or discuss ways to connect an ocean suburb to a city center, or just inform me that the idea was terrible. I was wrong. The table never once mentioned the words floating or ocean. My pitch merely inspired an inquiry into the purpose of housing and the shortfalls of U.S. infrastructure. It was my first lesson in radical creativity. Moonshots don’t begin with brainstorming clever answers. They start with the hard work of finding the right questions.”

I don’t know why Thompson decided to ask about “oceanic residences,” but I read this section of the article with particular interest, because about two years ago, I spent a month thinking about the subject intensively for my novella “The Proving Ground.” As I’ve described elsewhere, I knew early on in the process that it was going to be a story about the construction of a seastead in the Marshall Islands, which was pretty specific. There was plenty of background material available, ranging from general treatments of the idea in books like The Millennial Project by Marshall T. Savage—which had been sitting unread on my shelf for years—to detailed proposals for seasteads in the real world. The obvious source was The Seasteading Institute, a libertarian pipe dream funded by Peter Thiel that generated a lot of useful plans along the way, as long as you saw it as the legwork for a science fiction story, rather than as a project on which you were planning to actually spend fifty billion dollars. The difference between most of these proposals and the brainstorming session that Thompson describes is that they start with a floating city and then look for reasons to justify it. Seasteading is a solution in search of a problem. In other words, it’s science fiction, which often starts with a premise or setting that seems like it would lead to an exciting story and then searches for the necessary rationalizations. (The more invisible the process, the better.) And this can lead us to troubling places. As I’ve noted before, Thiel blames many of this country’s problems on “a failure of imagination,” and his nostalgia for vintage science fiction is rooted in a longing for the grand gestures that it embodied: the flying car, the seastead, the space colony. As he famously said six years ago to The New Yorker: “The anthology of the top twenty-five sci-fi stories in 1970 was, like, ‘Me and my friend the robot went for a walk on the moon,’ and in 2008 it was, like, ‘The galaxy is run by a fundamentalist Islamic confederacy, and there are people who are hunting planets and killing them for fun.'”

Google X isn’t immune to this tendency—Google Glass was, if anything, a solution in search of a problem—and some degree of science-fictional thinking is probably inherent to any such enterprise. In his article, Thompson doesn’t mention science fiction by name, but the whole division is clearly reminiscent of and inspired by the genre, down to the term “moonshot” and that mysterious letter at the end of its name. (Company lore claims that the “X” was chosen as “a purposeful placeholder,” but it’s hard not to think that it was motivated by the same impulse that gave us Dimension X, X Minus 1, Rocketship X-M, and even The X-Files.) In fact, an earlier article for The Atlantic looked at this connection in depth, and its conclusions weren’t altogether positive. Three years ago, in the same publication, Robinson Meyer quoted a passage from an article in Fast Company about the kinds of projects favored by Google X, but he drew a more ambivalent conclusion:

A lot of people might read that [description] and think: Wow, cool, Google is trying to make the future! But “science fiction” provides but a tiny porthole onto the vast strangeness of the future. When we imagine a “science fiction”-like future, I think we tend to picture completed worlds, flying cars, the shiny, floating towers of midcentury dreams. We tend, in other words, to imagine future technological systems as readymade, holistic products that people will choose to adopt, rather than as the assembled work of countless different actors, which they’ve always really been. The futurist Scott Smith calls these “flat-pack futures,” and they infect “science fictional” thinking.

He added: “I fear—especially when we talk about “science fiction”—that we miss the layeredness of the world, that many people worked to build it…Flying through space is awesome, but if technological advocates want not only to make their advances but to hold onto them, we had better learn the virtues of incrementalism.” (The contrast between Meyer’s skepticism and Thompson’s more positive take feels like a matter of access—it’s easier to criticize Google X’s assumptions when it’s being profiled by a rival magazine.)

But Meyer makes a good point, and science fiction’s mixed record at dealing with incrementalism is a natural consequence of its origins in popular fiction. A story demands a protagonist, which encourages writers to see scientific progress in terms of heroic figures. The early fiction of John W. Campbell returns monotonously to the same basic plot, in which a lone genius discovers atomic power and uses it to build a spaceship, drawing on the limitless resources of a wealthy and generous benefactor. As Isaac Asimov noted in his essay “Big, Big, Big”:

The thing about John Campbell is that he liked things big. He liked big men with big ideas working out big applications of their big theories. And he liked it fast. His big men built big weapons within days; weapons that were, moreover, without serious shortcomings, or at least, with no shortcomings that could not be corrected as follows: “Hmm, something’s wrong—oh, I see—of course.” Then, in two hours, something would be jerry-built to fix the jerry-built device.

This works well enough in pulp adventure, but after science fiction began to take itself seriously as prophecy, it fossilized into the notion that all problems can be approached as provinces of engineering and solved by geniuses working alone or in small groups. Elon Musk has been compared to Tony Stark, but he’s really the modern incarnation of a figure as old as The Skylark of Space, and the adulation that he still inspires shades into beliefs that are even less innocuous—like the idea that our politics should be entrusted to similarly big men. Writing of Google X’s Rapid Evaluation team, Thompson uses terms that would have made Campbell salivate: “You might say it’s Rapid Eval’s job to apply a kind of future-perfect analysis to every potential project: If this idea succeeds, what will have been the challenges? If it fails, what will have been the reasons?” Science fiction likes to believe that it’s better than average at this kind of forecasting. But it’s just as likely that it’s worse.

Written by nevalalee

October 11, 2017 at 9:02 am

The monotonous periodicity of genius

leave a comment »

Yesterday, I read a passage from the book Music and Life by the critic and poet W.J. Turner that has been on my mind ever since. He begins with a sentence from the historian Charles Sanford Terry, who says of Bach’s cantatas: “There are few phenomena in the record of art more extraordinary than this unflagging cataract of inspiration in which masterpiece followed masterpiece with the monotonous periodicity of a Sunday sermon.” Turner objects to this:

In my enthusiasm for Bach I swallowed this statement when I first met it, but if Dr. Terry will excuse the expression, it is arrant nonsense. Creative genius does not work in this way. Masterpieces are not produced with the monotonous periodicity of a Sunday sermon. In fact, if we stop to think we shall understand that this “monotonous periodicity” was exactly what was wrong with a great deal of Bach’s music. Bach, through a combination of natural ability and quite unparalleled concentration on his art, had arrived at the point of being able to sit down at any minute of any day and compose what had all the superficial appearance of being a masterpiece. It is possible that even Bach himself did not know which was a masterpiece and which was not, and it is abundantly clear to me that in all his large-sized works there are huge chunks of stuff to which inspiration is the last word that one could apply.

All too often, Turner implies, Bach leaned on his technical facility when inspiration failed or he simply felt indifferent to the material: “The music shows no sign of Bach’s imagination having been fired at all; the old Leipzig Cantor simply took up his pen and reeled off this chorus as any master craftsman might polish off a ticklish job in the course of a day’s work.”

I first encountered the Turner quotation in The New Listener’s Companion and Record Guide by B.H. Haggin, who cites his fellow critic approvingly and adds: “This seems to me an excellent description of the essential fact about Bach—that one hears always the operation of prodigious powers of invention and construction, but frequently an operation that is not as expressive as it is accomplished.” Haggin continues:

Listening to the six sonatas or partitas for unaccompanied violin, the six sonatas or suites for unaccompanied cello, one is aware of Bach’s success with the difficult problem he set himself, of contriving for the instrument a melody that would imply its underlying harmonic progressions between the occasional chords. But one is aware also that solving this problem was not equivalent to writing great or even enjoyable music…I hear only Bach’s craftsmanship going through the motions of creation and producing the external appearances of expressiveness. And I suspect that it is the name of Bach that awes listeners into accepting the appearance as reality, into hearing an expressive content which isn’t there, and into believing that if the content is difficult to hear, this is only because it is especially profound—because it is “the passionate, yet untroubled meditation of a great mind” that lies beyond “the composition’s formidable technical frontiers.”

Haggin confesses that he regards many pieces in The Goldberg Variations or The Well-Tempered Clavier as “examples of competent construction that are, for me, not interesting pieces of music.” And he sums up: “Bach’s way of exercising the spirit was to exercise his craftsmanship; and some of the results offer more to delight an interest in the skillful use of technique than to delight the spirit.”

As I read this, I was inevitably reminded of Christopher Orr’s recent article in The Atlantic, “The Remarkable Laziness of Woody Allen,” which I discussed here last week. Part of Orr’s case against Allen involves “his frenetic pace of one feature film a year,” which can only be described as monotonous periodicity. This isn’t laziness, of course—it’s the opposite—but Orr implies that the director’s obsession with productivity has led him to cut corners in the films themselves: “Ambition simply isn’t on the agenda.” Yet the funny thing is that this approach to making art, while extreme, is perfectly rational. Allen writes, directs, and releases three movies in the time it would take most directors to finish one, and when you look at his box office and awards history, you see that about one in three breaks through to become a financial success, an Oscar winner, or both. And Orr’s criticism of this process, like Turner’s, could only have been made by a professional critic. If you’re obliged to see every Woody Allen movie or have an opinion on every Bach cantata, it’s easy to feel annoyed by the lesser efforts, and you might even wish that the artist had only released the works in which his inspiration was at its height. For the rest of us, though, this really isn’t an issue. We get to skip Whatever Works or Irrational Man in favor of the occasional Match Point or Midnight in Paris, and most of us are happy if we can even recognize the cantata that has “Jesu, Joy of Man’s Desiring.” If you’re a fan, but not a completist, a skilled craftsman who produces a lot of technically proficient work in hopes that some of it will stick is following a reasonable strategy. As Malcolm Gladwell writes of Bach:

The difference between Bach and his forgotten peers isn’t necessarily that he had a better ratio of hits to misses. The difference is that the mediocre might have a dozen ideas, while Bach, in his lifetime, created more than a thousand full-fledged musical compositions. A genius is a genius, [Dean] Simonton maintains, because he can put together such a staggering number of insights, ideas, theories, random observations, and unexpected connections that he almost inevitably ends up with something great.

As Simonton puts it: “Quality is a probabilistic function of quantity.” But if there’s a risk involved, it’s that an artist will become so used to producing technically proficient material on a regular basis that he or she will fall short when the circumstances demand it. Which brings us back to Bach. Turner’s remarks appear in a chapter on the Mass in B minor, which was hardly a throwaway—it’s generally considered to be one of Bach’s major works. For Turner, however, the virtuosity expressed in the cantatas allowed Bach to take refuge in cleverness even when there was more at stake: “I say that the pretty trumpet work in the four-part chorus of the Gloria, for example, is a proof that Bach was being consciously clever and brightening up his stuff, and that he was not at that moment writing with the spontaneity of those really creative moments which are popularly called inspired.” And he writes of the Kyrie, which he calls “monotonous”:

It is still impressive, and no doubt to an academic musician, with the score in his hands and his soul long ago defunct, this charge of monotony would appear incredible, but then his interest is almost entirely if not absolutely technical. It is a source of everlasting amazement to him to contemplate Bach’s prodigious skill and fertility of invention. But what do I care for Bach’s prodigious skill? Even such virtuosity as Bach’s is valueless unless it expresses some ulterior beauty or, to put it more succinctly, unless it is as expressive as it is accomplished.

And I’m not sure that he’s even wrong. It might seem remarkable to make this accusation of Bach, who is, for our culture, the embodiment of technical skill in the service of spiritual expression, but if the charge is going to have any weight at all, it has to hold at the highest level. William Blake once wrote: “Mechanical excellence is the only vehicle of genius.” He was right. But it can also be a vehicle, by definition, for literally everything else. And sometimes the real genius lies in being able to tell the difference.

Shoot the piano player

with 2 comments

In his flawed but occasionally fascinating book Bambi vs. Godzilla, the playwright and director David Mamet spends a chapter discussing the concept of aesthetic distance, which is violated whenever viewers remember that they’re simply watching a movie. Mamet provides a memorable example:

An actor portrays a pianist. The actor sits down to play, and the camera moves, without a cut, to his hands, to assure us, the audience, that he is actually playing. The filmmakers, we see, have taken pains to show the viewers that no trickery has occurred, but in so doing, they have taught us only that the actor portraying the part can actually play the piano. This addresses a concern that we did not have. We never wondered if the actor could actually play the piano. We accepted the storyteller’s assurances that the character could play the piano, as we found such acceptance naturally essential to our understanding of the story.

Mamet imagines a hypothetical dialogue between the director and the audience: “I’m going to tell you a story about a pianist.” “Oh, good: I wonder what happens to her!” “But first, before I do, I will take pains to reassure you that the actor you see portraying the hero can actually play the piano.” And he concludes:

We didn’t care till the filmmaker brought it up, at which point we realized that, rather than being told a story, we were being shown a demonstration. We took off our “audience” hat and put on our “judge” hat. We judged the demonstration conclusive but, in so doing, got yanked right out of the drama. The aesthetic distance had been violated.

Let’s table this for now, and turn to a recent article in The Atlantic titled “The Remarkable Laziness of Woody Allen.” To prosecute the case laid out in the headline, the film critic Christopher Orr draws on Eric Lax’s new book Start to Finish: Woody Allen and the Art of Moviemaking, which describes the making of Irrational Man—a movie that nobody saw, which doesn’t make the book sound any less interesting. For Orr, however, it’s “an indictment framed as an encomium,” and he lists what he evidently sees as devastating charges:

Allen’s editor sometimes has to live with technical imperfections in the footage because he hasn’t shot enough takes for her to choose from…As for the shoot itself, Allen has confessed, “I don’t do any preparation. I don’t do any rehearsals. Most of the times I don’t even know what we’re going to shoot.” Indeed, Allen rarely has any conversations whatsoever with his actors before they show up on set…In addition to limiting the number of takes on any given shot, he strongly prefers “master shots”—those that capture an entire scene from one angle—over multiple shots that would subsequently need to be edited together.

For another filmmaker, all of these qualities might be seen as strengths, but that’s beside the point. Here’s the relevant passage:

The minimal commitment that appearing in an Allen film entails is a highly relevant consideration for a time-strapped actor. Lax himself notes the contrast with Mike Leigh—another director of small, art-house films—who rehearses his actors for weeks before shooting even starts. For Damien Chazelle’s La La Land, Stone and her co-star, Ryan Gosling, rehearsed for four months before the cameras rolled. Among other chores, they practiced singing, dancing, and, in Gosling’s case, piano. The fact that Stone’s Irrational Man character plays piano is less central to that movie’s plot, but Allen didn’t expect her even to fake it. He simply shot her recital with the piano blocking her hands.

So do we shoot the piano player’s hands or not? The boring answer, unfortunately, is that it depends—but perhaps we can dig a little deeper. It seems safe to say that it would be impossible to make The Pianist with Adrien Brody’s hands conveniently blocked from view for the whole movie. But I’m equally confident that it doesn’t matter the slightest bit in Irrational Man, which I haven’t seen, whether or not Emma Stone is really playing the piano. La La Land is a slightly trickier case. It would be hard to envision it without at least a few shots of Ryan Gosling playing the piano, and Damien Chazelle isn’t above indulging in exactly the camera move that Mamet decries, in which the camera tilts down to reassure us that it’s really Gosling playing. Yet the fact that we’re even talking about this gets down to a fundamental problem with the movie, which I mostly like and admire. Its characters are archetypes who draw much of their energy from the auras of the actors who play them, and in the case of Stone, who is luminous and moving as an aspiring actress suffering through an endless series of auditions, the film gets a lot of mileage from our knowledge that she’s been in the same situation. Gosling, to put it mildly, has never been an aspiring jazz pianist. This shouldn’t even matter, but every time we see him playing the piano, he briefly ceases to be a struggling artist and becomes a handsome movie star who has spent three months learning to fake it. And I suspect that the movie would have been elevated immensely by casting a real musician. (This ties into another issue with La La Land, which is that it resorts to telling us that its characters deserve to be stars, rather than showing it to us in overwhelming terms through Gosling and Stone’s singing and dancing, which is merely passable. It’s in sharp contrast to Martin Scorsese’s New York, New York, one of its clear spiritual predecessors, in which it’s impossible to watch Liza Minnelli without becoming convinced that she ought to be the biggest star in the world. And when you think of how quirky, repellent, and individual Minnelli and Robert De Niro are allowed to be in that film, La La Land starts to look a little schematic.)

And I don’t think I’m overstating it when I argue that the seemingly minor dilemma of whether to show the piano player’s hands shades into the larger problem of how much we expect our actors to really be what they pretend that they are. I don’t think any less of Bill Murray because he had to employ Terry Fryer as a “hand double” for his piano solo in Groundhog Day, and I don’t mind that the most famous movie piano player of them all—Dooley Wilson in Casablanca—was faking it. And there’s no question that you’re taken out of the movie a little when you see Richard Chamberlain playing Tchaikovsky’s Piano Concerto No. 1 in The Music Lovers, however impressive it might be. (I’m willing to forgive De Niro learning to mime the saxophone for New York, New York, if only because it’s hard to imagine how it would look otherwise. The piano is just about the only instrument for which the question can plausibly be left to the director’s discretion. And in his article, revealingly, Orr fails to mention that none other than Woody Allen was insistent that Sean Penn learn the guitar for Sweet and Lowdown. As Allen himself might say, it depends.) On some level, we respond to an actor playing the piano much like the fans of Doctor Zhivago, whom Pauline Kael devastatingly called “the same sort of people who are delighted when a stage set has running water or a painted horse looks real enough to ride.” But it can serve the story as much as it can detract from it, and the hard part is knowing how and when. As one director notes:

Anybody can learn how to play the piano. For some people it will be very, very difficult—but they can learn it. There’s almost no one who can’t learn to play the piano. There’s a wide range in the middle, of people who can play the piano with various degrees of skill; a very, very narrow band at the top, of people who can play brilliantly and build upon a technical skill to create great art. The same thing is true of cinematography and sound mixing. Just technical skills. Directing is just a technical skill.

This is Mamet writing in On Directing Film, which is possibly the single best work on storytelling I know. You might not believe him when he says that directing is “just a technical skill,” but if you do, there’s a simple way to test if you have it. Do you show the piano player’s hands? If you know the right answer for every scene, you just might be a director.

Updike’s ladder

with 2 comments

In the latest issue of The Atlantic, the author Anjali Enjeti has an article titled “Why I’m Still Trying to Get a Book Deal After Ten Years.” If just reading those words makes your palms sweat and puts your heart through a few sympathy palpitations, congratulations—you’re a writer. No matter where you might be in your career, or what length of time you can mentally insert into that headline, you can probably relate to Enjeti when she writes:

Ten years ago, while sitting at my computer in my sparsely furnished office, I sent my first email to a literary agent. The message included a query letter—a brief synopsis describing the personal-essay collection I’d been working on for the past six years, as well as a short bio about myself. As my third child kicked from inside my pregnant belly, I fantasized about what would come next: a request from the agent to see my book proposal, followed by a dream phone call offering me representation. If all went well, I’d be on my way to becoming a published author by the time my oldest child started first grade.

“Things didn’t go as planned,” Enjeti says drily, noting that after landing and leaving two agents, she’s been left with six unpublished manuscripts and little else to show for it. She goes on to share the stories of other writers in the same situation, including Michael Bourne of Poets & Writers, who accurately calls the submission process “a slow mauling of my psyche.” And Enjeti wonders: “So after sixteen years of writing books and ten years of failing to find a publisher, why do I keep trying? I ask myself this every day.”

It’s a good question. As it happens, I came across her article while reading the biography Updike by Adam Begley, which chronicles a literary career that amounts to the exact opposite of the ones described above. Begley’s account of John Updike’s first acceptance from The New Yorker—just months after his graduation from Harvard—is like lifestyle porn for writers:

He never forgot the moment when he retrieved the envelope from the mailbox at the end of the drive, the same mailbox that had yielded so many rejection slips, both his and his mother’s: “I felt, standing and reading the good news in the midsummer pink dusk of the stony road beside a field of waving weeds, born as a professional writer.” To extend the metaphor…the actual labor was brief and painless: he passed from unpublished college student to valued contributor in less than two months.

If you’re a writer of any kind, you’re probably biting your hand right now. And I haven’t even gotten to what happened to Updike shortly afterward:

A letter from Katharine White [of The New Yorker] dated September 15, 1954 and addressed to “John H. Updike, General Delivery, Oxford,” proposed that he sign a “first-reading agreement,” a scheme devised for the “most valued and most constant contributors.” Up to this point, he had only one story accepted, along with some light verse. White acknowledged that it was “rather unusual” for the magazine to make this kind of offer to a contributor “of such short standing,” but she and Maxwell and Shawn took into consideration the volume of his submissions…and their overall quality and suitability, and decided that this clever, hard-working young man showed exceptional promise.

Updike was twenty-two years old. Even now, more than half a century later and with his early promise more than fulfilled, it’s hard to read this account without hating him a little. Norman Mailer—whose debut novel, The Naked and the Dead, appeared when he was twenty-five—didn’t pull any punches in “Some Children of the Goddess,” an essay on his contemporaries that was published in Esquire in 1963: “[Updike’s] reputation has traveled in convoy up the Avenue of the Establishment, The New York Times Book Review, blowing sirens like a motorcycle caravan, the professional muse of The New Yorker sitting in the Cadillac, membership cards to the right Fellowships in his pocket.” And Begley, his biographer, acknowledges the singular nature of his subject’s rise:

It’s worth pausing here to marvel at the unrelieved smoothness of his professional path…Among the other twentieth-century American writers who made a splash before their thirtieth birthday…none piled up accomplishments in as orderly a fashion as Updike, or with as little fuss…This frictionless success has sometimes been held against him. His vast oeuvre materialized with suspiciously little visible effort. Where there’s no struggle, can there be real art? The Romantic notion of the tortured poet has left us with a mild prejudice against the idea of art produced in a calm, rational, workmanlike manner (as he put it, “on a healthy basis of regularity and avoidance of strain”), but that’s precisely how Updike got his start.

Begley doesn’t mention that the phrase “regularity and avoidance of strain” is actually meant to evoke the act of defecation, but even this provides us with an odd picture of writerly contentment. As Dick Hallorann says in The Shining, the best movie about writing ever made: “You got to keep regular, if you want to be happy.”

If there’s a larger theme here, it’s that the qualities that we associate with Updike’s career—with its reliable production of uniform hardcover editions over the course of five decades—are inseparable from the “orderly” circumstances of his rise. Updike never lacked a prestigious venue for his talents, which allowed him to focus on being productive. Writers whose publication history remains volatile and unpredictable, even after they’ve seen print, don’t always have the luxury of being so unruffled, and it can affect their work in ways that are almost subliminal. (A writer can’t survive ten years of waiting for a book deal without spending the entire time convinced that he or she is on the verge of a breakthrough, anticipating an ending that never comes, which may partially explain the literary world’s fondness for frustration and unresolved narratives.) The short answer to Begley’s question is that struggle is good for a writer, but so is success, and you take what you can get, even if you’re transformed by it. I seem to think on a monthly basis of what Nicholson Baker writes of Updike in his tribute U and I:

I compared my awkward public self-promotion too with a documentary about Updike that I saw in 1983, I believe, on public TV, in which, in one scene, as the camera follows his climb up a ladder at his mother’s house to put up or take down some storm windows, in the midst of this tricky physical act, he tosses down to us some startlingly lucid little felicity, something about “These small yearly duties which blah blah blah,” and I was stunned to recognize that in Updike we were dealing with a man so naturally verbal that he could write his fucking memoirs on a ladder!

We’re all on that ladder. Some are on their way up, some are headed down, and some are stuck for years on the same rung. But you never get anywhere if you don’t try to climb.

The MAYA prophecy

with 3 comments

Raymond Loewy on the cover of Time Magazine

In this month’s issue of The Atlantic, there’s an excerpt from the upcoming book Hit Makers: The Science of Popularity in an Age of Distraction. Its author, Derek Thompson, argues that success in a wide range of fields results from what researchers have called “optimal newness”—a degree of innovation that is advanced enough to be striking, but also just familiar enough to be accessible. Thompson illustrates his point with numerous examples, from plot formulas in prestige dramas to chord progressions in popular music, and he notes that science and business are vulnerable to it as well:

In 2014, a team of researchers from Harvard University and Northeastern University wanted to know exactly what sorts of proposals were most likely to win funding from prestigious institutions such as the National Institutes of Health—safely familiar proposals, or extremely novel ones? They prepared about 150 research proposals and gave each one a novelty score…The most-novel proposals got the worst ratings. Exceedingly familiar proposals fared a bit better, but they still received low scores. “Everyone dislikes novelty,” Karim Lakhani, a co-author, explained to me, and “experts tend to be overcritical of proposals in their own domain.” The highest evaluation scores went to submissions that were deemed slightly new. There is an “optimal newness” for ideas, Lakhani said—advanced yet acceptable.

Thompson frames his argument with a consideration of the career of the industrial designer Raymond Loewy, who summed up the principle with the acronym MAYA: “Most Advanced Yet Acceptable.” And while this may seem like a tautology—an innovation is acceptable until it isn’t—it’s worth scrutinizing more closely. When we look at business or the arts, we find that they often reward the simulation of innovation, which pushes all the right buttons for novelty while remaining fundamentally conventional. The result can be an entire culture that regards itself as innovative while really only repeating an endless cycle of the same clichés. You can see this clearly in technology, of which Thompson writes:

In Silicon Valley, where venture capitalists also sift through a surfeit of proposals, many new ideas are promoted as a fresh spin on familiar successes. The home-rental company Airbnb was once called “eBay for homes.” The on-demand car-service companies Uber and Lyft were once considered “Airbnb for cars.” When Uber took off, new start-ups began branding themselves “Uber for [anything].”

And when every company is talking about “disruption,” it implies that very little is being disrupted at all—especially when the startups in question are inclined to hire people who look just like the founders.

Charles Atlas

Not surprisingly, I found myself applying this observation to the history of science fiction. When you look at Astounding in the golden age through the lens of “optimal newness,” you find that it fits the definition pretty well. John W. Campbell was famously conservative in many respects, and he was wary of directly engaging such subjects as sex and religion. In 1939, when he first read Robert A. Heinlein’s “If This Goes On—,” he loved it, but he also noted in a letter to a friend that it was “too hot to handle,” and that the references to religion had to be carefully edited. Decades later, he was still saying the same thing—the phrase “too hot to handle” recurs repeatedly in his correspondence. Campbell had a fixed idea of how much change his readers would tolerate. As he later wrote to his father: “[Astounding] is carefully expurgated to suit the most prudish—while I’m busy sawing away at the piling on which the whole crazy structure is resting.” And his caution is visible in other ways. When he took over Astounding, it was still basically a pulp magazine, and in order to retain his readership, he couldn’t depart too far from the original model. Instead, he tweaked it in small but significant ways. Instead of the usual hypermasculine heroes, he introduced a new kind of character—the “competent man” who solved all of his problems using logic and engineering. But he was still a white male. It doesn’t seem to have occurred to Campbell that he could be anything else. And as late as 1967, he was still saying that he didn’t think his readers could accept a black protagonist.

And he had good reasons for believing this. One of the first things that everyone notices about the fans of this era—who were a small subset, but a highly visible one, of the readership as a whole—is that many of them were outsiders. They were poor, sickly, unathletic, and sexually inexperienced, and they craved stories that told them that they could become something more. Like Charles Atlas, whose ads were inescapable in the magazine’s pages, Campbell was selling a vision of transformation, which said that you, too, could become a superman if you worked hard enough at it. And he was telling the truth. The sense of otherness that many young science fiction fans experienced was a temporary one, inseparable from the hell of adolescence, and most of them grew up to become productive members of society. They were competent men in larval form. But the genre had less to say to fans who were set apart by qualities that couldn’t merely be outgrown, like race, gender, or sexuality. Like Silicon Valley, it was pitched to appeal to a particular kind of outcast, and this was reflected in the heroes that it celebrated. As long as the formula remained intact, you could do pretty much as you liked. But it also limited the kinds of stories that could be told, to the point where it created a self-fulfilling prophecy about the writers who were drawn to it. The planets were exotic, but the faces were familiar. It’s no secret that science fiction has always tended to take the approach of optimal newness, and that it innovated within acceptable boundaries. But acceptable to whom?

Written by nevalalee

December 14, 2016 at 9:31 am

The two faces of art

with 2 comments

William Matthews

The situation of the arts is two-faced. One face is the face of equal opportunity. Everybody gets a try: equal opportunity. To that face there should be no guardian at the door—it’s open admissions. The other face is the face that deals not with opportunity and hope but with the quality of the actual work produced and the extremely high standards that are required to sort out the very most enduring and emotionally useful work from the next level down, and the many other levels below that. That doesn’t require a guardian. It requires time, which sorts these things out, cruelly, but with a terrible efficiency.

Both faces are required, particularly for an American. We must try to live up to the glorious rhetoric of the founding fathers and mothers of our country and say, Listen, everybody gets a shot at this, nobody is excluded from it, there’s nobody at the door. Later on, time is at the door, erosion is at the door, forgetfulness is at the door, oblivion is at the door. These are worse than any three-headed dog ever.

William Matthews, to The Atlantic

Written by nevalalee

August 13, 2016 at 7:30 am
