Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


“In the lights of the cameras…”


"Shortly after midnight..."

Note: This post is the fifty-seventh installment in my author’s commentary for Eternal Empire, covering Chapter 56. You can read the previous installments here.

Few creative choices are so central to the writing process as the selection of a point of view, but it’s often a haphazard, instinctive decision. Unless you’re working in an overtly experimental mode, you’re usually stuck with the first or third person, which isn’t as straightforward as it sounds. It helps to visualize your set of options as a scatter plot, with the dots growing denser around two blobs that we call the first- and third-person point of view—although the boundaries are fuzzy, and there’s a wide range of possibilities within each category. When a writer begins a story, he or she usually selects a point of view from the start, but it’s only in the act of writing itself that the style settles into a particular spot on the spectrum, which can be further refined at the revision stage. The first person is slightly more limited in scope, which is why an author like Henry James, who called it “the darkest abyss of romance,” could claim that it was inherently unsuited to the novel. But it clearly has its uses, and there are even signs that genre readers have come to prefer it. It opens up delicious possibilities for unreliable narrators, who threaten to become a cliché in themselves, and the intimacy that it creates, even if it’s an illusion, can encourage a greater identification with the protagonist. (Hence the fact that the Nancy Drew series switched from the third person to the first about a decade ago, which feels like a sign of the times.)

I made the choice long ago to write my fiction in the third person, and it has remained pretty much in place for everything I’ve published, in both short stories and novels. (The one exception is my story “Ernesto,” which is set to be reprinted soon by Lightspeed: I gave it a first-person narrator as an homage to Holmes and Watson and to discourage me from attempting a bad imitation of Hemingway.) By design, it’s a detached style: I never dip into interior monologue, and even strong emotions are described as objectively as possible. For the most part, I’m comfortable with this decision, although I’m also conscious of its limitations. As far as I can recall, I arrived at it as a form of constraint to keep certain unwanted tendencies in check: these novels are violent and sometimes implausible, and I developed a slightly chilly voice that I thought would prevent the action from becoming unduly hysterical or going out of control. I wanted it to be objective, like a camera, so that the reader would be moved or excited by events, rather than by the manner in which they were related. Looking back, though, I sometimes wish that I’d modified my approach to give me the option of going deeper into the protagonist’s thoughts when necessary, as Thomas Harris sometimes does. By keeping my characters at arm’s length, I’ve limited the kinds of stories I can tell, and while I don’t mind staying within that range, it also means that I didn’t devote time to developing skills that might be useful now.

"In the lights of the cameras..."

That said, I still prefer the third person over the first, and I especially like how it can be imperceptibly nudged in one direction or another to suit the demands of the story. This comes in handy when you’re writing what amounts, in places, to a mystery novel. When you’re working in the first person, it can be hard to conceal information from the reader without it feeling like a gimmick or a cheat—although a few authors, like Agatha Christie, have pulled it off brilliantly. The third person allows you to pull back or zoom in as necessary to manage the reader’s access to the plot, and when you’re working in an omniscient mode that allows you to move between characters at will, you can even cut away entirely. These tricks have been baked into the third person as we’ve come to accept it, so a reader, ideally, will accept such shifts without thinking. (It’s possible to take this kind of switching too far, of course, which is why I try to stick with a single point of view per chapter, and I’m never entirely happy with my attempts to cycle between characters within a single scene.) When an author’s style is inherently objective, we aren’t likely to notice if it retreats or advances a little, any more than it registers when a movie cuts from a medium long shot to a medium shot. And if I’ve remained faithful to that style, it’s largely because it’s more flexible than it seems, and its gradations don’t tend to call attention to themselves.

There’s a good functional example of this in Chapter 56 of Eternal Empire. The first two pages are unusual in that they’re effectively told from nobody’s point of view: they relate a series of events—the explosion of the shadow boat, the movements of reporters, the arrival of the evacuees on shore, and the withdrawal of three unidentified figures to a distant part of the quay—as if recounting them in a news dispatch. (In fact, this is literally what is happening: a big chunk of the section is described as if it were being seen by a viewer on a newscast. If I repeatedly mention the camera crews, it’s to provide an artificial viewpoint through which to narrate the action. The lack of a central character is disguised because a camera has taken its place, which isn’t a tactic that can be extended indefinitely, but it works well enough to get me to the second page.) The reason is obvious: I don’t want to reveal that the three men who have detached themselves from the crowd are Orlov, Ilya, and Tarkovsky, whose fate up to this point has been left up in the air. This wouldn’t work at all in the first person, and if it works here, it’s because I’ve established a style that allows, when the plot calls for it, for the removal of the characters entirely. Very little of this was conscious, but it was all built on a choice of tone that I made two novels earlier, on the hunch that it would lend itself to the kind of story I wanted to tell. A paragraph or two later, we’re back in Ilya’s head. And if I’ve pulled it off properly, the reader should never notice that we left it at all…

Written by nevalalee

June 30, 2016 at 9:07 am

Quote of the Day


Russell Edson

Any way of writing that isolates the writer from worldly acceptance offers the greatest creative efficiency. Isolation from other writers, and isolation from easy publishing. This gives one the terrible privacy so hard to bear, but necessary to get past the idea one has of oneself in relation to the world.

Russell Edson, “Portrait of the Writer as a Fat Man”

Written by nevalalee

June 30, 2016 at 7:30 am

Astounding Stories #12: “Izzard and the Membrane”


Izzard and the Membrane

Note: As I dive into the research process for my upcoming book Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction, I’ll be taking the opportunity to highlight works within the genre that deserve to be rediscovered, reappraised, or simply enjoyed by a wider audience. You can read the earlier installments here.

“The Internet is the great masterpiece of civilization,” Virginia Heffernan writes in her new book Magic and Loss, and whether or not you agree with her, it’s hard to deny its importance. It touches every aspect of our lives, at least in the parts of the world where it’s possible for you to read these words now, and any attempt to write about how we live today has to take it into account. For those who like to define science fiction as a predictive literature, its failure to collectively foresee the Internet in a meaningful way—in the sense that it devoted so much energy to such subjects as space travel—is perhaps the genre’s greatest cause for regret. You could say, fairly enough, that it’s easy to point out such shortcomings in hindsight, or even that science fiction’s true strength doesn’t lie in prediction, but in preparing its readers for developments that none of us can see coming. But there’s no denying that the absence of anything like the Internet in the vast majority of science fiction has enormous practical consequences. It means that most visions of the future are inevitably dated, and that we need to continuously suspend disbelief to read stories about galactic empires in which computers or information technology don’t play any part at all. (In some ways, the internal logic of Dune, in which thinking machines have been outlawed, has allowed it to hold up in respects that Frank Herbert himself probably never anticipated.)

Of course, in a literature that constantly spun out wild notions in all directions, there were a few stories that were bound to seem prescient, if only by the law of truly large numbers. The idea of a worldwide machine that runs civilization—and the problems that an ordinary mortal would have in dealing with it—was central to R. DeWitt Miller’s “The Master Shall Not Die,” which was published in 1938. Eight years later, A.E. van Vogt’s visionary novel Slan showed its hero interacting through a computer with a Bureau of Statistics that put “a quadrillion facts” at his disposal. Most impressive of all is Will Jenkins’s “A Logic Named Joe,” which appeared a short time earlier: Jenkins, better known under the pen name Murray Leinster, built the story around an interlinked computer network that can answer any conceivable question, and which has already replaced most of the world’s filing clerks, secretaries, and messenger services. When one of the computers accidentally develops “ambition,” it gleefully provides users with advice on how to murder their wives, shows dirty videos to children, and makes suggestions for other illegal queries they might want to ask. (When faced with the prospect of simply turning the system off, a character objects: “If we shut off logics, we go back to a kind of civilization we have forgotten how to run!”) It not only looks forward with eerie accuracy to the Internet, but speculates about what might come next. And yet the clues it provided went mostly unexplored.

But the story that fills me with the most awe is “Izzard and the Membrane” by Walter M. Miller, Jr., which was published in the May 1951 issue of Astounding. Miller is best known today as the author of A Canticle for Leibowitz, but he was also a prolific author of short fiction, and in a single novelette, he manages to lay out most of the concerns of the contemporary transhumanist movement. It’s about an American cyberneticist who has developed an innovative synaptic relay system—a neural network, in other words—that can be used to build a gigantic computer. After being kidnapped by the Russians, who break his will by showing him faked footage of his wife having an affair, he agrees to build a machine for them, called Izzard, that can analyze itself and suggest improvements to its own architecture. Izzard is designed to oversee the coming invasion of the United States, but it also becomes self-aware and develops a method, not just for reproducing attributes of consciousness, but for uploading an existing brain into its data banks. The hero uses it to replicate his wife, who has died, along with himself, so that his soul merges with its image in the machine. Once inside, he gradually becomes aware of another presence, who turns out to be a member of a race that has achieved transcendence already, and which is closely monitoring his work. In the end, he uses his newfound powers to foil the invasion, and he’s reunited with his wife in a virtual simulation, via a portal called the membrane, that allows him to start a new life in the universe inside his own mind.

The result is one of my ten favorite science fiction stories of all time, and not simply because it predicts a dazzling array of issues—the singularity, mind uploading, simulated reality—that seem to have entered the mainstream conversation only in the last decade or so. It’s also an exciting read, full of action and ingenious plot twists, that takes more than one reading to appreciate. Yet like “A Logic Named Joe,” it was an outlier: it doesn’t seem to have inspired other writers to take up its themes in any significant way. To some extent, that’s because it carries its premise about as far as it could possibly go, and if any story can be truthfully described as ahead of its time, it’s this one. But it’s intriguing to think about an alternative direction that science fiction might have taken if “Izzard and the Membrane” had served as the starting point for a line of speculation that the authors of the time had collaborated in developing, with some of the enthusiasm that the editor John W. Campbell devoted instead to channeling the energies of his writers into psionics. It might not have affected the future directly: in some ways, we’re still catching up to the vision that Miller provides here. But we might be better prepared to confront the coming challenges if we had absorbed them as part of the common language of science fiction over the last sixty years. “The future,” William Gibson famously observed, “is already here—it’s just not very evenly distributed.” And that’s true of science fiction, too.

Quote of the Day


Donald Hall

Any quality of poetry can be used for a number of purposes, including opposed purposes. Thus, concentration on technique has often been used to trivialize content, by poets afraid of what they will learn about themselves. But concentration on technique can absorb the attention while unacknowledged material enters the language; so technique can facilitate inspiration.

Donald Hall, Breakfast Served Any Time All Day

Written by nevalalee

June 29, 2016 at 7:30 am

Memoirs of an invisible writer


Without You, There is No Us

Earlier this week, the author Suki Kim wrote an article for The New Republic titled “The Reluctant Memoirist.” It relates how Kim landed a contract to write a nonfiction book about the privileged youth of the upper classes in North Korea, which she researched by going undercover as an English-language teacher at a university in Pyongyang. She called the result Without You, There is No Us: My Time With the Sons of North Korea’s Elite, but when she saw the cover design, she was surprised to see two additional words under the title: A Memoir. Here’s her account of what happened next:

I immediately emailed my editor. “I really do not feel comfortable with my book being called a memoir,” I told her…My editor would not budge. She noted that my book was written in the first person—a device I had employed, like many journalists, to provide a narrative framework for my reporting. To call it journalism, she argued, would limit its potential readership…I tried to push back. “This is no Eat, Pray, Love,” I argued during a phone call with my editor and agent.

“You only wish,” my agent laughed.

Kim says: “But that was the whole point. I did not wish that my book were Eat, Pray, Love…It was a subtle shift, but one familiar to professional women from all walks of life. I was being moved from a position of authority—What do you know?—to the realm of emotion: How did you feel?” And the response, when her “memoir” was released, was much as she had feared. Instead of placing her book in the tradition of such journalists as Ted Conover and Barbara Ehrenreich, which is where it clearly belonged, reviewers read it as an example of the memoir genre, leading to concerns about the author’s “deception.” Kim was labeled as an opportunist because she underwent her experience in order to write a book about it—a charge that could reasonably be leveled at every reporter ever. She found herself appearing on panels with memoirists, fielding questions about her personal growth, rather than about North Korea itself. It’s a story that raises a complicated web of questions about the commercial strategies of mainstream publishing, the way categories affect the way we approach a work of nonfiction, and even how we perceive authors based on gender and race, and it’s hard to disentangle any single factor from the rest. But I’d like to focus on one element in particular: the relationship between the author, the agent, and the editor, and how it seems to have failed in this case.

Suki Kim

I haven’t read Kim’s book, and I’m sure that her editor and agent have their own perspectives on the matter. But most writers who go the route of conventional publishing can probably understand the emotions that she expresses. Kim writes about how she had dreaded the moment when “I would have no control over my fate,” but she was surprised to find that it happened in New York, not Pyongyang. Writing of the struggle over the book’s subtitle, she concludes: “It soon became clear that this was a battle I could not win, and I relented.” Which is really an amazing statement, given the nature of the participants involved. The agent works for the author, a point that is forgotten so often that it seems worth italicizing. Similarly, the editor and the author both work for the publishing house: it isn’t a chain of command. Yet I also understand that perception. The entire process of selling a book to a commercial publisher encourages the author to feel like a supplicant. Most of us find an agent by submitting queries in hopes that just one will land an offer of representation, and even if we get it, the memory of the search can create a perceived power dynamic that persists in the face of all evidence. We pay our agents fifteen percent; they’re here to provide a service. And the obstacle race of finding an editor also obscures the fact that once a book is sold, it creates a partnership of equals, in which the author, if anyone, should have the last word about the packaging of his or her work.

But it’s easy to forget this, and having been through the process myself more than once, I can absolutely see why. In this instance, though, it wasn’t just a theoretical concern. The sense of dependence and obligation it enforces allowed Kim’s agent and editor to fall short in one area where they could have been of real value: as the author’s protectors. Kim recognized that it was a bad call, but she didn’t feel empowered to change it, and even if other issues were involved, the dynamic certainly didn’t help. (Which isn’t to say that editorial feedback on the highest level can’t be valuable. When I went out with the pitch for Astounding, I explicitly sold it as a biography of John W. Campbell, but it was my editor who suggested that the scope be expanded to include other authors from the same era. I was the one, in turn, who brought up the names of Asimov, Heinlein, and Hubbard, and everybody was pleased with the result. That’s how it’s supposed to work.) And Kim’s example is one that every writer should remember. These forces, invisible to most readers, affect every book that we see: the title, the cover, the jacket copy, and its positioning in the marketplace are all conscious decisions, and they don’t always reflect the writer’s wishes or best interests. Sometimes they do, but only when those choices emerge from an atmosphere of trust. The writer often has to fight for it, and the structures of publishing and agenting don’t make it easy. But the author’s voice deserves to take precedence over the agent and the editor—because without us, there is no them.

Written by nevalalee

June 28, 2016 at 9:31 am

Quote of the Day


W.D. Snodgrass

Although the abstract words—truth, justice, happiness, democracy, love, kindness, etc.—are usually dull, that is because they are normally used to narrow the field of vision, to keep people from seeing. There is no reason they cannot be used to widen vision, if the writer is either more honest or more capable of abstract thought than most of his culture is. It is not impossible to be interesting when talking about ideas or when using ideational language; it is merely improbable. The poet’s chosen vocation is to try something improbable.

W.D. Snodgrass, “Tact and the Poet’s Force”

Written by nevalalee

June 28, 2016 at 7:30 am

Brexit through the gift shop


Art by Damien Hirst

Last week, as the world struggled to comprehend the scope of the Brexit disaster, I tweeted: “On the bright side, this is all going to make a great ironic counterpoint for the protagonist’s midlife crisis in the next Ian McEwan novel.” I was joking, of course, but the more I think about the idea, the more I like it. (It’s certainly better than the crack that I originally thought about making: “I can’t wait for Peter Morgan’s next play.”) If McEwan’s novels have one central theme, it’s how a single instant—whether it’s an impulsive decision or an act of random violence—can have unpredictable repercussions that continue to echo for years. In Atonement, Briony’s lie, which she invents on the spur of the moment, ruins at least three lives, and she atones for it only in her imagination. A similar web of unforeseeable consequences is bound to unfold from Brexit, which in itself was the result of a much less dramatic decision, apparently reached over pizza at the Chicago airport near my house, to hold a vote on exiting the European Union. What was conceived as a tactical move to stave off an internal political scuffle may lead to the final dissolution of the United Kingdom itself. Forget about McEwan: Frederick Forsyth wouldn’t have dared to use this as a plot point, although he’s probably kicking himself for not having thought of it first.

As it happens, McEwan recently made his own feelings clear, in an opinion piece written earlier this month for the Daily Mail. He wrote: “My fear is that a Brexit will set in train a general disentanglement, and Europe will confront in time all its old and terrifying ghosts…The ahistorical, spoiled children of the EU’s success are pushing us towards a dangerous unraveling.” These two sentences sound a lot like a rehearsal for potential titles: A General Disentanglement and A Dangerous Unraveling would both look great on a book cover, and either one would have worked fine for Enduring Love, which is about nothing less than a long disentanglement in the aftermath of a single bewildering event. McEwan’s favorite trick in recent years, in novels from Saturday to Solar, has been to use global events to comment sardonically on the main character’s inner life, as refracted through the protagonist’s skewed perception of the headlines, which he can’t help but see through the lens of his personal issues. It’s a technique that McEwan picked up from John Updike and the Rabbit series, one of which provides the epigraph to Solar, although the result is a little more studied in his hands than it is in the master’s. Brexit itself is the ultimate McEwan allegory: it’s the perfect parallel to a middle-aged affair, say, in which a moment of passionate folly is followed by a conscious uncoupling. If Brexit hadn’t existed, McEwan might have had to invent it.

Ian McEwan

Brexit is undeniably heartbreaking, and nothing good is likely to come of it, but there’s also a part of me that is anticipating the reaction from artists ranging from Zadie Smith to the Pet Shop Boys to Banksy. Even as the art market worries about the effect on auctions, this is a signal moment for artists in the United Kingdom, and the defining event for a generation of writers. There will be plenty of attempts to confront it directly, but its indirect impact is likely to be even more intriguing, as Norman Mailer noted about a different sort of trauma:

If you never write about 9/11 but were in the vicinity that day, you could conceivably, in time to come, describe a battle in a medieval war and provide a real sense of such a lost event. You could do a horror tale or an account of a plague. Or write about the sudden death of a beloved. Or a march of refugees…What won’t always work is to go at it directly. That kind of writing can be exhausted quickly. And the temptation to drive in head-on is, of course, immense—the event was so traumatic to so many.

We’re entering an era of indefinable uneasiness, which is an environment that inevitably produces memorable artistic reactions. It won’t exactly compensate us for the real economic and social dislocations that are bound to come, but two years of uncertainty and fear followed by a generation of malaise are ideal conditions for nurturing notable careers.

And there’s a genuine opportunity now for the reportage that the novel, in particular, does best: it can capture amorphous social forces and crystallize them in a conflict between a couple of characters, or, even better, in the conflict within one character’s heart. It may take years before the real causes and effects of the Brexit vote become clear, and the novel, with its ability to brood over a subject for long enough that the overall shape appears in a kind of time-lapse photography, is in a unique position to bring us the real news, even if it isn’t for a while. And David Cameron will probably inspire a few novels of his own. I can’t think of another example in recent history in which a seemingly trivial decision so utterly transformed a public figure’s legacy. Before Brexit, Cameron had served a relatively uneventful term of office that seemed destined to be forgotten within a decade or two; now he’s the man who gambled, lost, and threatened the stability of his own country toward the end of his monarch’s reign. (When I think of Queen Elizabeth, who just celebrated what was supposed to be a triumphant ninetieth birthday, I’m reminded of what Solon said to King Croesus: you don’t know whether a life was happy or not until it’s over.) It has stamped Cameron into the imagination of the public forever, even if it isn’t for the reasons he would have liked. We’re going to see works of art about this man’s inner life. Colin Firth just needs to put on about twenty pounds, and then we’ll be ready to go.

Written by nevalalee

June 27, 2016 at 9:09 am

Posted in Books, Writing


Quote of the Day


Jacques Barzun

The magic of the word “creative” is so broad that no distinct meaning need be attached to it; it fits all situations, pointing to nothing in particular…Creativity has become what divine grace and salvation were to former times. It is incessantly invoked, praised, urged, demanded, hoped for, declared achieved, or found lacking.

Jacques Barzun, “The Paradoxes of Creativity”

Written by nevalalee

June 27, 2016 at 7:30 am

The music of correspondences


Denise Levertov

Reverence for life, if it is a necessary relationship to the world, must be so for all people, not only for poets. Yes; but it is the poet who has language in his care; the poet who more than others recognizes language as a form of life and a common resource to be cherished and served as we should serve and cherish earth and its waters, animal and vegetable life, and each other. The would-be poet who looks on language merely as something to be used, as the bad farmer or the rapacious industrialist looks on the soil or on rivers merely as things to be used, will not discover a deep poetry; he will only, according to the degree of his skill, construct a counterfeit more or less acceptable—a subpoetry, at best efficiently representative of his thought or feeling—a reference, not an incarnation. And he will be contributing, even if not in any immediately apparent way, to the erosion of language, just as the irresponsible, irreverent farmer and industrialist erode the land and pollute the rivers. All of our common resources, tangible or intangible, need to be given to, not exclusively taken from. They require the care that arises from intellectual love—from an understanding of their perfections.

Moreover, the poet’s love of language must, if language is to reward him with unlooked-for miracles, that is, with poetry, amount to a passion. The passion for things of the world and the passion for naming them must be in him indistinguishable. I think that Wordsworth’s intensity of feeling lay as much in his naming of the waterfall as in his physical apprehension of it, when he wrote:

…The sounding cataract
Haunted me like a passion…

The poet’s task is to hold in trust the knowledge that language, as Robert Duncan has declared, is not a set of counters to be manipulated, but a Power. And only in this knowledge does he arrive at music, at that quality of song within speech which is not the result of manipulations of euphonious parts but of an attention, at once to the organic relationships of experienced phenomena and to the latent harmony and counterpoint of language itself as it is identified with those phenomena. Writing poetry is a process of discovery, revealing inherent music, the music of correspondences, the music of inscape. It parallels what, in a person’s life, is called individuation: the evolution of consciousness toward wholeness, not an isolation of intellectual awareness but an awareness involving the whole self, a knowing (as man and woman “know” one another), a touching, a “being in touch.”

Denise Levertov, “Origins of a Poem”

Written by nevalalee

June 26, 2016 at 7:30 am

The Devil and Georges Cuvier


Georges Cuvier

They tell a story about [paleontologist Georges] Cuvier to illustrate his confidence in [the] Law of Correlation. It seems that one of his students, who desired to give the Maître a scare, disguised himself as the devil, with the usual horns and hoofs and barb-tipped tail. He penetrated at midnight to Cuvier’s room and, standing by his bedside, roused him from sleep with the announcement, “Cuvier, Cuvier, wake up! I am the Devil and am come to eat you up.” The scientist gazed at him sleepily, looked him over for a moment, and replied, “Hmm—horns—hoofs—you’re graminivorous. You can’t do it.” Whereupon he turned over and went to sleep again and the student retired discomfited.

William Diller Matthew, in The American Museum Journal

Written by nevalalee

June 25, 2016 at 7:30 am

The pianist and the astronaut


Philippe Entremont

“I would try to discourage all but the very gifted from going to the conservatory today,” the pianist Philippe Entremont once said in an interview, “because the competition is very fierce.” He continued:

If young artists in the conservatory realized what they were up against, I am sure they would do something else, right away. Because the average piano student at the conservatory has about as much chance of becoming an internationally known pianist as being the President of the United States.

Entremont said this more than thirty years ago, in a conversation with the pianist and author David Dubal in Reflections from the Keyboard, and if anything, his warning seems even more relevant today. Elsewhere in the same book, which I recently picked up on a whim, another pianist estimates the chances of professional survival at something like one in ten thousand—which, while it isn’t quite as unlikely as becoming the next president, is remote enough that the comparison isn’t totally inappropriate. For obvious reasons, I’ve long been fascinated by the mentality that allows people to irrationally pursue careers in which they have almost no chance of succeeding, and the career of a concert pianist, even more than that of a novelist or ballerina, feels like the ultimate example of a profession that continues to exist only because so many music students refuse to accept the odds. (This applies to their parents as well: the typical pianist has been practicing since the age of six.)

Last weekend, I brought this up with a friend of mine whose perspective is particularly interesting. He’s an astrophysicist at Fermilab, a published science author, and a founding member of a popular Chicago soul band, which means that he knows something about the role of talent, intelligence, and luck in three very different fields. We talked about the difficulty of threading the needle when it comes to publishing a book or succeeding as a musician, and after mentioning the dilemma of the concert pianist, I said something like: “Every kid wants to be an astronaut, but how many make it that far?” My friend responded with what I thought was a remarkably insightful point: “The difference between a pianist and an astronaut is that if you don’t succeed at the latter, your consolation prize is a really good job.” He’s right, of course. We aren’t talking about the people who dream of going into space but lack the skills or ability to do so, but the ones who are smart, driven, and qualified, but who didn’t quite make it because of factors outside their control. After a certain point, the competition in any desirable field comes down to an arbitrary selection between candidates who are all but indistinguishable in terms of qualifications. Tiny external variables become disproportionately more important, and those who fail to make it to the astronaut level are still left with skills that will allow them to do pretty much whatever else they want.

The piano on the International Space Station

This isn’t true of a lot of other dream jobs, which can leave their aspirants looking like Frederik Pohl’s fiddler crabs, with nothing to show for all their efforts but one huge, overdeveloped claw. The skills acquired in the pursuit of a career as a pianist or ballerina aren’t readily transferable, except perhaps to teaching, and they may even make it more difficult to move into another profession later on. You could argue, in fact, that a truly rational actor would choose his or her goals based on that principle of transferability: you want to aim as high as you can, but in a field in which falling just short at the final stage still leaves you with viable options. (It’s unclear to me, incidentally, how this applies to writers, and in particular to novelists. There’s no doubt that writing a publishable novel leaves you with skills that could, in theory, be applied elsewhere: you can write a clear sentence, structure complex ideas, take a project to completion over a period of many months, work productively in solitude, and keep both granular detail and the big picture in view. Yet the way in which these skills express themselves is often absurdly specialized: writing a novel is so different from any other human activity that it doesn’t lend a clear advantage to most other forms of work, especially at a time when nearly every category of media suffers from an oversupply of qualified writers. It also leaves a glaring hole in your résumé, and it can take you out of the workforce for years. And the fact that so many writers, like pianists or ballerinas, turn to teaching implies that their skills aren’t so transferable after all.)

That said, almost nobody thinks in those terms at the point in his or her life when these decisions really matter. We all lack perspective at precisely the moment when we could use it the most. (As Joan Didion said: “One of the mixed blessings of being twenty and twenty-one and even twenty-three is the conviction that nothing like this, all evidence to the contrary notwithstanding, has ever happened to anyone before.”) On a cultural level, if not an individual one, there’s an advantage to encouraging a degree of irrational optimism at an early age. Otherwise, nobody would ever try for a career in the arts, few of which confer any appreciable advantage, in practical terms, to those who drop out of the game. It’s possible that the most “successful” group of people, on average, consists of those who start out with unrealistic ambitions, use those goals to build discipline and achievement in adolescence, and then transfer out of those fields before that kind of tunnel vision has a chance to do lasting harm. If I’d given up on the idea of being a writer at age twenty, I’d still have acquired a set of skills and habits that would have allowed me to do just fine at something else—as it did, more or less, in the years before I decided to make an effort to write for a living. I’m still confident that I made the right choice, but there were a few close calls along the way, and a writer’s life consists largely of postponing the moment of reckoning. If I’d been more practical, I’d have taken on just enough ambition to inoculate myself with it, and then moved on. But I never would have made it into orbit.

Written by nevalalee

June 24, 2016 at 9:16 am

Quote of the Day


John von Neumann

The sound procedure [in scientific progress] is to obtain first utmost precision and mastery in a limited field, and then to proceed to another, somewhat wider one, and so on.

John von Neumann, The Theory of Games and Economic Behavior

Written by nevalalee

June 24, 2016 at 7:30 am

“If she was going to run, it had to be now…”


"Maddy only nodded..."

Note: This post is the fifty-sixth installment in my author’s commentary for Eternal Empire, covering Chapter 55. You can read the previous installments here.

In general, an author should try to write active protagonists in fiction, for much the same reason that it’s best to use the active voice, rather than the passive, whenever you can. It isn’t invariably the right choice, but it’s better often enough that it makes sense to use it when you’re in doubt—which, when you’re writing a story, is frankly most of the time. In The Elements of Style, Strunk and White list the reasons why the active voice is usually superior: it’s more vigorous and direct, it renders the writing livelier and more emphatic, and it often makes the sentence shorter. It’s a form of insurance that guards against some of the vices to which writers, even experienced ones, are prone. There are few stories that wouldn’t benefit from an infusion of force, and since our artistic calculations are always imprecise, a shrewd writer will do what he or she can to err on the side of boldness. This doesn’t mean that the passive voice doesn’t have a place, but John Gardner’s advice in The Art of Fiction, as usual, is on point:

The passive voice is virtually useless in fiction…Needless to say, the writer must judge every case individually, and the really good writer may get away with just about anything. But it must be clear that when the writer makes use of the passive he knows he’s doing it and has good reason for what he does.

And most of the same arguments apply to active characters. All else being equal, an active hero or villain is more engaging than a passive victim of circumstance, and when you’re figuring out a plot, it’s prudent to construct the events whenever possible so that they emerge from the protagonist’s actions. (Or, even better, to come up with an active, compelling central character and figure out what he or she would logically do next.) This is the secret goal behind the model of storytelling, as expounded most usefully by David Mamet in On Directing Film, that conceives of a plot as a series of objectives, each one paired with a concrete action. It’s designed to maintain narrative clarity, but it also results in characters who want things and who take active measures to attain them. When I follow the slightly mechanical approach of laying out the objectives and actions of a scene, one beat after another, it gives the story a crucial backbone, but it also usually leads to the creation of an interesting character, almost by accident. If nothing else, it forces me to think a little harder, and it ensures that the building blocks of the story itself—which are analogous, but not identical, to the sentences that compose it—are written in the narrative equivalent of the active voice. And just as the active voice is generally preferable to the passive voice, in the absence of any other information, it’s advisable to focus on the active side when you aren’t sure what kind of story you’re writing: in the majority of cases, it’s simply more effective.

"If she was going to run, it had to be now..."

Of course, there are times when passivity is an important part of the story, just as the passive voice can be occasionally necessary to convey the ideas that the writer wants to express. The world is full of active and passive personalities, and of people who don’t have control over important aspects of their lives, and there’s a sense in which plots—or genres as a whole—that are built around action leave meaningful stories untold. This is true of the movies as well, as David Thomson memorably observes:

So many American films are pledged to the energy that “breaks out.” Our stories promote the hope of escape, of beginning again, of beneficial disruptions. One can see that energy—hopeful, and often damaging, but always romantic—in films as diverse as The Searchers, Citizen Kane, Mr. Smith Goes to Washington, Run of the Arrow, Rebel Without a Cause, Vertigo, Bonnie and Clyde, Greed, and The Fountainhead. No matter how such stories end, explosive energy is endorsed…Our films are spirals of wish fulfillment, pleas for envy, the hustle to get on with the pursuit of happiness.

One of the central goals of modernist realism has been to give a voice to characters who would otherwise go unheard, precisely because of their lack of conventional agency. And it’s a problem that comes up even in suspense: a plot often hinges on a character’s lack of power, less as a matter of existential helplessness than because of a confrontation with a formidable antagonist. (A conspiracy novel is essentially about that powerlessness, and it emerged as a subgenre largely as a way to allow suspense to deal with these issues.)

So how do you tell a story, or even write a scene, in which the protagonist is powerless? A good hint comes from Kurt Vonnegut, who wrote: “I don’t praise plots as accurate representations of life, but as ways to keep readers reading. When I used to teach creative writing, I would tell the students to make their characters want something right away—even if it’s only a glass of water. Characters paralyzed by the meaninglessness of modern life still have to drink water from time to time.” This draws a useful distinction, I think, between the two functions of the active mode: as a reflection of reality and as a tool to structure the reader’s experience. You can use it in the latter sense even in stories or scenes in which helplessness is the whole point, just as you can use the active voice to increase the impact of prose that is basically static or abstract. In Chapter 55 of Eternal Empire, for example, Maddy finds herself in as vulnerable a position as can be imagined: she’s in the passenger seat of a car being driven by a woman who, she has just realized, is her mortal enemy. There isn’t much she can plausibly do to defend herself, but to keep her from becoming entirely passive, I gave her a short list of actions to perform: she checks her pockets for potential weapons, unlocks the door on her side as quietly as she can, and looks through the windshield to get a sense of their location. Most crucially, at the moment when it might be possible to run, she decides to stay where she is. The effect is subtle, but real. Maddy isn’t in control of her situation, but she’s in control of herself, and I think that the reader senses this. And it’s in scenes like this, when the action is at a minimum, that the active mode really pays off…

Quote of the Day


Written by nevalalee

June 23, 2016 at 7:30 am

Astounding Stories #11: The Moon is Hell


The Moon is Hell

Note: As I dive into the research process for my upcoming book Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction, I’ll be taking the opportunity to highlight works within the genre that deserve to be rediscovered, reappraised, or simply enjoyed by a wider audience. You can read the earlier installments here.

On May 11, 1953, the science fiction editor John W. Campbell wrote a long letter to his stepmother Helen. He never mailed it, but it was preserved among his papers, and it’s a document of immense biographical interest. Campbell, who was chafing under what he saw as his father’s lack of appreciation for what he had achieved in his career, spent a full page listing his professional accomplishments, and he concluded:

My current plans are long-range; when I took over Astounding seventeen years ago, my plans were long range, too…The next step which literature must take is to develop a novel-like story in which the story shows the development of a culture through various experiences…Science fiction is now trying to develop the presentation techniques whereby an individual can understand and appreciate the developmental processes affecting entire cultures. Naturally, we haven’t completed the development of these techniques yet, and we have, in consequence, a rather patchy, unsuccessful literature. It’s like the first automobiles; they were less reliable, rougher riding, noisier, and smellier than the horse and buggy.

But their developmental stage was well worth the effort; their inadequacies in the early days were properly forgiven, but also properly recognized as inadequacies.

When I read these lines, I found myself thinking of Campbell’s novel The Moon is Hell, which first appeared in book form in 1951. It’s best remembered now as one of the very few stories that Campbell published in the three decades after he became the editor of Astounding Science Fiction. By all indications, it’s an apprentice work that was first written sometime in the early thirties, but it appears to have been carefully revised by its author before publication—the writing is far smoother and more accomplished than anything else Campbell was putting out at that stage. And the timing of its release was significant in itself. Science fiction was in a transitional moment: the impact of dianetics was just beginning to be felt, ambitious new competitors were appearing on newsstands, and authors like Heinlein were making their big push into the mainstream. For Campbell, it must have seemed like a good time for a statement of purpose, which is what The Moon is Hell really is—the quintessential hard science fiction novel, built from the ground up from first principles. As the author P. Schuyler Miller wrote in his review in Astounding:

Surely everyone who has done any science fiction has dreamed of writing a realistic story of the first men on another world, worked out with an absolute minimum of hokum—no green princesses, no ruins of alien civilizations, no hostile high priests. The ultimate would be the story of the first men on the Moon—a world without air, without life, or the possibility of life.

John W. Campbell, Jr.

And that’s exactly what Campbell gives us here. The Moon is Hell is told in the form of a journal kept by Dr. Thomas Ridgley Duncan, a physicist and second in command of the first mission to the dark side of the Moon. After the expedition’s relief ship crashes on landing, the astronauts are left stranded with no way to contact Earth; a steadily diminishing supply of food, air, and water; and the knowledge that it will be months before anyone back home realizes that they need to be rescued. They set to work with admirable discipline to obtain the necessities of life from the rocks around them, extracting hydrogen and oxygen from gypsum, developing new techniques for synthesizing nutrients, building generators and engines, turning the starch in their clothes and books into bread, and finally digging out an entire settlement underground, complete with a library and swimming pool. (Much of the plot anticipates The Martian in its determination to science the shit out of the situation.) The diary format allows Campbell to deliver all of this material unencumbered by any interruptions: long sections of it read like a briefing or an extract from a textbook. It’s a novel written by a chemist for other chemists, posing a series of ingenious scientific problems and solutions, and it has enough good ideas to fuel a dozen hard science fiction stories. Reading it, I was reminded of the joke title of the book on which the three protagonists are working in Foucault’s Pendulum: The Wonderful Adventure of Metals. Because although there are no recognizable characters in sight, this is a calculated choice—the real hero is chemistry itself.

The result, to be honest, can be pretty hard going, and although it gets better toward the end, the pages don’t exactly fly by. I found myself admiring each paragraph while vaguely dreading the next: it’s a relatively short novel, but it seems very long. (In its original edition, it was published together with The Elder Gods, a story that Campbell wrote on assignment for Unknown—its original author, Arthur J. Burks, had failed to deliver a publishable manuscript—that provides a much more engaging display of his talents.) But it’s also exactly the novel that Campbell wanted to publish. It provides as perfect a summation as you could want of its author’s strengths and limitations, as well as those of hard science fiction as a whole. This isn’t a narrative about individuals, but about the scientific method itself, and it succeeds in some respects in his goal of telling a story about a culture: it’s implied that the stranded astronauts are laying the foundations for a permanent presence in space. And although it doesn’t work as a novel by any conventional standard, it’s indispensable as a sort of baseline. It’s as if Campbell decided to stake out the limits of hard science fiction as an example to his readers and writers: this is a novel that nobody ought to imitate, but which provides an essential reference point by which all efforts in that vein can be judged. And it’s no accident that it was published at a moment when Campbell was about to push into dianetics, psionics, and fringe science, as if he had already gone as far in the other direction as he possibly could. As Emerson said of Shakespeare, Campbell wanted to plant the standard of humanity “some furlongs forward into chaos,” but first, he had to give us an ideal of order, even if it was hell to read.

Quote of the Day


John Haines

[An] insight may come as a sudden revelation, but it is more likely to be achieved slowly as a result of simply living and responding to things in the world, of reading and thinking, and of the daily work of writing poems and thinking about writing poems. After a while, if one is lucky, some pattern emerges; the substance of one’s efforts begins to be clear. The process is all one piece, one way of being. And the value of the idea, once it is formed, is that it furnishes the means by which, in a world lacking unity, all that the poet sees and feels has meaning. Things fall into place.

John Haines, “The Hole in the Bucket”

Written by nevalalee

June 22, 2016 at 7:30 am

The air of unreality


Shiri Appleby and Constance Zimmer on UnREAL

I’ve often said that a work of art is secretly about the process of its own creation, and that seems especially true of the Lifetime series UnREAL. Reviewing its uneven but compelling first season, which followed a pair of ruthless reality show producers as they manipulated their contestants, their coworkers, and themselves, I wrote:

UnREAL isn’t without its problems, which grow increasingly evident as the season progresses…The love triangle between Rachel, Adam, and her hunky bore of an ex-boyfriend Jeremy never settles into anything more than a gimmick…The plotting is a sometimes uneasy mix of cynicism, soap opera, and narrative convenience…By making [its fictional reality series] into a kind of perfect storm of worst-case scenarios, the show holds our attention for the short term, but it ends up making the entire season less interesting: we don’t want life and death, but the small betrayals and reversals that underlie the shows we take for granted.

I concluded: “At its best, this is a remarkably assured series, with its two halves vibrating against each other in ways that can make you tingle with excitement. But the more it cranks up the drama, the less it implicates us, and it all ends up feeling safely unreal.” And I was especially curious to see how it would handle the transition to its second season.

Having watched the first couple of episodes of its current run, I’m still not sure. But I have the feeling that the show’s co-creator, Sarah Gertrude Shapiro, would agree with many of the criticisms I mentioned above. Here are a few excerpts from the remarkably candid profile of Shapiro by D.T. Max that was published last week in The New Yorker:

Executives at Lifetime offered to buy the idea [of UnREAL] immediately. Afterward, Shapiro had second thoughts worthy of a victorious Bachelor contestant: “I was calling 411, asking, ‘Do you have the main number for HBO?’” She couldn’t reach any executives there—this is her story, anyway—and she proceeded with Lifetime…

The studio also asked the writers to expand the role of Jeremy…He fit the aesthetic of Lifetime movies but was not Shapiro’s type…Jeremy, she told me, was “conceived as a one-season character.” Later, she e-mailed me: “I could not get on board with the idea of Jeremy being Rachel’s ‘Mr. Big’ (which was brought up).” Still, the studio had pushed for Josh Kelly to return. “They can ask you to do it, but they can’t make you,” she told me. Like Rachel, Shapiro frequently has to decide whether she is a bomb-thrower or an inside player with misgivings. In this case, she decided to play nice.

Which all leads up to a vivid moment when Carol Barbee, the showrunner, enters the writers’ room and says: “Come on. Let’s put on our big-boy pants and make a story for Jeremy.”

Sarah Gertrude Shapiro

Reading this, I found myself wondering how Josh Kelly, the actor who plays Jeremy, would respond—or the executives at Lifetime itself. (Elsewhere in the article, Shapiro says of Kelly: “All I can say is we employ a veteran, and he’s a good person.” She continues: “Integrating Jeremy was a small price to pay for having a black bachelor and letting Quinn and Rachel go all the way to darkness.”) Every television show, it seems safe to assume, is the product of similar compromises, but it’s rare to see them discussed in public for a series that hasn’t even aired two full seasons yet, and which hasn’t exactly been an invulnerable ratings juggernaut. A hint of backstage conflict doesn’t necessarily tarnish the brand of UnREAL, which is explicitly about the tussles behind the scenes of a troubled series, and if anything, it adds an intriguing layer of subtext. Shapiro says of Rachel, her fictional alter ego: “It’s really about ‘I’m savvy enough and smart enough that I know I have to give the network all the frosting and the froufrou and all the titties that they need, and in the process I’m going to slip them this super-important thing.’” Yet if I were Shapiro, I’d be a little uncomfortable with how the article portrayed my relationship with the collaborators who have enabled this show to exist. This includes co-creator Marti Noxon, who says of her partnership with Shapiro: “I don’t think I’ve had as contentious and fruitful a collaboration since I worked with Matt Weiner on Mad Men.”

That quote, in itself, is a small masterpiece of spin, pairing “contentious” with “fruitful” to imply that one leads to the other, and cleverly dropping the name of the one show that ought to silence any concerns we might have about disquiet on the set. But the comparison also works against the series itself. Matthew Weiner, a notorious perfectionist, had contentious interactions with his cast, his crew, and his network, but the result was a show that was staggeringly consistent in tone and quality. You can’t say this about UnREAL, in which the strain of its competing forces is clearly visible: the new season, especially, has struggled to top the delicious toxicity of its debut while keeping the plot wheels turning, and it sometimes verges on shrill. Thanks to the glimpse that we’ve been given of its travails, I’ll be watching the series with even greater interest than before—although I also run the risk of excusing its flaws because of what we now know about its internal tensions. Such justifications are tempting, but flimsy. Every television show in history has suffered from conflict among its collaborators, network interference, competing incentives, and characters whom the show’s writing staff would prefer to forget. When a series is working, you don’t see any of it, as you so often do with UnREAL. Shapiro knows as well as anyone how much of television is an illusion, and most of the fun of this show lies in how it picks the medium apart. But the result would be even more persuasive if it were better about creating those illusions on its own.

Written by nevalalee

June 21, 2016 at 9:16 am

Quote of the Day


Wallace Stevens

[The poem] is what I wanted it to be without knowing before it was written what I wanted it to be, even though I knew before it was written what I wanted to do.

Wallace Stevens, on the poem “The Old Woman and the Statue”

Written by nevalalee

June 21, 2016 at 7:30 am

The great scene theory


The Coronation of Napoleon by Jacques-Louis David

“The history of the world is but the biography of great men,” Thomas Carlyle once wrote, and although this statement was criticized almost at once, it accurately captures the way many of us continue to think about historical events, both large and small. There’s something inherently appealing about the idea that certain exceptional personalities—Alexander the Great, Julius Caesar, Napoleon—can seize and turn the temper of their time, and we see it today in attempts to explain, say, the personal computing revolution through the life of someone like Steve Jobs. The alternate view, which was expressed forcefully by Herbert Spencer, is that history is the outcome of impersonal social and economic forces, in which a single man or woman can do little more than catalyze trends that are already there. If Napoleon had never lived, the theory goes, someone very much like him would have taken his place. It’s safe to say that any reasonable view of history has to take both theories into account: Napoleon was extraordinary in ways that can’t be fully explained by his environment, even if he was inseparably a part of it. But it’s also worth remembering that much of our fascination with such individuals arises from our craving for narrative structures, which demand a clear hero or villain. (The major exception, interestingly, is science fiction, in which the “protagonist” is often humanity as a whole. And the transition from the hard science fiction of the golden age to messianic stories like Dune, in which the great man reasserts himself with a vengeance, is a critical turning point in the genre’s development.)

You can see a similar divide in storytelling, too. One school of thought implicitly assumes that a story is a delivery system for great scenes, with the rest of the plot serving as a scaffold to enable a handful of awesome moments. Another approach sees a narrative as a series of small, carefully chosen details designed to create an emotional effect greater than the sum of its parts. When it comes to the former strategy, it’s hard to think of a better example than Game of Thrones, a television series that often seems to be marking time between high points: it can test a viewer’s patience, but to the extent that it works, it’s because it constantly promises a big payoff around the corner, and we can expect two or three transcendent set pieces per season. Mad Men took the opposite tack: it was made up of countless tiny but riveting choices that gained power from their cumulative impact. Like the theories of history I mentioned above, neither type of storytelling is necessarily correct or complete in itself, and you’ll find plenty of exceptions, even in works that seem to fall clearly into one category or the other. It certainly doesn’t mean that one kind of story is “better” than the other. But it provides a useful way to structure our thinking, especially when we consider how subtly one theory shades into the other in practice. The director Howard Hawks famously said that a good movie consisted of three great scenes and no bad scenes, which seems like a vote for the Game of Thrones model. Yet a great scene doesn’t exist in isolation, and the closer we look at stories that work, the more important those nonexistent “bad scenes” start to become.

Leo Tolstoy

I got to thinking about this last week, shortly after I completed the series about my alternative movie canon. Looking back at those posts, I noticed that I singled out three of these movies—The Night of the Hunter, The Limey, and Down with Love—for the sake of one memorable scene. But these scenes also depend in tangible ways on their surrounding material. The river sequence in The Night of the Hunter comes out of nowhere, but it’s also the culmination of a language of dreams that the rest of the movie has established. Terence Stamp’s unseen revenge in The Limey works only because we’ve been prepared for it by a slow buildup that lasts for more than twenty minutes. And Renée Zellweger’s confessional speech in Down with Love is striking largely because of how different it is from the movie around it: the rest of the film is relentlessly active, colorful, and noisy, and her long, unbroken take stands out for how emphatically it presses the pause button. None of the scenes would play as well out of context, and it’s easy to imagine a version of each movie in which they didn’t work at all. We remember them, but only because of the less showy creative decisions that have already been made. And at a time when movies seem more obsessed than ever with “trailer moments” that can be spliced into a highlight reel, it’s important to honor the kind of unobtrusive craft required to make a movie with no bad scenes. (A plot that consists of nothing but high points can be exhausting, and a good story both delivers on the obvious payoffs and maintains our interest in the scenes when nothing much seems to be happening.)

Not surprisingly, writers have spent a lot of time thinking about these issues, and it’s noteworthy that one of the most instructive examples comes from Leo Tolstoy. War and Peace is nothing less than an extended criticism of the great man theory of history: Tolstoy brings Napoleon onto the scene expressly to emphasize how insignificant he actually is, and the novel concludes with a lengthy epilogue in which the author lays out his objections to how history is normally understood. History, he argues, is a pattern that emerges from countless unobservable human actions, like the sum of infinitesimals in calculus, and because we can’t see the components in isolation, we have to content ourselves with figuring out the laws of their behavior in the aggregate. But of course, this also describes Tolstoy’s strategy as a writer: we remember the big set pieces in War and Peace and Anna Karenina, but they emerge from the diligent, seemingly impersonal collation of thousands of tiny details, recorded with what seems like a minimum of authorial interference. (As Victor Shklovsky writes: “[Tolstoy] describes the object as if he were seeing it for the first time, an event as if it were happening for the first time.”) And the awesome moments in his novels gain their power from the fact that they arise, as if by historical inevitability, from the details that came before them. Anna Karenina was still alive at the end of the first draft, and it took her author a long time to reconcile himself to the tragic climax toward which his story was driving him. Tolstoy had good reason to believe that great scenes, like great men, are the product of invisible forces. But it took a great writer to see this.

Quote of the Day


Louis Simpson

It may actually do more harm than good [to urge] the writer to strain his powers of invention. Rather than try to work himself up to a pitch of imagination, the poet would do well to discover what is there, in the subject. Let him immerse himself in the scene and wait for something to happen…the right, true thing.

Louis Simpson, “Reflections on Narrative Poetry”

Written by nevalalee

June 20, 2016 at 7:30 am
