Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘James Joyce’

The knock at the door

Once or twice [James Joyce] dictated a bit of Finnegans Wake to [Samuel] Beckett, though dictation did not work very well for him; in the middle of one such session there was a knock at the door which Beckett didn’t hear. Joyce said, “Come in,” and Beckett wrote it down. Afterwards he read back what he had written and Joyce said, “What’s that ‘Come in?’” “Yes, you said that,” said Beckett. Joyce thought for a moment, then said, “Let it stand.” He was quite willing to accept coincidence as his collaborator.

Richard Ellmann, James Joyce

Written by nevalalee

April 1, 2017 at 7:30 am

The cyborg on the page

Locutus of Borg

In an excellent anthology of his short stories, the author Joe Haldeman describes an exercise that he used to give to his students at M.I.T., where he taught a course on science fiction for many years. Reading it, I found myself wishing—for just about the first time ever—that I could have taken that class. Here’s what Haldeman says:

For this assignment, I gave each student a random number between 8 and 188, which corresponded to page numbers in the excellent sourcebook The Science in Science Fiction, by Peter Nicholls, with David Langford and Brian Stableford. They had to come up with a story using that scientific device or principle. I further restricted them by saying they had to use a story structure from one of the stories in our textbook The Science Fiction Hall of Fame, edited by Robert Silverberg. The point of the assignment was partly to demonstrate that art thrives under restrictions. (It was also to give them a starting point; many had never written fiction before, and a blank page or screen is a terrible thing.)

Haldeman notes that he always does his own assignments, at least to demonstrate the concept for a couple of pages, and that in this case, he was given the word “cyborg” and the structure of Daniel Keyes’s “Flowers for Algernon.” The result was a solid short story, “More Than the Sum of His Parts,” which was later published in Playboy.

Not surprisingly, I love this idea, for reasons that longtime readers of this blog will probably be able to guess. Constraints, as Haldeman observes, are where fiction flourishes. This is partly because of the aforementioned tyranny of the blank page: any starting point, even a totally random one, is better than nothing at all, and a premise that is generated by chance can be more stimulating than one of great personal significance. (When you’re trying to write about something important to you, you’re often too intimidated by the possibilities to start, while it’s easy to get started on a premise that has been handed to you for free. As Trump might put it, what have you got to lose?) There’s also the fact that a kind of synergy results when you pair a story structure with a concept: the dialogue between form and content yields ideas that neither one could have generated in isolation. Nearly every story I’ve ever written has resulted from a pairing of two or more notions, and I’ve developed a fairly reliable intuition about which combinations will be the most fruitful. But I haven’t really experimented with structure in the same way, which is why this exercise is so useful. When I brought it up with Haldeman, he said that the assignment is designed to make students think of form as a tool—or a toy—that can be explored and enjoyed independently of plot, which is a point subtle enough that a lot of writers, including me, never get around to playing with it. But when I take my scheduled break in a couple of months to work out a new story, I’m going to give it a try.
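(For what it’s worth, the random half of the exercise is easy enough to automate. What follows is a minimal sketch, in Python, of how you might generate such a prompt for yourself; the page range comes from Haldeman’s description above, while the function name and the list of story structures are purely hypothetical stand-ins for whatever sourcebook and anthology of models you happen to be working from.)

```python
import random

# A rough sketch of the random-pairing idea behind Haldeman's exercise, not his
# actual procedure. The page range (8 to 188) comes from his description of
# The Science in Science Fiction; the structures below are hypothetical
# stand-ins for models you might pull from an anthology of stories.
STORY_STRUCTURES = [
    "diary entries that track a transformation (as in 'Flowers for Algernon')",
    "a frame story told by a survivor to a skeptical listener",
    "strict chronological third person, limited to a single point of view",
]

def writing_prompt(first_page=8, last_page=188):
    """Pair a random sourcebook page with a randomly chosen story structure."""
    page = random.randint(first_page, last_page)
    structure = random.choice(STORY_STRUCTURES)
    return page, structure

if __name__ == "__main__":
    page, structure = writing_prompt()
    print(f"Scientific premise: whatever appears on page {page} of the sourcebook")
    print(f"Story structure: {structure}")
```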

Schema for Joyce's Ulysses

It’s revealing, too, that the story that Haldeman uses to illustrate his point is about a cyborg, since that’s what we’re really talking about here—a mixture of artificial and organic parts that theoretically forms a single viable organism. (In the actual story, it doesn’t turn out well.) Sometimes you start with a few components from off the shelf, or an assortment of discrete pieces of information, and once you start to combine them, they knit themselves together with newly grown tissue. In other cases, you begin with something more natural, like the chain of logical events that follow from a dramatic situation, and then add parts as needed. And incorporating a bit of randomness at an early stage results in solutions that never would have occurred to you otherwise. There’s a famous design exercise in which students are told to draw the human body in a state of movement, and then to construct an apparatus that will support the body in that position. At the end, the teacher points out that they’ve been designing furniture. That’s how writing works, too. Writers are frequently drawn to metaphors from carpentry, as when Gabriel García Márquez compares writing to making a table, or when José Saramago says that any chair he makes has to have four stable feet. But the result is more interesting when you don’t think in terms of making a table or a chair, but of creating a support system that will hold up the bodies you’ve set in motion. A cyborg carries his essential furniture with him at all times, stripped down to its purest functional form. And that’s also true of a story.

If every story is a cyborg, there’s also a range of approaches to how visible the parts should be. Some wear their artificial components openly, like Locutus of Borg, so that the result is a style in itself, while others keep their enhancements hidden. A book like Joyce’s Ulysses, with its endless experiments and pastiches in form, looks like a manufacturer’s catalog, or a fashion spread in which the same handful of models show off various possible outfits. I don’t recall offhand if Joyce assigned the various epic episodes, literary styles, and symbols to the chapters of Ulysses at random, but I’d like to believe that he did, simply because it’s such a pragmatic tool: “Let the bridge blow up,” Joyce once said, “provided I have got my troops across.” Sometimes the writer takes pleasure in making the joints between the pieces as invisible as possible, and sometimes it’s more fun to play up the artifice, or even to encourage the reader to spot the references—although a little of this goes a long way. It’s a matter of taste, which is another reason why the use of randomness at an early stage can be a good thing: the more detached you are from the big conceptual blocks of the plot, the more likely you are to make the right decisions when it comes to the details. If you’re the kind of writer who wants to crank out a story a week for a year, as Ray Bradbury once advised, Haldeman’s exercise is invaluable. (As Bradbury says: “I dare any young writer to write fifty-two stories that are all bad.”) I wouldn’t want to take the same approach for every story, since there comes a point at which the author himself starts to resemble a machine. But when used wisely, it’s a nice reminder that every story is more than the sum of its parts.

Written by nevalalee

August 25, 2016 at 8:56 am

Quote of the Day

James Joyce

I am really one of the greatest engineers, if not the greatest, in the world besides being a musicmaker, philosophist and heaps of other things. All the engines I know are wrong. Simplicity. I am making an engine with only one wheel. No spokes of course. The wheel is a perfect square.

James Joyce, in a letter to Harriet Shaw Weaver

Written by nevalalee

August 11, 2015 at 6:59 am

Introduction to finality

Community

Yesterday, while I was out of the house, my wife asked our daughter: “Do you want your veggie snacks in a bowl?” My daughter, who is two years old, replied: “Sure!” I wasn’t there to see it, but when I got back, I was assured by all involved that it was hilarious. Then, this morning, in response to another question, my daughter said: “Sure!” Without thinking twice, I said: “That’s a good callback, honey. Is that your new catchphrase?” Which made me realize how often I talk about myself and those around me as if we were on a television show. We’ve always used terms from art and literature to describe the structure of our own lives: when we talk about “starting a new chapter” or “turning the page,” we’re implicitly comparing ourselves to the characters in novels. Even a phrase like “midlife crisis” is indebted, almost without knowing it, to the language of literary criticism. It was perhaps inevitable, then, that we’d also appropriate the grammar of television, which is the art form that has the most in common with the way our lives tend to unfold. When I moved to New York after college, I thought of myself as the star of a spinoff featuring a breakout character from the original series, a supporting player who ended up being my roommate. And a friend once told me that he felt that the show jumped the shark after I got a job in finance. (He wasn’t wrong.)

Which goes a long way toward explaining why Community has exercised such a hold over the imaginations of its viewers. For the past six years, it hasn’t always been the funniest sitcom around, or the most consistent, or even the most creative. But it’s the show that thought most urgently about the ways in which we use television to understand ourselves. For most of the show’s run, these themes centered on the character of Abed, but as last night’s season finale—which feels an awful lot like it ought to be the last episode of the entire series—clearly demonstrated, their real impact was on Jeff. Community could sometimes be understood as a dialogue between Abed and Jeff, with one insisting on seeing events in terms of narrative conventions while the other brought him down to earth, but in the end, Jeff comes to see these tropes as a way of making sense of his own feelings of loss. We’re aware, of course, that these people are characters on a television series, which is why Abed’s commentary on the action was often dismissed as a winking nod to the audience. But it wouldn’t be so powerful, so compelling, and ultimately so moving if we didn’t also sense that seeing ourselves through that lens, at least occasionally, is as sane a way as any of giving a shape to the shapelessness of our lives.

Danny Pudi on Community

Community may not endure as a lasting work of our culture—it’s more than enough that it was a great sitcom about half the time—but it’s part of a long tradition of stories that offer us metaphors drawn from their own artistic devices. Most beautifully, we have Shakespeare’s seven ages of man, which are framed as acts in a play. (Shakespeare returned to such images with a regularity that implies that he saw such comparisons as more than figures of speech: “Life’s but a walking shadow, a poor player / That struts and frets his hour upon the stage…” “These our actors, / As I foretold you, were all spirits and / Are melted into air, into thin air…”) Joyce used The Odyssey to provide a framework for his characters’ lives, as Proust did, more subtly, with The Thousand and One Nights. These are all strategies for structuring a literary work, but we wouldn’t respond to them so profoundly if they didn’t also reflect how we felt about ourselves. You could even say that fiction takes the form of a shapely sequence of causal events, at least in the western tradition, because we see our lives in much the same way. When you stand back, everyone’s life looks more or less the same, even as they differ in the details, and as we grow older, we see how much we’re only repeating patterns that others before us have laid down.

This might seem like a lot of pressure to place on a show that included a self-conscious fart joke—with repeated callbacks—in its final episode. But it’s the only way I can explain why Community ended up meaning more to me than any other sitcom since the golden age of The Simpsons. The latter show also ended up defining our lives as completely as any work of art can, mostly because its sheer density and longevity allowed it to provide a reference point to every conceivable situation. Community took a clever, almost Borgesian shortcut by explicitly making itself its own subject, and on some weird level, it benefited from the cast changes, creator firings, cancellations, and unexpected revivals that put its viewers through the wringer almost from the start. It was a show that was unable to take anything for granted, no more than any of us can, and even if it sometimes strained to keep itself going through its many incarnations, it felt like a message to those of us who struggle to impose a similar order on our own lives. Life, like a television show on the brink, has to deal with complications that weren’t part of the plan. If those ups and downs pushed Community into darker and stranger places, it’s a reminder that life gains much of its meaning, not from our conscious intentions, but as an emergent property of the compromises we’re forced to make. And like any television show, it’s defined largely by the fact that it ends.

Written by nevalalee

June 3, 2015 at 9:33 am

The Ive Mind

Jonathan Ive and Steve Jobs

Like many readers, I spent much of yesterday working my way through Ian Parker’s massive New Yorker profile of Apple designer Jonathan Ive. Over the years, we’ve seen plenty of extended feature pieces on Ive, who somehow manages to preserve his reputation as an intensely private man, but it feels like Parker set out to write the one to end them all: it’s well over fifteen thousand words long, and there were times, as I watched my progress creeping slowly by in the scroll bar, when I felt like I was navigating an infinite loop of my own. (It also closes in that abrupt New Yorker way that takes apparent pride in ending articles at the most arbitrary place possible, as if the writer had suffered a stroke before finishing the last paragraph.) Still, it’s a fine piece, crammed with insights, and I expect that I’ll read it again. I’ve become slightly less enamored of Apple ever since my latest MacBook started to disintegrate a few months after I bought it—by which I mean its screws popped out one by one and its plastic casing began to bubble alarmingly outward—but there’s no doubting Ive’s vision, intelligence, and ability to articulate his ideas.

Like a lot of Apple coverage, Parker’s article builds on the company’s mythology while making occasional stabs at deflating it, with paragraphs of almost pornographic praise alternating with a skeptical sentence or two. (“I noticed that, at this moment in the history of personal technology, Cook still uses notifications in the form of a young woman appearing silently from nowhere to hold a sheet of paper in his line of sight.”) And he’s best not so much at talking about Apple’s culture as at talking about how they talk about it. Here’s my favorite part:

[Ive] linked the studio’s work to NASA’s: like the Apollo program, the creation of Apple products required “invention after invention after invention that you would never be conscious of, but that was necessary to do something that was new.” It was a tic that I came to recognize: self-promotion driven by fear that one’s self-effacement might be taken too literally. Even as Apple objects strive for effortlessness, there’s clearly a hope that the effort required—the “huge degree of care,” the years of investigations into new materials, the months spent enforcing cutting paths in Asian factories—will be acknowledged.

Early patent sketches for Apple handheld device

I love this because it neatly encapsulates the neurosis at the heart of so much creative work, from fiction to industrial design. We’re constantly told that we ought to strive for simplicity, and that the finished product, to use one of Ive’s favorite terms, should seem “inevitable.” Yet we’re also anxious that the purity of the result not be confused with the ease of its creation. Writers want readers to accept a novel as a window onto reality while simultaneously noticing the thousands of individual choices and acts of will that went into fashioning it, which is inherently impossible. And it kills us. Writing a novel is a backbreaking process that wants to look as simple as life, and that contradiction goes a long way toward explaining why authors never feel as if they’ve received enough love: the terms of the game that they’ve chosen ensure that most of their work remains invisible. Novels, even mediocre ones, consist of “invention after invention after invention,” a daunting series, as T.H. White noted, of “nouns, verbs, prepositions, adjectives, reeling across the page.” And even when a story all but begs us to admire the brilliance of its construction, we’ll never see more than a fraction of the labor it required.

So what’s a creative artist to do? Well, we can talk endlessly about process, as Ive does, and dream of having a profile in The Paris Review, complete with images of our discarded drafts. Or we can push complexity to the forefront, knowing at least that it will be acknowledged, even if it goes against what we secretly believe about the inevitability of great art. (“The artist, like the God of creation, remains within or behind or beyond or above his handiwork, invisible, refined out of existence, indifferent, paring his fingernails,” James Joyce writes, and yet few other authors have been so insistent that we recognize his choices, even within individual words.) Or, if all else fails, we can rail against critics who seem insufficiently appreciative of how much work is required to make something feel obvious, or who focus on some trivial point while ignoring the agonies that went into a story’s foundations. None of which, of course, prevents us from taking the exact same attitude toward works of art made by others. Ultimately, the only solution is to learn to live with your private store of effort, uncertainty, and compromise, never advertising it or pointing to all your hard work as an excuse when it falls short. Because in the end, the result has to stand on its own, even if it’s the apple of your eye.

Written by nevalalee

February 18, 2015 at 9:50 am

Beethoven, Freud, and the mystery of genius

Beethoven

“The joy of listening to Beethoven is comparable to the pleasure of reading Joyce,” writes Alex Ross in a recent issue of The New Yorker: “The most paranoid, overdetermined interpretation is probably the correct one.” Even as someone whose ear for classical music is underdeveloped compared to his interest in other forms of art, I have to agree. Great artists come in all shapes and sizes, but the rarest of all is the kind whose work can sustain the most meticulous level of scrutiny because we’re aware that every detail is a conscious choice. When we interpret an ordinary book or a poem, our readings are often more a reflection of our own needs than the author’s intentions; even with a writer like Shakespeare, it’s hard to separate the author’s deliberate decisions from the resonances that naturally emerge from so much rich language set into motion. With Beethoven, Joyce, and a handful of others—Dante, Bach, perhaps Nabokov—we have enough information about the creative process to know that little, if anything, has happened by accident. Joyce explicitly designed his work to “keep professors busy for centuries,” and Beethoven composed for a perfect, omniscient audience that he seemed to will into existence.

Or as Colin Wilson puts it: “The message of the symphonies of Beethoven could be summarized: ‘Man is not small; he is just bloody lazy.'” When you read Ross’s perceptive article, which reviews much of the recent scholarship on Beethoven and his life, you’re confronted by the same tension that underlies any great body of work made within historical memory. On the one hand, Beethoven has undergone a kind of artistic deification, and there’s a tradition, dating back to E.T.A. Hoffmann, that there are ideas and emotions being expressed in his music that can’t be matched by any other human production; on the other, there’s the fact that Beethoven was a man like any other, with a messy personal life and his own portion of pettiness, neediness, and doubt. As Ross points out, before Beethoven, critics were accustomed to talk of “genius” as a kind of impersonal quality, but afterward, the concept shifted to that of “a genius,” which changes the terms of the conversation without reducing its underlying mystery. Beethoven’s biography provides tantalizing clues about the origins of his singular greatness—particularly his deafness, which critics tend to associate with his retreat to an isolated, visionary plane—but it leaves us with as many questions as before.

Sigmund Freud

As it happens, I read Ross’s article in parallel with Howard Markel’s An Anatomy of Addiction, which focuses on the early career of another famous resident of Vienna. Freud seems to have been relatively indifferent to music: he mentions Beethoven along with Goethe and Leonardo da Vinci as “great men” who have produced “splendid creations,” although this feels more like a rhetorical way of filling out a trio than an expression of true appreciation. Otherwise, his relative silence on the subject is revealing in itself: if he wanted to interpret an artist’s work in psychoanalytic terms, Beethoven’s life would have afforded plenty of material, and he didn’t shy from doing the same for Leonardo and Shakespeare. It’s possible that Freud avoided Beethoven because of the same godlike intentionality that makes him so fascinating to listeners and critics. If we’ve gotten into the habit of drawing a distinction between what a creative artist intends and his or her unconscious impulses, it’s largely thanks to Freud himself. Beethoven stands as a repudiation of, or at least a strong counterexample to, this approach: however complicated Beethoven may have been as a man, it’s hard to make a case that there was ever a moment when he didn’t know what he was doing.

This may be why Freud’s genius—which was very real—seems less mysterious than Beethoven’s: we know more about Freud’s inner life than just about any other major intellectual, thanks primarily to his own accounts of his dreams and fantasies, and it’s easy to draw a line from his biography to his work. Markel, for instance, focuses on the period of Freud’s cocaine use, and although he stops short of suggesting that all of psychoanalysis can be understood as a product of addiction, as others have, he points out that Freud’s early publications on cocaine represent the first time he publicly mined his own experiences for insight. But of course, there were plenty of bright young Jewish doctors in Vienna in the late nineteenth century, and while many of the ideas behind analysis were already in the air, it was only in Freud that they found the combination of obsessiveness, ambition, and literary brilliance required for their full expression. Freud may have done his best to complicate our ideas of genius by introducing unconscious factors into the equation, but paradoxically, he made his case in a series of peerlessly crafted books and essays, and their status as imaginative literature has only been enhanced by the decline of analysis as a science. Freud doesn’t explain Freud any more than he explains Beethoven. But this doesn’t stop him, or us, from trying.
