Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Archive for March 2015

Writing on tablecloths


Napkin drawing by Jay Park of Facebook

Writers will jot down ideas on any flat surface that happens to be handy, but one habit that seems to have vanished—along with cloth linens at most homes and restaurants—is that of doodling on tablecloths. I first encountered it as a young reader in Madeleine L’Engle’s A Wind in the Door:

Mrs. Murry looked down at the checked tablecloth, and at the remains of an equation which had not come out in the wash; doodling equations on anything available was a habit of which she could not break her husband.

As a child who had been strictly warned against writing on the furniture, I was struck enough by this detail to never forget it. Yet older novels and books are full of characters writing on tablecloths, and it's mentioned so casually that it seems to have been commonplace, as we see in Jack Olsen's nonfiction classic The Girls in the Office:

He took me to Lüchow’s, and in the course of explaining something to me he began writing on the tablecloth with a pen. I had never seen that in my whole life; I was shocked! I said quietly, “Please don’t write on the tablecloth!” He said, “All businessmen write on tablecloths.” I said, “Well, I’ve never seen it before. Please don’t! It’s very hard to clean ink out of a tablecloth, and somebody will have to do it.”

In fact, you could write an entire history of the role of tablecloths in creative thought, although it appears to have flourished for a relatively short time, after the proliferation of cheap, easily laundered table linens and before their disappearance from most dining tables and restaurants. (It would have gone back at least to the early nineteenth century: legend has it that Schubert began composing his Octet in F Major on the tablecloth in a Viennese café. Much later, the songwriter Matt Dennis wrote the Sinatra standard “Violets for Your Furs” on a tablecloth while watching Billie Holiday perform at Kelly’s Stable in New York.) And it’s no mystery as to why the tablecloth provided such an attractive surface for noodling. With its large size and alluring emptiness, it was the whiteboard of its day, and it was right there at the diner’s elbow to capture any passing insight. Writing on cloth also offers certain tactile pleasures. And there’s the not inconsiderable fact that whatever notes you made weren’t meant to be kept. The tablecloth was a surface for daydreaming in two dimensions, not for writing anything that was supposed to last, and unless you begged or bribed the restaurant to take it with you, what you wrote down would literally come out in the wash.

Back of the envelope

Tablecloths at casual eateries can be hard to find these days, and when they do appear, they’re usually at restaurants that would frown on a customer defacing the linens. Fortunately, they were supplanted just as another convenient surface arrived on the scene: the paper napkin. A napkin, of course, is the proverbial canvas for unstructured thinking, or for rapid estimates that are only designed to guide more systematic calculations to come. (There’s even an entire book, The Back of the Napkin by Dan Roam, devoted to making quick sketches for brainstorming, and although it’s a little too focused on business considerations to be useful for writers, Roam’s initial piece of advice is a good one: “To start, draw a circle and give it a name.”) What’s funny about writing on the back of the napkin is that it doesn’t really have a back: both sides are equally blank. The phrase seems to have arisen by analogy with the back of the envelope, where the distinction between front and back makes more sense, but it’s also appealing in its own right. A napkin is already disposable, and there’s no sense of permanence inhibiting us from thinking freely, but we aren’t even writing on the front of it. Both sides might look the same, but the back is where the real, unencumbered action takes place.

These days, if we don’t have a piece of scrap paper handy, we’re just as capable of taking notes on a tablet or smartphone, but we shouldn’t underestimate the importance of the physical medium. Like a tablecloth, a napkin is an ideal surface for scribbling, especially with a felt marker or Pilot ballpoint pen, since the ink’s viscosity allows it to sink pleasantly into the surface. (Going back even further in time, we discover mentions of notes jotted down on blotting paper, which would have afforded many of the same delights.) And its folds, like those of an envelope, add an appealing sense of three-dimensionality: I’ve found that a blank piece of paper immediately becomes a more attractive medium for notes when it’s simply folded in half. I’ve gotten into the habit of bringing an envelope along whenever I’m going to be out of the house for more than a few minutes, and I’ll often prime it beforehand with a few notes on what I want to be thinking about that day. Sometimes I’ll even write down an Oblique Strategy or two to guide whatever brainstorming takes place. I could do something similar on my phone, or on the business cards I hoard for the same reason, but with its slightly larger area and foldability, an envelope all but begs to be used. Which only means that if you ever have a writer—or a composer—over for dinner, you’d better hide your tablecloths.

Quote of the Day


Written by nevalalee

March 31, 2015 at 7:30 am

Posted in Quote of the Day


The essential strangeness of fairy tales


Rapunzel by Paul O. Zelinsky

Over the last few months, I’ve been telling my daughter a lot of fairy tales. My approach has been largely shaped, for better or worse, by Bruno Bettelheim’s book The Uses of Enchantment: I happened to read it last year as part of an unrelated writing project, but it also contained insights that I felt compelled to put to use almost at once in my own life. Bettelheim is a controversial figure for good reason, and he’s not a writer whose ideas we need to accept at face value, but he makes several points that feel intuitively correct. When it comes to fairy tales, it seems best to tell the oldest versions of each story we have, as refined through countless retellings, rather than a more modern interpretation that hasn’t been as thoroughly tested; and, when possible, it’s preferable to tell them without a book or pictures, which gets closer to the way in which they were originally transmitted. And the results have been really striking. Stories like “Little Red Riding Hood” and “Jack and the Beanstalk” have seized my daughter’s imagination, to the point where we’ll discuss them as if they happened to her personally, and she isn’t fazed by some of their darker aspects. (In “Hansel and Gretel,” when I tell her that the parents wanted to take their children into the woods and leave them there, she’ll cheerfully add: “And kill dem dere!”)

There’s no denying that the traditional versions of these fairy tales contain elements that most contemporary parents find disturbing or inexplicable, like the red-hot shoes that the evil queen in “Snow White” is forced to wear to dance herself to death, or the willingness of the father in “Hansel and Gretel” to abandon his children in the forest. (When my wife told me that she thought that the real villain in that story is the father, I replied: “Actually, I think the real villain is the witch who cooks and eats little kids.”) It’s tempting to tone down the originals a bit, sometimes to the point of insipidity: I recently came across a retelling of “Little Red Riding Hood” in which the wolf doesn’t eat the grandmother at all—she just gets scared and runs off to hide behind the house. But based solely on my own observations, I think it’s a mistake to shy away from the darkness: not, as Bettelheim would have it, for its psychological benefits, which can be hard to pin down anyway, but simply from the perspective of good storytelling. A version of “Little Red Riding Hood” in which the wolf doesn’t eat the grandmother doesn’t just trivialize the wolf, but everybody else involved, and it’s liable to strike both child and parent as equally pointless. And kids who sense that their time is being wasted won’t ask to repeat the experience.

Tangled

And there’s a particularly important point here, which is that the more bizarre or irrational the detail—and the harder it is to extract any clear lesson from it—the more likely it is to have survived for a reason. Plot points that are simply functional or logical, or which serve an obvious didactic purpose, as in “The Boy Who Cried Wolf,” will be retained out of practical considerations; when something arbitrary, grotesque, or even borderline immoral lingers on through countless retellings, it can only be because it gets at something fundamental. It reminds me a little of the criterion of embarrassment in historical criticism, which states that a historical detail that would have seemed embarrassing or strange to its original authors or readers—like the crucifixion—is likely to be authentic, since they wouldn’t have been inclined to invent such an inconvenient fact if they had any choice. (Similarly, in classical philology, when you need to choose between two variants of the same text, the stranger or more unusual form is usually the older one: it’s more probable that a scribal error would smooth out a perceived anomaly to make it more conventional, rather than the other way around.) We may not be able to articulate why these details are there, but the fact that they were selected to survive speaks to their importance and resilience, which it would be foolish to underestimate.

And it can be disorienting to move from the older versions of these stories to their more recent, Disneyfied incarnations. In the version of “Rapunzel” recorded by the Brothers Grimm, for instance, the story is set in motion by the mother’s obsessive desire to eat some of the lettuce from the garden of the sorceress next door. We aren’t told why she wants it so badly; she simply tells her husband that if she can’t have it, she fears that she will die. In Tangled, this detail is rationalized and clearly motivated: the lettuce becomes a flower with magical healing properties, and it’s literally used to save the queen’s life. This version has the benefit of explaining away the weirdness in the original tale, and it’s far more acceptable from a narrative point of view, but it also diminishes its mystery and resonance. I like Tangled a lot, and I’m not going to reject the Disney versions of these stories—which have considerable artistic merits of their own—just because they soften or minimize the darker elements of their sources. (Although I do avoid the Disney storybooks, which follow the plot points by rote while losing most of the appeal of both the movie and the original story.) But it’s worth remembering that each version exists to fulfill a different need, and a child’s inner life ought to have room for both.

Written by nevalalee

March 30, 2015 at 9:07 am

Quote of the Day


Written by nevalalee

March 30, 2015 at 7:30 am

Posted in Quote of the Day


The fog of science


Elliott Sober

A science may fall short of perfect clarity in different ways. One is relatively benign. A science may move forward, sideways, and backward as if in a fog that sometimes lifts a little and then resettles. The lack of visible landmarks makes progress difficult to gauge. Theories proliferate; models come and go. Whether theory replacement means theoretical improvement is often hard to determine. But a science enveloped in fog has at least one consolation. A fog does not foster the illusion of clarity; the lack of visibility is patent.

More insidious than the fog is the mirage. Fogs are seen for what they are. Mirages are trickier, engendering the mistaken conviction that things are as they seem. On the other side of the horizon there are some palm trees. Light reflected off a layer of clouds makes them seem to be where they are not. In “seeing” the mirage, one sees something real. One sees the trees, but they are not where they seem to be.

Mirages, unlike hallucinations, are produced by things outside ourselves; they are not sheer fabrications arising from within. This explains why mirages can be jointly experienced by sensible people. Walking together in the desert, we share the same illusion, and this gives further credence to the idea that we are not making a mistake. Vision is a reliable guide to the location of objects; intersubjective agreement is an additional check against error. Mirages can be doubly hard to detect because they exploit this twofold assurance.

Elliott Sober, The Nature of Selection

Written by nevalalee

March 29, 2015 at 7:30 am

The art of guessing


Jacob Bronowski

How does the outstanding scientist come to propose such a decisive axiom, while less imaginative minds go on tinkering with the old system? How did Gregor Mendel leap to conceive the statistical axioms of genetics? What moved Albert Einstein to make the constancy of the speed of light not a consequence but an axiom in the construction of relativity?

An obvious answer is that the great mind, like the small, experiments with different alternatives, works out their consequences for some distance, and thereupon guesses (much like a chess player) that one move will generate richer possibilities than the rest. But this answer only shifts the question from one foot to the other. It still remains to ask how the great mind comes to guess better than another, and to make leaps that turn out to lead further and deeper than yours or mine.

We do not know; and there is no logical way in which we can know, or can formalize the pregnant decision. The step by which a new axiom is added cannot itself be mechanized. It is a free play of the mind, an invention outside the logical processes. This is the central act of imagination in science, and it is in all respects like any similar act in literature. In this respect, science and literature are alike: in both of them, the mind decides to enrich the system as it stands by an addition which is made by an unmechanical act of free choice.

Jacob Bronowski

Written by nevalalee

March 28, 2015 at 7:30 am

The opening act dilemma


Ronald McDonald

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “Have you ever gone to a concert just for the opener?”

Earlier this week, I described the initial stages of creating a brand, whether commercial or artistic, as a kind of charitable enterprise: you’ve got to be willing to lose money for years to produce anything with a chance of surviving. I was speaking primarily of investors and patrons, but of course, it’s also true of artists themselves. A career in the arts requires an enormous initial investment of time, energy, and money—at least in the form of opportunity cost, as you choose not to pursue more remunerative forms of making a living—and a major factor separating those who succeed from those who don’t is the amount of pain they’re willing to endure. David Mamet famously said that everyone gets a break in show business in twenty-five years: some get it at the beginning, others at the end, and all you can really control is how willing you are to stick around after everyone else has gone home. That’s always been true, but more recently, it’s led to a growing assumption that emerging artists should be willing, even eager, to give work away for free. With media of all kinds being squeezed on both sides by increasing competition and diminishing audiences, there’s greater pressure than ever to find cheap content, and the most reliable source has always been hungry unknowns desperate for any kind of exposure.

And that last word is an insidious one. Everybody wants exposure—who wouldn’t?—but its promise is often used to justify arrangements in which artists are working for nothing, or at a net loss, for companies that aren’t in it for charity. Earlier this month, McDonald’s initially declined to pay the bands scheduled to play at its showcase at South by Southwest, saying instead that the event would be “a great opportunity for additional exposure.” (This took the form of the performers being “featured on screens throughout the event, as well as possibly mentioned on McDonald’s social media accounts.”) When pressed on this, the company replied sadly: “There isn’t a budget for an artist fee.” Ultimately, after an uproar that canceled out whatever positive attention it might have expected, it backtracked and agreed to compensate the artists. And even if this all sort of went nowhere, it serves as a reminder of how craven even the largest corporations can be when it comes to fishing for free content. McDonald’s always seeks out the cheapest labor it can, cynically passing along the hidden human costs to the rest of society, so there’s no reason to expect it to be any different when it comes to music. As Mamet says of movie producers, whenever someone talks to you about “exposure,” what they’re really saying is: “Let me take that cow to the fair for you, son.”

Ronald McDonald

That said, you can’t blame McDonald’s for seizing an opportunity when it saw one. If there are two groups of artists who have always been willing to work for free, it’s writers and musicians, and it’s a situation that has been all but institutionalized by how the industries themselves are structured. A few months ago, Billboard published a sobering breakdown of the costs of touring for various tiers of performers. For a headliner like Lady Gaga or Katy Perry, an arena performance can net something like $300,000, and even after the costs of production, crew, and transportation are deducted, it’s a profitable endeavor. But an opening act gets paid a flat fee of $15,000 or so, and when you subtract expenses and divide the rest between members of the band, you’re essentially paying for the privilege of performing. As Jamie Cheek, an entertainment business manager, is quoted as saying: “If you get signed to a major label, you’re going to make less money for the next two or three years than you’ve ever made in your life.” And it remains a gamble for everyone except the label itself. Over the years, I’ve seen countless opening acts, but I’d have trouble remembering even one, and it isn’t because they lacked talent. We’re simply less likely to take anything seriously if we haven’t explicitly paid for it.

That’s the opening act dilemma. And it’s worth remembering this if you’re a writer being bombarded with proposals to write for free, even for established publications, for the sake of the great god exposure. For freelancers, it’s created a race to the bottom, as they’re expected to work for less and less just to see their names in print. And we shouldn’t confuse this with the small presses that pay contributors in copies, if at all. These are labors of love, meant for a niche audience of devoted readers, and they’re qualitatively different from commercial sites with an eye on their margins. The best publications will always pay their writers as fairly as they can afford. Circulation for the handful of surviving print science-fiction magazines has been falling for years, for instance, but Analog and Asimov’s recently raised their rate per word by a penny or so. It may not sound like much, but it amounts to a hundred dollars or so per story that they didn’t need to give their authors, most of whom would gladly write for even less. Financially, it’s hard to justify, but as a sign of respect for their contributors, it speaks volumes, even as larger publications relentlessly cut their budgets for freelancers. As painful as it may be, you have to push back, unless you’re content to remain an opening act for the rest of your life. You’re going to lose money anyway, so it may as well be on your own terms. And if someone wants you to work for nothing now, you can’t expect them to pay you later.

Written by nevalalee

March 27, 2015 at 9:22 am
