Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Archive for July 2017

The conveyor belt

For all the endless discussion of various aspects of Twin Peaks, one quality that sometimes feels neglected is the incongruous fact that it had one of the most attractive casts in television history. In that respect—and maybe in that one alone—it was like just about every other series that ever existed. From prestige dramas to reality shows to local newscasts, the story of television has inescapably been that of beautiful men and women on camera. A show like The Hills, which was one of my guilty pleasures, seemed to be consciously trying to see how long it could coast on surface beauty alone, and nearly every series, ambitious or otherwise, has used the attractiveness of its actors as a commercial or artistic strategy. (In one of the commentary tracks on The Simpsons, a producer describes how a network executive might ask indirectly about the looks of the cast of a sitcom: “So how are we doing aesthetically?”) If this seemed even more pronounced on Twin Peaks, it was partially because, like Mad Men, it took its conventionally glamorous actors into dark, unpredictable places, and also because David Lynch had an eye for a certain kind of beauty, both male and female, that was more distinctive than that of the usual soap opera star. He’s continued this trend in the third season, which has been populated so far by such striking presences as Chrysta Bell, Ben Rosenfield, and Madeline Zima, and last night’s episode features an extended, very funny scene between a delighted Gordon Cole and a character played by Bérénice Marlohe, who, with her red lipstick and “très chic” spike heels, might be the platonic ideal of his type.

Lynch isn’t the first director to display a preference for actors, particularly women, with a very specific look—although he’s thankfully never taken it as far as his precursor Alfred Hitchcock did. And the notion that a film or television series can consist of little more than following around two beautiful people with a camera has a long and honorable history. My two favorite movies of my lifetime, Blue Velvet and Chungking Express, both understand this implicitly. It’s fair to say that the second half of the latter film would be far less watchable if it didn’t involve Tony Leung and Faye Wong, two of the most attractive people in the world, and Wong Kar-Wai, like so many filmmakers before him, uses it as a psychological hook to take us into strange, funny, romantic places. Blue Velvet is a much darker work, but it employs a similar lure, with the actors made up to look like illustrations of themselves. In a Time cover story on Lynch from the early nineties, Richard Corliss writes of Kyle MacLachlan’s face: “It is a startling visage, as pure of line as an art deco vase, with soft, all-American features and a comic-book hero’s jutting chin—you could park a Packard on it.” It echoes what Pauline Kael says of Isabella Rossellini in Blue Velvet: “She even has the kind of nostrils that cover artists can represent accurately with two dots.” MacLachlan’s chin and Rossellini’s nose would have caught our attention in any case, but it’s also a matter of lighting and makeup, and Lynch shoots them to emphasize their roots in the pulp tradition, or, more accurately, in the subconscious store of images that we take from those sources. And the casting gets him halfway there.

This leaves us in a peculiar position when it comes to the third season of Twin Peaks, which, both by nature and by design, is about aging. Mark Frost said in an interview: “It’s an exercise in engaging with one of the most powerful themes in all of art, which is the ruthless passage of time…We’re all trapped in time and we’re all going to die. We’re all traveling along this conveyor belt that is relentlessly moving us toward this very certain outcome.” One of the first, unforgettable images from the show’s promotional materials was Kyle MacLachlan’s face, a quarter of a century older, emerging from the darkness into light, and our feelings toward these characters when they were younger inevitably shape the way we regard them now. I felt this strongly in two contrasting scenes from last night’s episode. It offers us our first extended look at Sarah Palmer, played by Grace Zabriskie, who delivers a freakout in a grocery store that reminds us of how much we’ve missed and needed her—it’s one of the most electrifying moments of the season. And we also finally see Audrey Horne again, in a brutally frustrating sequence that feels to me like the first time that the show’s alienating style comes off as a miscalculation, rather than as a considered choice. Audrey isn’t just in a bad place, which we might have expected, but a sad, unpleasant one, with a sham marriage and a monster of a son, and she doesn’t even know the worst of it yet. It would be a hard scene to watch with anyone, but it’s particularly painful when we set it against our first glimpse of Audrey in the original series, when we might have said, along with the Norwegian businessman at the Great Northern Hotel: “Excuse me, is there something wrong, young pretty girl?”

Yet the two scenes aren’t all that dissimilar. Both Sarah and Audrey are deeply damaged characters who could fairly say: “Things can happen. Something happened to me.” And I can only explain away the difference by confessing that I was a little in love in my early teens with Audrey. Using those feelings against us—much as the show resists giving us Dale Cooper again, even as it extravagantly develops everything around him—must have been what Lynch and Frost had in mind. And it isn’t the first time that this series has toyed with our emotions about beauty and death. The original dream girl of Twin Peaks, after all, was Laura Palmer herself, as captured in two of its most indelible images: Laura’s prom photo, and her body wrapped in plastic. (Sheryl Lee, like January Jones in Mad Men, was originally cast for her look, and only later did anyone try to find out whether or not she could act.) The contrast between Laura’s lovely features and her horrifying fate, in death and in the afterlife, was practically the motor on which the show ran. Her face still opens every episode of the revival, dimly visible in the title sequence, but it also ended each installment of the original run, gazing out from behind the prison bars of the closing credits to the strains of “Laura Palmer’s Theme.” In the new season, the episodes generally conclude with whatever dream pop band Lynch feels like showcasing, usually with a few cool women, and I wouldn’t want to give that up. But I also wonder whether we’re missing something when we take away Laura at the end. This season began with Cooper being asked to find her, but she often seems like the last thing on anyone’s mind. Twin Peaks never allowed us to forget her before, because it left us staring at her photograph each week, which was the only time that one of its beautiful faces seemed to be looking back at us.

Quote of the Day

[Science fiction] is, then, a literary genre whose necessary and sufficient conditions are the presence and interaction of estrangement and cognition, and whose main formal device is an imaginative framework alternative to the author’s empirical environment.

Darko Suvin, “Estrangement and Cognition”

Written by nevalalee

July 31, 2017 at 7:30 am

Finding out the lay of the land

I am trying to express an attitude toward the building of very simple models. I don’t think that models…lead directly to prescription for policy or even to detailed diagnosis. But neither are they a game. They are more like reconnaissance exercises. If you want to know what it’s like out there, it’s all right to send two or three fellows in sneakers to find out the lay of the land and whether it will support human life. If it turns out to be worth settling, then that requires an altogether bigger operation. The job of building usable larger-scale econometric models on the basis of whatever analytical insights come from simple models is much more difficult and less glamorous. But it may be what God made graduate students for. Presumably he had something in mind.

Robert M. Solow, Growth Theory

Written by nevalalee

July 30, 2017 at 7:30 am

A blazoned book of language

Individuals recognize in the use of many words an original and a transferred meaning, and good speakers and poets have in all times, now more, now less consciously, refreshed and intensified these transferences, or imitated them. Thus poetic metaphor is an outgrowth of the natural transferences of normal speech. It was a general transition, no doubt, when people spoke of ruffled or of deep or of stormy feelings; this general usage was revived and deepened when, to quote a very well chosen example, Wordsworth wrote:

The gods approve
The depth and not the tumult of the soul.

The usual poetic metaphors, then, are individual creations on the model of the regular linguistic transference. The picturesque saying that “Language is a book of faded metaphors” is exactly the reverse of reality, where poetry is rather a blazoned book of language.

Leonard Bloomfield, An Introduction to the Study of Language

Written by nevalalee

July 29, 2017 at 7:30 am

The fanfic disposition

Yesterday, I mentioned Roxane Gay’s insightful opinion piece on the proposed HBO series Confederate, which was headlined “I Don’t Want to Watch Slavery Fan Fiction.” I’m still sorting out my own feelings toward this show, an alternate history set in the present day in which the South won the Civil War, but I found myself agreeing with just about everything that Gay writes, particularly when she confesses to her own ambivalence:

As a writer, I never wish to put constraints upon creativity nor do I think anything is off limits to someone simply because of who they are. [Creators] Mr. Benioff and Mr. Weiss are indeed white and they have as much a right to create this reimagining of slavery as anyone. That’s what I’m supposed to say, but it is not at all how I feel.

And I was especially struck by Gay’s comparison of the show’s premise to fanfic. Her essay, which appeared in the New York Times, only uses the phrase “fan fiction” once, linking to a tweet from the critic Pilot Bacon, and while its use in reference to Confederate isn’t literally true—at least not if we define fanfic as a derivative work based on characters or ideas by another author—its connotations are clear. Fairly or not, it encapsulates the notion that David Benioff and D.B. Weiss are appropriating existing images and themes to further their own artistic interests.

Even if we table, for now, the question of whether the criticism is justified, it’s worth looking at the history of the word “fanfic” as a pejorative term. I’ve used it that way here myself, particularly in reference to works of art that amount to authorial wish fulfillment toward the characters, like the epilogue to Ann Patchett’s Bel Canto. (Looking back at my old posts, I see that I even once used it to describe a scene in one of my own novels.) Watching The Hobbit: The Battle of the Five Armies recently with my wife, I commented that certain scenes, like the big fight at Dol Guldur, felt like fanfic, except that Peter Jackson was somehow able to get Cate Blanchett, Ian McKellen, Hugo Weaving, and Christopher Lee to reprise all their old roles. And you often see such comparisons made by critics. Gavia Baker-Whitelaw devoted an entire article on The Daily Dot to the ways in which J.K. Rowling’s Harry Potter and the Cursed Child resembled a work of “badfic,” while Ian Crouch of The New Yorker tried to parse the difference between fanfic and such works as Jean Rhys’s Wide Sargasso Sea:

Fan fiction is surely not a new phenomenon, nor is it an uninteresting one, but it is different in kind and quality from a work like Rhys’s, or, to take a recent example, Cynthia Ozick’s remarkable new novel, Foreign Bodies, which reimagines the particulars of The Ambassadors, by Henry James. Not only do these books interpret texts in the public domain…but they do so with an admirable combination of respect and originality.

As a teenager, I wrote a lot of X-Files fanfic, mostly because I knew that it would give me a readily available audience for the kind of science fiction that I liked, and although I look back on that period in my life with enormous affection—I think about it almost every day—I’m also aware of the limitations that it imposed on my development as a writer. The trouble with fanfic is that it allows you to produce massive amounts of material while systematically avoiding the single hardest element of fiction: the creation of imaginary human beings capable of sustaining our interest and sympathy. It begins in an enviable position, with a cast of characters to which the reader is already emotionally attached. As a result, the writer can easily be left in a state of arrested development, with superb technical skills when it comes to writing about the inner life of existing characters, but little sense of how to do it from scratch. This even holds true when the writer is going back to characters that he or she originally created or realized onscreen. When J.K. Rowling revisits her most famous series or Peter Jackson gives us a fight scene with Elrond and the Ringwraiths, there’s an inescapable sense that all of the heavy lifting took place at an earlier stage. These artists are trading on the affection that we hold toward narrative decisions made years ago, instead of drawing us into the story in the moment. And even when the name on the title page or the director’s credit is the same, readers and viewers can sense when creators are indulging themselves, rather than following the logic of the underlying material.

This all means that fanfic, at its worst, is a code word for a kind of sentimentality, as John Gardner describes it in The Art of Fiction:

If the storyteller appeals to stock response (our love of God or country, our pity for the downtrodden, the presumed warm feelings all decent people have for children and small animals)…then the effect is sentimentality, and no reader who’s experienced the power of real fiction will be pleased by it.

Replace “children and small animals” with Harry Potter and Gandalf, and you have a concise description of how fanfic works, encouraging readers to plow through tens of thousands of words because of the hard work of imaginative empathy that someone else did long ago. When Gay and Bacon compare Confederate to fan fiction, I think that this is what they mean. It isn’t drawing on existing characters, but on a collection of ideas, images, and historical events that carry an overwhelming emotional charge before Benioff and Weiss have written a line. You could argue that countless works of art have done the same thing—the canonical work of Civil War fanfic has got to be Gone With the Wind—but if slavery seems somehow different now, it’s largely because of the timing, as Gay notes: “We do not make art in a vacuum isolated from sociopolitical context. We live in a starkly divided country with a president who is shamefully ill equipped to bridge that divide.” Benioff and Weiss spent years developing their premise, and when they began, they couldn’t have anticipated the environment in which their announcement would be received. I don’t want the project to be canceled, which would have a freezing effect throughout the industry, but they should act as if they’re going to be held to a higher standard. Because they will be.

Quote of the Day

Written by nevalalee

July 28, 2017 at 7:30 am

The driver and the signalman

In his landmark book Design With Nature, the landscape architect Ian L. McHarg shares an anecdote from the work of an English biologist named George Scott Williamson. McHarg, who describes Williamson as “a remarkable man,” mentions him in passing in a discussion of the social aspects of health: “He believed that physical, mental, and social health were unified attributes and that there were aspects of the physical and social environment that were their corollaries.” Before diving more deeply into the subject, however, McHarg offers up an apparently unrelated story that was evidently too interesting to resist:

One of the most endearing stories of this man concerns a discovery made when he was undertaking a study of the signalmen who maintain lonely vigils while operating the switches on British railroads. The question to be studied was whether these lonely custodians were subject to boredom, which would diminish their dependability. It transpired that lonely or not, underpaid or not, these men had a strong sense of responsibility and were entirely dependable. But this was not the major perception. Williamson learned that every single signalman, from London to Glasgow, could identify infallibly the drivers of the great express trains which flashed past their vision at one hundred miles per hour. The drivers were able to express their unique personalities through the unlikely and intractable medium of some thousand tons of moving train, passing in a fraction of a second. The signalmen were perceptive to this momentary expression of the individual, and Williamson perceived the power of the personality.

I hadn’t heard of Williamson before reading this wonderful passage, and all that I know about him is that he was the founder of the Peckham Experiment, an attempt to provide inexpensive health and recreation services to a neighborhood in Southeast London. The story of the signalmen seems to make its first appearance in his book Science, Synthesis, and Sanity: An Inquiry Into the Nature of Living, which he cowrote with his wife and collaborator Innes Hope Pearse. They relate:

Or again, sitting in a railway signal box on a dark night, in the far distance from several miles away came the rumble of the express train from London. “Hallo,” said my friend the signalman. “Forsyth’s driving her—wonder what’s happened to Courtney?” Next morning, on inquiry of the stationmaster at the junction, I found it was true. Courtney had been taken ill suddenly and Forsyth had deputized for him—all unknown, of course, to the signalman who in any case had met neither Forsyth nor Courtney. He knew them only as names on paper and by their “action-pattern” impressed on a dynamic medium—a unique action-pattern transmitted through the rumble of an unseen train. Or, in a listening post with nothing visible in the sky, said the listener: “That’s ‘Lizzie,’ and Crompton’s flying her.” “Lizzie” an airplane, and her pilot imprinting his action-pattern on her course.

And while Williamson and Pearse are mostly interested in the idea of an individual’s “action-pattern” being visible in an unlikely medium, it’s hard not to come away more struck, like McHarg, by the image of the lone signalman, the passing machine, and the transient moment of connection between them.

As I read over this, it occurred to me that it perfectly encapsulated our relationship with a certain kind of pop culture. We’re the signalmen, and the movie or television show is the train. As we sit in our living rooms, lonely and relatively isolated, something passes across our field of vision—an episode of Game of Thrones, say, which often feels like a locomotive to the face. This is the first time that we’ve seen it, but it represents the end result of a process that has unfolded for months or years, as the episode was written, shot, edited, scored, and mixed, with the contributions of hundreds of men and women we wouldn’t be able to name. As we experience it, however, we see the glimmer of another human being’s personality, as expressed through the narrative machine. It isn’t just a matter of the visible choices made on the screen, but of something less definable, a “style” or “voice” or “attitude,” behind which, we think, we can make out the amorphous factors of influence and intent. We identify an artist’s obsessions, hangups, and favorite tricks, and we believe that we can recognize the mark of a distinctive style even when it goes uncredited. Sometimes we have a hunch about what happened on the set that day, or the confluence of studio politics that led to a particular decision, even if we have no way of knowing it firsthand. (This was one of the tics of Pauline Kael’s movie reviews that irritated Renata Adler: “There was also, in relation to filmmaking itself, an increasingly strident knowingness: whatever else you may think about her work, each column seemed more hectoringly to claim, she certainly does know about movies. And often, when the point appeared most knowing, it was factually false.”) We may never know the truth, but it’s enough if a theory seems plausible. And the primary difference between us and the railway signalman is that we can share our observations with everyone in sight.

I’m not saying that these inferences are necessarily incorrect, any more than the signalmen were wrong when they recognized the personal styles of particular drivers. If Williamson’s account is accurate, they were often right. But it’s worth emphasizing that the idea that you can recognize a driver from the passage of a train is no less strange than the notion that we can know something about, say, Christopher Nolan’s personality from Dunkirk. Both are “unlikely and intractable” mediums that serve as force multipliers for individual ability, and in the case of a television show or movie, there are countless unseen variables that complicate our efforts to attribute anything to anyone, much less pick apart the motivations behind specific details. The auteur theory in film represents an attempt to read movies like novels, but as Thomas Schatz pointed out decades ago in his book The Genius of the System, trying to read Casablanca as the handiwork of Michael Curtiz, rather than that of all of its collaborators taken together, is inherently problematic. And this is easy to forget. (I was reminded of this by the recent controversy over David Benioff and D.B. Weiss’s pitch for their Civil War alternate history series Confederate. I agree with the case against it that the critic Roxane Gay presents in her opinion piece for the New York Times, but the fact that we’re closely scrutinizing a few paragraphs for clues about the merits of a show that doesn’t even exist only hints at how fraught the conversation will be after it actually premieres.) There’s a place for informed critical discussion about any work of art, but we’re often drawing conclusions based on the momentary passage of a huge machine before our eyes, and we don’t know much about how it got there or what might be happening inside. Most of us aren’t even signalmen, who are a part of the system itself. We’re trainspotters.

Quote of the Day

If we can abstract pathogenicity and hygiene from our notion of dirt, we are left with the old definition of dirt as matter out of place. This is a very suggestive approach. It implies two conditions: a set of ordered relations and a contravention of that order. Dirt, then, is never a unique, isolated event. Where there is dirt there is system.

Mary Douglas, Purity and Danger

Written by nevalalee

July 27, 2017 at 7:30 am

The Battle of Dunkirk

During my junior year in college, I saw Christopher Nolan’s Memento at the Brattle Theatre in Cambridge, Massachusetts, for no other reason than that I’d heard it was great. Since then, I’ve seen all of Nolan’s movies on their initial release, which is something I can’t say of any other director. At first, it was because I liked his work and his choices intrigued me, and it only occurred to me around the time of The Dark Knight that I was witnessing a career like no other. It’s tempting to compare Nolan to his predecessors, but when you look at his body of work from Memento to Dunkirk, it’s clear that he’s in a category of his own. He’s directed nine theatrical features in seventeen years, all mainstream critical and commercial successes, including some of the biggest movies in recent history. No other director alive comes close to that degree of consistency, at least not at the same level of productivity and scale. Quality and reliability alone aren’t everything, of course, and Nolan pales a bit compared to, say, Steven Spielberg, who over a comparable stretch of time went from The Sugarland Express to Hook, with Jaws, Close Encounters, E.T., and the Indiana Jones trilogy along the way, as well as 1941 and Always. By comparison, Nolan can seem studied, deliberate, and remote, and the pockets of unassimilated sentimentality in his work—which I used to assume were concessions to the audience, but now I’m not so sure—only point to how unified and effortless Spielberg is at his best. But the conditions for making movies have also changed over the last four decades, and Nolan has threaded the needle in ways that still amaze me, as I continue to watch his career unfold in real time.

Nolan sometimes reminds me of the immortal Byron the Bulb in Gravity’s Rainbow, of which Thomas Pynchon writes: “Statistically…every n-thousandth light bulb is gonna be perfect, all the delta-q’s piling up just right, so we shouldn’t be surprised that this one’s still around, burning brightly.” He wrote and directed one of the great independent debuts, leveraged it into a career making blockbusters, and slowly became a director from whom audiences expected extraordinary achievements while he was barely out of the first phase of his career. And he keeps doing it. For viewers of college age or younger, he must feel like an institution, while I can’t stop thinking of him as an outlier that has yet to regress to the mean. Nolan’s most significant impact, for better or worse, may lie in the sheer, seductive implausibility of the case study that he presents. Over the last decade or so, we’ve seen a succession of young directors, nearly all of them white males, who, after directing a microbudgeted indie movie, are handed the keys to a huge franchise. This has been taken as an instance of category selection, in which directors who look a certain way are given opportunities that wouldn’t be offered to filmmakers of other backgrounds, but deep down, I think it’s just an attempt to find the next Nolan. If I were an executive at Warner Bros. whose career had overlapped with his, I’d feel toward him what Goethe felt of Napoleon: “[It] produces in me an impression like that produced by the Revelation of St. John the Divine. We all feel there must be something more in it, but we do not know what.” Nolan is the most exciting success story to date of a business model that he defined and that, if it worked, would solve most of Hollywood’s problems, in which independent cinema serves as a farm team for directors who can consistently handle big legacy projects that yield great reviews and box office. And it’s happened exactly once.

You can’t blame Hollywood for hoping that lightning will strike twice, but it’s obvious now that Nolan is like nobody else, and Dunkirk may turn out to be the pivotal film in trying to understand what he represents. I don’t think it’s his best or most audacious movie, but it was certainly the greatest risk, and he seems to have singlehandedly willed it into existence. Artistically, it’s a step forward for a director who sometimes seemed devoted to complexity for its own sake, telling a story of crystalline narrative and geographical clarity with a minimum of dialogue and exposition, with clever tricks with time that lead, for once, to a real emotional payoff. The technical achievement of staging a continuous action climax that runs for most of the movie’s runtime is impressive in itself, and Nolan, who has been gradually preparing for this moment for years, makes it look so straightforward that it’s easy to undervalue it. (Nolan’s great insight here seems to have been that by relying on the audience’s familiarity with the conventions of the war movie, he could lop off the first hour of the story and just tell the second half. Its nonlinear structure, in turn, seems to have been a pragmatic solution to the problem of how to intercut freely between three settings with different temporal and spatial demands, and Nolan strikes me as the one director both to whom it would have occurred and who would have actually been allowed to do it.) On a commercial level, it’s his most brazen attempt, even more than Inception, to see what he could do with the free pass that a director typically gets after a string of hits. And the fact that he succeeded, with a summer box office smash that seems likely to win multiple Oscars, only makes me all the more eager to see what he’ll do next.

It all amounts to the closest film in recent memory to what Omar Sharif once said of Lawrence of Arabia: “If you are the man with the money and somebody comes to you and says he wants to make a film that’s four hours long, with no stars, and no women, and no love story, and not much action either, and he wants to spend a huge amount of money to go film it in the desert—what would you say?” Dunkirk is half as long as Lawrence and consists almost entirely of action, and it isn’t on the same level, but the challenge that it presented to “the man with the money” must have been nearly as great. (Its lack of women, unfortunately, is equally glaring.) In fact, I can think of only one other director who has done anything comparable. I happened to see Dunkirk a few weeks after catching 2001: A Space Odyssey on the big screen, and as I watched the former movie last night, it occurred to me that Nolan has pulled off the most convincing Kubrick impression that any of us have ever seen. You don’t become the next Kubrick by imitating him, as Nolan did to some extent in Interstellar, but by figuring out new ways to tell stories using all the resources of the cinema, and somehow convincing a studio to fund the result. In both cases, the studio was Warner Bros., and I wonder if executives with long memories see Nolan as a transitional figure between Kubrick and the needs of the DC Extended Universe. It’s a difficult position for any director to occupy, and it may well prevent Nolan from developing along more interesting lines that his career might otherwise have taken. His artistic gambles, while considerable, are modest compared to even Barry Lyndon, and his position at the center of the industry can only discourage him from running the risk of being difficult or alienating. But I’m not complaining. Dunkirk is the story of a retreat, but it’s also the latest chapter in the life of a director who just can’t stop advancing.

Written by nevalalee

July 26, 2017 at 9:21 am

Quote of the Day

Scientific truth, like puristic truth, must come about by controversy. Personally this view is abhorrent to me. It seems to mean that scientific truth must transcend the individual, that the best hope of science lies in its greatest minds being often brilliantly and determinedly wrong, but in opposition, with some third, eclectically minded, middle-of-the-road nonentity seizing the prize while the great fight for it, running off with it, and sticking it into a textbook for sophomores written from no point of view and in defense of nothing whatsoever. I hate this view, for it is not dramatic and it is not fair; and yet I believe that it is the verdict of the history of science.

Edwin Boring, History, Psychology, and Science

Written by nevalalee

July 26, 2017 at 7:30 am

The elements of negation

In The Elements of Style, William Strunk and E.B. White provide the useful precept: “Put statements in positive form. Make definite assertions. Avoid timid, colorless, hesitating, noncommittal language. Use the word not as a means of denial or in antithesis, never as a means of evasion.” After offering a few illustrations for the sake of comparison, such as “He was not very often on time” as opposed to “He usually came late,” they conclude:

All [these] examples show the weakness inherent in the word not. Consciously or unconsciously, the reader is dissatisfied with being told only what is not; he wishes to be told what it is. Hence, as a rule, it is better to express even a negative in a positive form.

Along with all the other benefits that come with preferring positives over negatives, there’s the subtle point, which Strunk and White don’t mention explicitly, that it forces the writer to think just a little harder at a time when he or she would probably prefer otherwise. The sentence “Shakespeare does not portray Katharine as a very admirable character, nor does Bianca remain long in memory as an important character in Shakespeare’s works” is both longer and less interesting than “Katharine is disagreeable, Bianca insignificant,” but it’s also easier to write. It’s in that one additional pass, as the writer has to figure out what something is, rather than what it isn’t, that insight tends to happen. All else being equal, the best writing rules are the ones that oblige us to move beyond the obvious answer.

The other problem with negation is that it carries its positive form along with it, like an unwanted ghost or a double exposure. In Philosophical Investigations, Ludwig Wittgenstein writes, with my emphasis: “The feeling is as if the negation of a proposition had to make it true in a certain sense, in order to negate it.” Wittgenstein continues, in an oddly beautiful passage:

“If I say I did not dream last night, still I must know where to look for a dream; that is, the proposition ‘I dreamt,’ applied to this actual situation, may be false, but mustn’t be senseless.”—Does that mean, then, that you did after all feel something, as it were the hint of a dream, which made you aware of the place which a dream would have occupied?

Again: if I say “I have no pain in my arm,” does that mean that I have a shadow of the sensation of pain, which as it were indicates where the pain might be? In what sense does my present painless state contain the possibility of pain?

Or as he puts it a few paragraphs earlier: “A red patch looks different when it is there from when it isn’t there—but language abstracts from this difference, for it speaks of a red patch whether it is there or not.”

When it comes to conveying meaning, this fact has real practical consequences. As The Stanford Encyclopedia of Philosophy notes: “Not only are negative statements (e.g., ‘Paris isn’t the capital of Spain’) generally less informative than affirmatives (‘Paris is the capital of France’), they are morphosyntactically more marked (all languages have negative markers while few have affirmative markers) and psychologically more complex and harder to process.” In a footnote, it adds:

One consequence of the formal markedness asymmetry is that a negative statement embeds its affirmative counterpart within it; when Nixon famously insisted “I am not a crook” or Clinton “I did not have sex with that woman,” the concealed affirmation was more significant than the surface denial. The same asymmetry is exploited in non-denial denials, such as Republican campaign operative Mary Matalin’s disingenuous protest “We’ve never said to the press that Clinton’s a philandering, pot-smoking draft-dodger.”

Politics is the arena where literary style, like sociology, is tested in the real world, which makes it all the more striking to see how often politicians turn to the negative form when forced to issue denials. Like the phrase “Mistakes were made,” the “I am not a crook” statement has become such a cliché that you’d think that they would avoid it, but it still appears regularly—which implies that it fulfills some deep psychological need.

So what kind of need is it? The philosopher Henri Bergson gets close to the heart of the matter, I think, in a very evocative passage in Creative Evolution, which I’ve highlighted in a few places for emphasis:

Negation is not the work of pure mind, I should say of a mind placed before objects and concerned with them alone. When we deny, we give a lesson to others, or it may be to ourselves. We take to task an interlocutor, real or possible, whom we find mistaken and whom we put on his guard. He was affirming something: we tell him he ought to affirm something else (though without specifying the affirmation which must be substituted). There is no longer then, simply, a person and an object; there is, in face of the object, a person speaking to a person, opposing him and aiding him at the same time; there is a beginning of society. Negation aims at some one, and not only, like a purely intellectual operation, at some thing. It is of a pedagogical and social nature. It sets straight or rather warns—the person warned and set straight being, possibly by a kind of doubling, the very person who speaks.

Politicians are an unusual species because so many of their private utterances become public, and their verbal slips, as on the analyst’s couch, are where they reveal the most. Sometimes it feels as if we’ve overheard them talking to themselves. When Nixon said, “People have got to know whether or not their president is a crook,” he was introducing a word into the conversation that hadn’t been there before, because it had already been rattling around in his brain. And when a politician speaks in the negative, it offers us a peek into the secret conversation that he has been having in his head all along: “I am not a crook,” “I did not have sex with that woman,” “I did not collude.”

Quote of the Day

Not a single prophet, during more than a century of prophecies, analyzing the degradation of the romantic culture, or planning the split of the romantic atom, ever imagined anything like fascism. There was, in the lap of the future, communism and syndicalism and whatnot; there was anarchism, and legitimism, and even all-papacy; war, peace, pan-Germanism, pan-Slavism, Yellow Peril, signals to the planet Mars; there was no fascism. It came as a surprise to all, and to themselves, too.

Giuseppe Borgese, “The Intellectual Origins of Fascism”

Written by nevalalee

July 25, 2017 at 7:30 am

The secret museum

A while back, I published a novel titled The Icon Thief. It was inspired in part by Marcel Duchamp’s enigmatic installation Étant Donnés, which Jasper Johns once called “the strangest work of art in any museum.” From the moment I first saw it, I knew that it was destined to form the basis of a conspiracy thriller, and since someone clearly had to do it eventually, I figured that it might as well be me. (As Lin-Manuel Miranda said to Grantland: “What’s the thing that’s not in the world that should be in the world?”) Here’s how two characters in the book describe it:

“I went to see the installation last year,” Tanya said. “It’s in its own room at the Philadelphia Museum of Art. When you go inside, you see an antique wooden door set into a brick archway. At first, it looks like there’s nothing else there. But if you go closer to the door, you see light coming through a pair of eyeholes. And if you look inside—”

“—you see a headless woman on a bed of dry grass,” Maddy said. “She’s nude, and her face is missing or obscured. In one hand, she’s holding a lamp. There’s a forest with a moving waterfall in the background. Duchamp built the figure himself and covered it in calfskin. The illusion is perfect.”

And while I’ve noted the affinities between David Lynch and Duchamp before, last night’s episode of Twin Peaks, which featured the discovery of a headless body in a field—with one hand raised in a familiar pose—is the clearest indication that I’ve seen so far of an ongoing conversation between these two remarkable artists.

I’m not the first one to propose that Lynch was influenced by Étant Donnés, a connection that the director recently seemed to confirm himself. Five years ago, Lynch produced a lithograph titled E.D., pictured above, which depicts a mirror image of the body from the installation, partially concealed by what looks a lot to me like a velvet curtain. In his spectacularly useful monograph on the piece, the scholar Michael R. Taylor writes:

American filmmaker David Lynch…attended the Pennsylvania Academy of the Fine Arts between 1966 and 1967 and had a solo exhibition in 1969 at the Paley Library Gallery in Philadelphia, a time period that coincided with the public unveiling of Duchamp’s final work. Lynch’s interest in erotic tension and forbidden pleasure are particularly evident in the unsettling yet spellbindingly beautiful film Blue Velvet. In one particularly disturbing scene, the teenage character played by Kyle MacLachlan peers from behind the slats of a wardrobe door to witness a violent sexual encounter between a psychotic criminal (Dennis Hopper) and his female victim (Isabella Rossellini), apparently referencing earlier readings of Étant Donnés as a voyeuristic scene of sadistic violence.

In reality, Blue Velvet seems like less an intentional homage than a case of aesthetic convergence. Lynch has spoken of how the story came out of his youthful fantasies: “I had always wanted to sneak into a girl’s room to watch her into the night, and…maybe, at one point or another, I would see something that would be the clue to a murder mystery.” This is very close to the experience of seeing Étant Donnés itself, although, according to one source, “Lynch states to this day that he has not actually seen the piece in person.” And while I don’t think that he has any reason to lie, I also don’t see any particular reason to believe him.

In short, I was wrong when I wrote two weeks ago: “This might represent the only time in which my love of Twin Peaks will overlap with my professional interests.” And for those who are inclined to dig deeper, there are plenty of parallels between Lynch and Duchamp, aside from their obvious interest in voyeurism and the exposed female body. There’s the waterfall in the background, for one thing, and the fact that no photos of the interior were allowed to be published for fifteen years after it was unveiled—which reminds me a little of Laura telling Cooper that she’ll see him again in twenty-five years. But they also form a line of succession. Temperamentally, the two men couldn’t seem more different: Duchamp may have been “the most intelligent man of the twentieth century,” as Guillaume Apollinaire famously said, but his career came down to a series of chilly, not particularly funny jokes that can be appreciated solely on an intellectual level, not an emotional or visceral one. In other words, he’s very French. By comparison, Lynch is quintessentially American, and even his weirdest visual byways come from a place of real feeling. He’s not as penetrating or rigorous as Duchamp, but far more accessible and likable. On a more fundamental level, though, they can start to seem like brothers. Duchamp spent two decades building Étant Donnés in secret, and there’s something appealingly homemade about the result, with its trompe l’oeil effects cobbled together out of bits of wire and a biscuit tin. Lynch has always been the same sort of tinkerer, and he’s happiest while working on some oddball project at home, which makes it all the more amazing that he’s been entrusted on a regular basis with such huge projects. When you try to imagine Duchamp tackling Dune, you get a sense of how unlikely Lynch’s career has really been.

And the way in which Lynch has quietly revisited Étant Donnés at unpredictable intervals beautifully illustrates how influence works in the real world. When the installation was first put on display in Philadelphia, Lynch was in his early twenties, and even if he didn’t see it in person, it would have been hard to avoid hearing about it at length in art circles. It was a scandal, and a striking image or a work of art encountered at a formative age has a way of coming back into the light at odd times. I should know: I spent my teenage years thinking about Lynch, sketching images from his movies, and listening to Julee Cruise. Every now and then, I’ll see something in my own work that emerges from that undercurrent, even if I wasn’t aware of it at the time. (There’s a scene in The Icon Thief in which Maddy hides in a closet from the villain, and it’s only as I type this that I realize that it’s an amalgam of Lynch and Duchamp—Maddy fights him off with a snow shovel inspired by Duchamp’s In Advance of the Broken Arm.) And Lynch seems to have been haunted by his spiritual predecessor as much as he has haunted me. Lynch has said of his early interest in art: “I had this idea that you drink coffee, you smoke cigarettes, and you paint. And that’s it. Maybe girls come into it a little bit, but basically it’s the incredible happiness of working and living that life.” He claims that it was Robert Henri’s The Art Spirit that inspired him to construct his existence along those lines, but Duchamp was the best possible model. Of the countless artists who followed his example, Lynch just happens to be the one who became rich and famous. And as we enter the closing stretch of Twin Peaks, I can think of no better guide than Duchamp himself, who once said, in response to a question about what his work meant: “There is no solution because there is no problem.”

Written by nevalalee

July 24, 2017 at 8:58 am

Quote of the Day

Discovery…is a new idea emerging in connection with a fact found by chance or otherwise. Consequently, there can be no method for making discoveries, because philosophic theories can no more give inventive spirit and aptness of mind to men, who do not possess them, than knowledge of the laws of acoustics or optics can give a correct ear or good sight to men deprived of them by nature. But good methods can teach us to develop and use to better purpose the faculties with which nature has endowed us, while poor methods can prevent us from turning them to good account. Thus the genius of inventiveness, so precious in the sciences, may be diminished or even smothered by a poor method, while a good method may increase and develop it.

Claude Bernard, An Introduction to the Study of Experimental Medicine

Written by nevalalee

July 24, 2017 at 7:30 am

The way of the clerk

Side by side with [the laymen]…there existed until the last half century another, essentially distinct humanity, which to a certain extent acted as a check upon the former. I mean that class of men whom I shall designate “the clerks,” by whom I mean all those whose activity essentially is not the pursuit of practical aims, all those who seek their joy in the practice of an art or a science or metaphysical speculation, in short in the possession of non-material advantage, and hence in a certain manner say: “My kingdom is not of this world.” Indeed, throughout history, for more than two thousand years until modern times, I see an uninterrupted series of philosophers, men of religion, men of literature, artists, men of learning (one might say almost all during this period), whose influence, whose life, were in direct opposition to the realism of the multitudes…

Although these “clerks” founded the modern state to the extent that it dominates the individual egotisms, their activity undoubtedly was chiefly theoretical, and they were unable to prevent the laymen from filling all history with the noise of their hatreds and their slaughters; but the “clerks” did prevent the laymen from setting up their actions as a religion, they did prevent them from thinking themselves great men as they carried out these activities. It may be said that, thanks to the “clerks,” humanity did evil for two thousand years, but honored good. This contradiction was an honor to the human species, and formed the rift whereby civilization slipped into the world.

Julien Benda, The Treason of the Intellectuals

Written by nevalalee

July 23, 2017 at 7:30 am

The magic switch

If you understand the good magic trick, and I mean really understand it right down to the mechanics at the core of its psychology, the magic trick gets better, not worse…I like stripping things down to the absolute simplicity, and it seems like a ball and a hoop and a person is about as simple as you can get….You can’t look at a half-finished piece of magic and know whether it’s good or not. It has to be perfect before you can evaluate whether it’s good. Magic is a fantastically meticulous form. You forgive other forms. A musician misses a note, moves on, fine. He’ll come to the conclusion of the piece. Magic is an on/off switch. Either it looks like a miracle or it’s stupid.

Teller, to This American Life

Written by nevalalee

July 22, 2017 at 6:51 am

Off the hook

In his wonderful interview in John Brady’s The Craft of the Screenwriter, Robert Towne—who might best be described as the Christopher McQuarrie of his time—tosses off a statement that is typically dense with insight:

One of the things that people say when they first start writing movies is, “Jeez, I have this idea for a movie. This is the way it opens. It’s a really great opening.” And of course they don’t know where to go from there. That’s true not only of people who are just thinking of writing movies, but very often of people who write them. They’re anxious for a splashy beginning to hook an audience, but then you end up paying for it with an almost mathematical certainty. If you have a lot of action and excitement at the beginning of a picture, there’s going to have to be some explanation, some character development somewhere along the line, and there will be a big sag about twenty minutes after you get into a film with a splashy opening. It’s something you learn. I don’t know if you’d call it technique. It’s made me prefer soft openings for films. It’s been my experience that an audience will forgive you almost anything at the beginning of the picture, but almost nothing at the end. If they’re not satisfied with the end, nothing that led up to it is going to help.

There’s a lot to absorb and remember here, particularly the implication, which I love, that a narrative has a finite amount of energy, and that if you use up too much of it at the start, you end up paying for it later.

For now, though, I’d like to focus on what Towne says about openings. He’s right in cautioning screenwriters against trying to start at a high point, which may not even be possible: I’ve noted elsewhere that few of the great scenes that we remember from movies come at the very beginning, since they require a degree of setup to really pay off. Yet at this very moment, legions of aspiring writers are undoubtedly sweating over a perfect grabber opening for their screenplay. In his interview with Brady, which was published in 1981, Towne blames this on television:

Unlike television, you don’t have to keep people from turning the channel to another network when they’re in the theater. They’ve paid three-fifty or five dollars and if the opening ten or fifteen minutes of a film are a little slow, they are still going to sit a few minutes, as long as it eventually catches hold. I believe in soft openings…Why bother to capture [the audience’s] interest at the expense of the whole film? They’re there. They’re not going anywhere.

William Goldman draws a similar contrast between the two forms in Adventures in the Screen Trade, writing a clumsy opening hook for a screenplay—about a girl being chased through the woods by a “disfigured giant”—and explaining why it’s bad: “Well, among other things, it’s television.” He continues:

This paragraph contains all that I know about writing for television. They need a hook. And they need it fast. Because they’re panicked you’ll switch to ABC. So TV stuff tends to begin with some kind of grabber. But in a movie, and only at the beginning of a movie, we have time. Not a lot, but some.

And while a lot has changed since Towne and Goldman made these statements, including the “three-fifty” that used to be the price of a ticket, the underlying point remains sound. Television calls for a different kind of structure and pacing than a movie, and screenwriters shouldn’t confuse the two. Yet I don’t think that the average writer who is fretting about the opening of his script is necessarily making that mistake, or thinking in terms of what viewers will see in a theater. I suspect that he or she is worrying about a very different audience—the script reader at an agency or production company. A moviegoer probably won’t walk out if the opening doesn’t grab them, but the first reader of a screenplay will probably toss it aside if the first few pages don’t work. (This isn’t just the case with screenwriters, either. Writers of short stories are repeatedly told that they need to hook the reader in the first paragraph, and the result is often a kind of palpable desperation that can actively turn off editors.) One reason, of course, why Towne and Goldman can get away with “soft” openings is that they’ve been successful enough to be taken seriously, both in person and in print. As Towne says:

There have been some shifts in attitudes toward me. If I’m in a meeting with some people, and if I say, “Look, fellas, I don’t think it’s gonna work this way,” there is a tendency to listen to me more. Before, they tended to dismiss a little more quickly than now.

Which, when you think about it, is exactly the same phenomenon as giving the script the benefit of the doubt—it buys Towne another minute or two to make his point, which is all a screenwriter can ask.

The sad truth is that a script trying to stand out from the slush pile and a filmed narrative have fundamentally different needs. In some cases, they’re diametrically opposed. Writers trying to break into the business can easily find themselves caught between the need to hype the movie on the page and their instincts about how the story deserves to be told, and that tension can be fatal. A smart screenwriter will often draw a distinction between the selling version, which is written with an eye to the reader, and the shooting script, which provides the blueprint for the movie, but most aspiring writers don’t have that luxury. And if we think of television as a model for dealing with distracted viewers or readers, it’s only going to get worse. In a recent essay for Uproxx titled “Does Anyone Still Have Time to Wait For Shows to Get Good?”, the legendary critic Alan Sepinwall notes that the abundance of great shows makes it hard to justify waiting for a series to improve, concluding:

We all have a lot going on, in both our TV and non-TV lives, and if you don’t leave enough breadcrumbs in the early going, your viewers will just wander off to watch, or do, something else. While outlining this post, I tweeted a few things about the phenomenon, phrasing it as “It Gets Better After Six Episodes”—to which many people replied with incredulous variations on, “Six? If it’s not good after two, or even one, I’m out, pal.”

With hundreds of shows instantly at our disposal—as opposed to the handful of channels that existed when Towne and Goldman were speaking—we’ve effectively been put into the position of a studio reader with a stack of scripts. If we don’t like what we see, we can move on. The result has been the emotionally punishing nature of so much peak television, which isn’t about storytelling so much as heading off distraction. And if it sometimes seems that many writers can’t do anything else, it’s because it’s all they were ever taught to do.

Quote of the Day

Every pattern is an obstacle to new patterns, to the extent that the first pattern is inflexible…To make progress, individual originality must be able to express itself. In order that the originality of the idealist whose dreams transcend his century may find expression, it is necessary that the originality of the criminal, who is below the level of his time, shall also be possible. One does not occur without the other.

Émile Durkheim, The Rules of Sociological Method

Written by nevalalee

July 21, 2017 at 7:30 am

The Uber Achievers

In 1997, the computer scientist Niklaus Wirth, best known as the creator of Pascal, gave a fascinating interview to the magazine Software Development, which I’ve quoted here before. When asked if it would be better to design programming languages with “human issues” in mind, Wirth replied:

Software development is technical activity conducted by human beings. It is no secret that human beings suffer from imperfection, limited reliability, and impatience—among other things. Add to it that they have become demanding, which leads to the request for rapid, high performance in return for the requested high salaries. Work under constant time pressure, however, results in unsatisfactory, faulty products.

When I read this quotation now, I think of Uber. As a recent story by Caroline O’Donovan and Priya Anand of Buzzfeed makes clear, the company that seems to have alienated just about everyone in the world didn’t draw the line at its own staff: “Working seven days a week, sometimes until 1 or 2 a.m., was considered normal, said one employee. Another recalled her manager telling her that spending seventy to eighty hours a week in the office was simply ‘how Uber works.’ Someone else recalled working eighty to one hundred hours a week.” One engineer, who is now in therapy, recalled: “It’s pretty clear that giving that much of yourself to any one thing is not healthy. There were days where I’d wake up, shower, go to work, work until midnight or so, get a free ride home, sleep six hours, and go back to work. And I’d do that for a whole week.”

“I feel so broken and dead,” one employee concluded. But while Uber’s internal culture was undoubtedly bad for morale, it might seem hard at first to make the case that the result was an “unsatisfactory, faulty” product. As a source quoted in the article notes, stress at the company led to occasional errors: “If you’ve been woken up at 3 a.m. for the last five days, and you’re only sleeping three to four hours a day, and you make a mistake, how much at fault are you, really?” Yet the Uber app itself is undeniably elegant and reliable, and the service that it provides is astonishingly useful—if it weren’t, we probably wouldn’t even be talking about it now. When we look at what else Wirth says, though, the picture becomes more complicated. All italics in the following are mine:

Generally, the hope is that corrections will not only be easy, because software is immaterial, but that the customers will be willing to share the cost. We know of much better ways to design software than is common practice, but they are rarely followed. I know of a particular, very large software producer that explicitly assumes that design takes twenty percent of developers’ time, and debugging takes eighty percent. Although internal advocates of an eighty percent design time versus twenty percent debugging time have not only proven that their ratio is realistic, but also that it would improve the company’s tarnished image. Why, then, is the twenty-percent design time approach preferred? Because with twenty-percent design time your product is on the market earlier than that of a competitor consuming eighty-percent design time. And surveys show that the customer at large considers a shaky but early product as more attractive than a later product, even if it is stable and mature.

This description applies perfectly to Uber, as long as we remember that its “product” isn’t bounded by its app alone, but extends to its impact on drivers, employees, competitors, and the larger community in which it exists—or what an economist would call its externalities. Taken as a closed system, the Uber experience is perfect, but only because it pushes its problems outside the bounds of the ride itself. When you look at the long list of individuals and groups that its policies have harmed, you discern the outlines of its true product, which can be described as the system of interactions between the Uber app and the world. You could say this of most kinds of software, but it’s particularly stark for a service that is tied to the problem of physically moving its customers from one point to another on the earth’s surface. By that standard, “shaky but early” describes Uber beautifully. It certainly isn’t “stable and mature.” The company expanded to monstrous proportions before basic logistical, political, and legal matters had been resolved, and it acted as if it could simply bull its way through any obstacles. (Its core values, let’s not forget, included “stepping on toes” and “principled confrontation.”) Up to a point, it worked, but something had to give, and economic logic dictated that the stress fall on the human factor, which was presumably resilient enough to absorb punishment from the design and technology sides. One of the most striking quotes in the Buzzfeed article comes from Uber’s chief human resources officer: “Many employees are very tired from working very, very hard as the company grew. Resources were tight and the growth was such that we could never hire sufficiently, quickly enough, in order to keep up with the growth.” To assert that “resources were tight” at the most valuable startup on the planet seems like a contradiction in terms, and it would be more accurate to say that Uber decided to channel massive amounts of capital in certain directions while neglecting the areas that it cynically assumed could take the strain.

But it was also right, until it wasn’t. Human beings are extraordinarily resilient, as long as you can convince them to push themselves past the limits of their ability, or at least to do work at rates that you can afford. In the end, they burn out, but there are ways to postpone that moment or render it irrelevant. When it came to its drivers, Uber benefited from a huge pool of potential contractors, which made turnover a statistical, rather than an individual, problem. With its corporate staff and engineers, there was always the power of money, in the form of equity in the company, to persuade people to stay long past the point where they would have otherwise quit. The firm gambled that it would lure in plenty of qualified hires willing to trade away their twenties for the possibility of future wealth, and it did. (As the Buzzfeed article reveals, Uber seems to have approached compensation for its contractors and employees in basically the same way: “Uber acknowledges that it pays less than some of its top competitors for talent…The Information reported that Uber uses an algorithm to estimate the lowest possible compensation employees will take in order to keep labor costs down.”) When the whole system finally failed, it collapsed spectacularly, and it might help to think of Uber’s implosion, which unfolded over less than six months, as a software crash, with bugs that had been ignored or hastily patched cascading in a chain reaction that brought down the entire program. And the underlying factor wasn’t just a poisonous corporate culture or the personality of its founder, but the sensibility that Wirth identified two decades ago, as a company rushed to get a flawed idea to market on the assumption that consumers—or society as a whole—would bear the costs of correcting it. As Wirth asks: “Who is to blame for this state of affairs? The programmer turned hacker; the manager under time pressure; the business man compelled to extol profit wherever possible; or the customer believing in promised miracles?”

Written by nevalalee

July 20, 2017 at 8:29 am

Quote of the Day

with one comment

Written by nevalalee

July 20, 2017 at 7:30 am
