Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘AVQ&A’

The unusual suspects


The Usual Suspects

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What 1995 pop culture would you want to experience again for the first time?”

Yesterday, while discussing a scene from one of my own novels, I mentioned two movies in passing: The Usual Suspects and Seven. These references appeared in separate paragraphs, to illustrate two different ideas, and I don’t think I made any particular connection between them at the time. Obviously, though, they’re a natural pair: they collectively made a star out of Kevin Spacey, and they were released within a month of each other in 1995. (In fact, I vividly remember watching them both for the first time on home video on the same weekend, although this wouldn’t have been until the year after, when Spacey had already won his Oscar. Seven made a greater immediate impression, but I’d go on to watch my tape of The Usual Suspects maybe a dozen times over the next couple of years.) When I cited them here, I didn’t think much about it. I’ve thought about both of these movies a lot, and they served as convenient genre touchstones for the points I wanted to make. And I took for granted that most readers of this blog would have seen them, or at least be familiar enough with them for their examples to be useful.

But this may have been an unwarranted assumption. In one’s own life, twenty years can pass like the blink of an eye, but in pop culture terms, it’s a long time. If we take a modern high school sophomore’s familiarity with the movies of two decades ago as the equivalent of my knowledge of the films of 1975, we soon see that we can’t assume anything at all. I saw myself then as a film buff, and although I can laugh a little now at how superficial any teenager’s grasp of movie history is likely to be, I was genuinely curious about the medium and eager to explore its past. Looking at a list of that year’s most notable movies, though, I’m chagrined at how few of them I’ve seen even now. There was Jaws, of course, and my obsession with Kubrick made me one of the few teens who willingly sat through all of Barry Lyndon. I’m fairly sure I’d seen One Flew Over the Cuckoo’s Nest and Nashville at that point, although the chronology is a bit muddled, and both were films I had to actively seek out, as I did later with Amarcord. The Rocky Horror Picture Show had premiered on television a few years earlier on Fox, and I watched it, although I don’t have the slightest idea what I thought of it at the time. And I didn’t rent Dog Day Afternoon until after college.

Anthony Hopkins in Nixon

In fact, I’d guess that the only two movies from that year that your average teenage boy is likely to have seen, then and now, are Jaws and Monty Python and the Holy Grail. Even today, there are big gaps in my own knowledge of the year’s top grossers: I’ve still never seen Shampoo, despite its status as one of the three great Robert Towne scripts, and I hadn’t even heard of Aloha, Bobby, and Rose. When we advance the calendar by two decades, the situation looks much the same. Toy Story, the biggest hit of that year, is still the one that most people have seen. I’m guessing that Heat and Die Hard With a Vengeance hold some allure for budding genre fans, as do Clueless and Sense and Sensibility for a somewhat different crowd. The Usual Suspects and Seven are safe. And I’d like to think that Casino still draws in younger viewers out of its sheer awesomeness, which makes even The Wolf of Wall Street seem slightly lame. But many of the other titles here are probably just names, the way Funny Lady or The Apple Dumpling Gang are to me, and it would take repeated acts of diligence to catch up with some of these movies, now that another twenty years of cinema have flowed under the bridge. Awards completists will check out Braveheart, Apollo 13, Babe, and Leaving Las Vegas, but there are countless other worthy movies that risk being overlooked.

Take Nixon, for example. At the time, I thought it was the best film of its year, and while I wouldn’t rank it so highly these days, it’s still a knockout: big, ambitious, massively entertaining, and deeply weird. It has one of the greatest supporting casts in movies, with an endlessly resourceful lead performance by Anthony Hopkins that doesn’t so much recall Nixon himself as create an indelible, oddly sympathetic monster of its own. But even on its initial release, it was a huge flop, and it hasn’t exactly inspired a groundswell of reappraisal. Even if you’re an Oliver Stone fan—and I don’t know how many devotees he has under the age of thirty—it’s probably not among the first five of his movies that anyone is likely to check off. (The rough equivalent would be a diehard Coppola enthusiast deciding it was time to watch The Cotton Club.) The only reason I’ve seen it is that I was old enough to catch it in theaters, whereas I’ve never made time to rent Salvador or Talk Radio. And if I were talking to a bright fifteen-year-old who wanted to see some good movies, I don’t know when Nixon would come up, if ever. But if it’s worth mentioning at all, it’s less for its own merits than as part of a larger point. Everyone will give you a list of movies to watch, but there’s a lot worth discovering that you’ll have to seek out on your own, once you move past the usual suspects.

Written by nevalalee

June 12, 2015 at 9:51 am

Under the covers


The Goldfinch by Donna Tartt

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What great albums do you love that have ugly album covers?”

There are two kinds of readers in this world: those who keep the dust jackets on their books, and those who take them off. For most of my life, I’ve been in the latter camp. Whenever I’m out with a hardcover, I’ll usually leave the dust jacket behind, and although I’ll restore it as soon as the book is back on the shelf, I feel more comfortable carrying an anonymous spine in public. The reasons can be a little hard to parse, even for me. On a practical level, an unsecured dust jacket can be cumbersome and inconvenient: it has a way of slipping up or down whenever you’re reading a book that isn’t flat on a table, which leads to rumpled and torn corners. Really, though, it’s a matter of discretion. I don’t necessarily want to advertise what I’m reading for everyone else to see, and a book cover, among other things, is an advertisement, as well as an invitation to judge. Whenever we’re in close proximity to other readers, we all do it, but I prefer to avoid it entirely. Reading, for me, is an immersion in a private world, and what I do there is my own business. And this holds true whether or not the title could be construed as odd or embarrassing. (Only once in my adult life have I ever constructed a paper slipcover to conceal the cover of a book I was reading on the subway. It was the Bible.)

This is particularly true of covers that aggressively sell the contents to the point of active misrepresentation, which seems to be the case pretty often. As I’ve said before in reference to my own novels, a book’s cover art is under tremendous pressure to catch the buyer’s eye: frequently, it’s the only form of advertising a book ever gets. Hence the chunky fonts, embossed letters, and loud artwork that help a book stand out on shelves, but feel vaguely obscene when held in the hand. And the cover image need bear little resemblance to the material inside. Back in the heyday of pulp fiction, seemingly every paperback original was sold with the picture of a girl with a gun, even if the plot didn’t include any women at all. Hard Case Crime, the imprint founded by my friend and former colleague Charles Ardai, has made a specialty of publishing books with covers that triangulate camp, garishness, and allure, and sometimes it gleefully pushes the balance too far. I was recently tempted to pick up a copy of their reprint of Michael Crichton’s Binary, an early pulp thriller written under the pseudonym John Lange, but the art was about ten percent too lurid: I just couldn’t see myself taking it on a plane. There’s no question that it stood out in the store, but it made me think twice about taking it home.

Binary by John Lange

In theory, once we’ve purchased a book, album, or movie, its cover’s work is done, as with any other kind of packaging. And yet we also have to live with it, even if the degree of that engagement varies a lot from one medium to another. In an ideal world, every book would come with two covers—one to grab the browser’s eye, the other to reside comfortably on a shelf at home—and in fact, a lot of movies take this approach: the boxes for my copies of The Godfather Trilogy and The Social Network, among others, come with a flimsy fake cover to display in stores, designed to be removed to present a more sober front at home. It’s not so different from the original function of a dust jacket, which was meant solely as a protective covering to be thrown away after the book was purchased. In practice, I don’t feel nearly the same amount of ambivalence toward ugly DVD or album covers as I do with books: the experience of watching a movie or listening to music is detachable from the container in which it arrives, while a book is all of a piece. That said, there are a couple of movies in my collection, like Say Anything, that I wish didn’t look so egregiously awful. And like a lot of Kanye fans, I always do a double take when the deliberately mortifying cover art for My Beautiful Dark Twisted Fantasy pops up in my iTunes queue.

But I don’t often think consciously about album art these days, any more than I can recall offhand how the box covers look for most of my movies. And there’s a sense in which such packaging has grown increasingly disposable. For many of us, the only time we’ll see the cover art for a movie or album is as a thumbnail on Amazon before we click on it to download. Even if we still buy physical discs, the jewel case is likely to be discarded or lost in a closet as soon as we’ve uploaded it in digital form. Covers have become an afterthought, and the few beautiful examples that we still see feel more like they’re meant to appeal to the egos of the artists or designers, as well as to a small minority of devoted fans. But as long as physical media still survive, the book is the one format in which content and packaging will continue to exist as a unit, and although we’ll sometimes have to suffer through great books with bad covers, we can also applaud the volumes in which form and content tell a unified story. Pick up a novel like The Goldfinch, and you sense at once that you’re in good hands: regardless of how you feel about the book itself, the art, paper, and typesetting are all first-rate—it’s like leafing through a Cadillac. I feel happy whenever I see it on my shelf. And one of these days, I may even finish reading it.

Written by nevalalee

May 29, 2015 at 10:19 am

The middle ground


The Mad Men episode "In Care Of"

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What series are you waiting to dive into until you can do it all at once?”

Yesterday, while leafing through a recent issue of The New Yorker, I came across the following lines in a book review by James Wood:

[Amit Chaudhuri] has struggled, as an Indian novelist writing in English, with the long shadow of Salman Rushdie’s Booker-winning novel Midnight’s Children…and with the notion, established in part by the success of that book, that fictional writing about Indian life should be noisy, magical, hybrid, multivocally “exotic”—as busy as India itself…He points out that in the Bengali tradition “the short story and novella have predominated at least as much as the novel,” and that there are plenty of Indian writers who have “hoped to suggest India by ellipsis rather than by all-inclusiveness.”

Wood, who is no fan of the “noisy, magical, hybrid” form that so many modern novels have assumed, draws an apt parallel to “the ceaseless quest for the mimetically overfed Great American Novel.” But an emphasis on short, elliptical fiction has been the rule, rather than the exception, in our writing programs for years. And a stark division between big and small seems to be true of most national literatures: think of Russia, for instance, in which Eugene Onegin stands as the only real rival, as a secular scripture, to the loose, baggy monsters of Tolstoy and Dostoyevsky.

Yet most works of art, inevitably, end up somewhere in the middle. If we don’t tend to write essays or dissertations about boringly midsized novels, which pursue their plot and characters for the standard three hundred pages or so, it’s for much the same reason that we don’t hear much about political moderates: we may be in the majority, but it isn’t news. Our attention is naturally drawn to the extreme, which may be more interesting to contemplate, but which also holds the risk that we’ll miss the real story by focusing on the edges. When we think about film editing, for instance, we tend to focus on one of two trends: the increasingly rapid rate of cutting, on the one hand, and the fetishization of the long take, on the other. In fact, the average shot length has been declining at a more or less linear rate ever since the dawn of the sound era, and over the last quarter of a century, it’s gone from about five seconds to four—a change that is essentially imperceptible. The way a movie is put together has remained surprisingly stable for more than a generation, and whatever changes of pace we do find are actually less extreme than we might expect from the corresponding technical advances. Digital techniques have made it easier than ever to construct a film out of very long or very short shots, but most movies still fall squarely in the center of the bell curve. And in terms of overall length, they’ve gotten slightly longer, but not by much.

Emilia Clarke on Game of Thrones

That’s true of other media as well. Whenever I read think pieces about the future of journalism, I get the impression that we’ve been given a choice between the listicle and the longread: either we quickly skim a gallery of the top ten celebrity pets, or we devote an entire evening to scrolling through a lapbreaker like “Snow Fall.” Really, though, most good articles continue to fall in the middle ground; it’s just hard to quantify what makes the best ones stand out, and it’s impossible to reduce it to something as simple as length or format. Similarly, when it comes to what we used to call television, the two big stories of the last few years have been the dueling models of Vine and Netflix: it seems that either we can’t sit still for more than six seconds at a time, or we’re eager to binge on shows for hours and hours. There are obvious generational factors at play here—I’ve spent maybe six seconds total on Vine—but the division is less drastic than it might appear. In fact, I suspect that most of us still consume content in the way we always have, in chunks of half an hour to an hour. Mad Men was meant to be seen like this; so, in its own way, was Community, which bucked recent trends by releasing an episode per week. But it isn’t all that interesting to talk about how to make a great show that looks more or less like the ones that have come before, so we don’t hear much about it.

Which isn’t to say that the way we consume and think about media hasn’t changed. A few years ago, the idea of waiting to watch a television show until its entire run was complete might have seemed ridiculous; now, it’s an option that many of us seriously consider. (The only series I’ve ever been tempted to wait out like this was Lost, and it backfired: once I got around to starting it, the consensus was so strong that it went nowhere that I couldn’t bring myself to get past the second season.) But as I’ve said before, it can be a mistake for a television show—or any work of art—to proceed solely with that long game in mind, without the pressure of engaging with an audience from week to week. We’re already starting to see some of the consequences in Game of Thrones, which thinks entirely in terms of seasons, but often forgets to make individual scenes worth watching on a level beyond, “Oh, let’s see what this guy is doing.” But a show that focuses entirely on the level of the scene or moment can sputter out after a few seasons, or less: Unbreakable Kimmy Schmidt had trouble sustaining interest in its own premise for even thirteen episodes. The answer, as boring as it may be, lies in the middle, or in the narratives that think hard about telling stories in the forms that have existed before, and will continue to exist. The extremes may attract us. But it’s in the boring middle ground that the future of an art form is made.

An unfinished decade


Joaquin Phoenix in The Master

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What movie from our best films of the decade so far list doesn’t deserve to be on there?”

Toward the end of the eighties, Premiere Magazine conducted a poll of critics, directors, writers, and industry insiders to select the best films of the previous decade. The winners, in order of the number of votes received, were Raging Bull, Wings of Desire, E.T., Blue Velvet, Hannah and Her Sisters, Platoon, Fanny and Alexander, Shoah, Who Framed Roger Rabbit, and Do the Right Thing, with The Road Warrior, Local Hero, and Terms of Endearment falling just outside the top ten. I had to look up the list to retype it here, but I also could have reconstructed much of it from memory: a battered copy of Premiere’s paperback home video guide—which seems to have vanished from existence, along with its parent magazine, based on my inability, after five minutes of futile searching, to even locate the title online—was one of my constant companions as I started exploring movies more seriously in high school. And if the list contains a few headscratchers, that shouldn’t be surprising: the poll was held a few months before the eighties were technically even over, which isn’t close to enough time for a canon to settle into a consensus.

So how would an updated ranking look? The closest thing we have to a more recent evaluation is the latest Sight & Sound critics’ poll of the best films ever made. If we pull out only the movies from the eighties, the top films are Shoah, Raging Bull, Blade Runner, Blue Velvet, Fanny and Alexander, A City of Sadness, Do the Right Thing, L’Argent, The Shining, and My Neighbor Totoro, followed closely by Come and See, Distant Voices Still Lives, and Once Upon a Time in America. There’s a degree of overlap here, and Raging Bull was already all but canonized when the earlier survey took place, but Wings of Desire, which once came in second, is nowhere in sight, its position taken by a movie—Blade Runner—that didn’t even factor into the earlier conversation. The Shining received the vote of just a single critic in the Premiere poll, and at the time it was held, My Neighbor Totoro wouldn’t be widely seen outside Japan for another three years. Still, if there’s a consistent pattern, it’s hard to see, aside from the obvious point that it takes a while for collective opinion to stabilize. Time is the most remorseless, and accurate, critic of them all.

Inception

And carving up movies by decade is an especially haphazard undertaking. A decade is an arbitrary division, much more so than a single year, in which the movies naturally engage in a kind of accidental dialogue. It’s hard to see the release date of Raging Bull as anything more than a quirk of the calendar: it’s undeniably the last great movie of the seventies. You could say much the same of The Shining. And there’s pressure to make any such list conform to our idea of what a given decade was about. The eighties, at least at the time, were seen as a moment in which the auteurism of the prior decade was supplanted by a blockbuster mentality, encouraged, as Tony Kushner would have it, by an atmosphere of reactionary politics, but of course the truth is more complicated. Blue Velvet harks back to the fifties, but the division at its heart feels like a product of Reaganism, and the belated ascent of Blade Runner is an acknowledgment of the possibilities of art in the era of Star Wars. (As an offhand observation, I’d say that we find it easier to characterize decades if their first years happen to coincide with a presidential election. As a culture, we know what the sixties, eighties, and aughts were “like” far more than the seventies or nineties.)

So we should be skeptical of the surprising number of recent attempts to rank works of art when the decade in question is barely halfway over. This week alone, The A.V. Club did it for movies, while The Oyster Review did it for books, and even if we discount the fact that we have five more years of art to anticipate, such lists are interesting mostly in the possibilities they suggest for later reconsideration. (The top choices at The A.V. Club were The Master, A Separation, The Tree of Life, Frances Ha, and The Act of Killing, and looking over the rest of the list, about half of which I’ve seen, I’d have to say that the only selection that really puzzled me was Haywire.) As a culture, we may be past the point where a consensus favorite is even possible: I’m not sure if any one movie occupies the same position for the aughts that Raging Bull did for the eighties. If I can venture one modest prediction, though, it’s that Inception will look increasingly impressive as time goes on, for much the same reason as Blade Runner does: it’s our best recent example of an intensely personal vision thriving within the commercial constraints of the era in which it was made. Great movies are timeless, but also of their time, in ways that can be hard to sort out until much later. And that’s true of critics and viewers, too.

The curated past of Mad Men


Jessica Paré and Jon Hamm on Mad Men

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What has Mad Men inspired you to seek out?”

Now that Mad Men is entering its final stretch at last, it’s time to acknowledge a subtle but important point about the source of its appeal. This is my favorite television drama of all time. I’m not going to argue that it’s the greatest series ever—we’ll need another decade or two to make that appraisal with a cool head—but from one scene to the next, one episode after another, it’s provided me with more consistent pleasure and emotion than any show I can name. I’ve spoken before, perhaps too often, about what I like to call its fractal quality: the tiniest elements start to feel like emblems of the largest, and there’s seemingly no limit to how deep you can drill while analyzing even the smallest of touches. For proof, we need look no further than the fashion recaps by Tom and Lorenzo, which stand as some of the most inspired television criticism of recent years. The choice of a fabric or color, the reappearance of a dress or crucial accessory, a contrast between the outfits of one character and another turn out to be profoundly expressive of personality and theme, and it’s a testament to the genius of both costume designer Janie Bryant and Matthew Weiner, the ultimate man behind the curtain.

Every detail in Mad Men, then, starts to feel like a considered choice, and we can argue over the meaning and significance of each one for days. But that’s also true of any good television series. By definition, everything we see in a work of televised fiction is there because someone decided it should be, or didn’t actively prevent it from appearing. Not every showrunner is as obsessed with minutiae as Weiner is, but it’s invariably true of the unsung creative professionals—the art director, the costume designer, the craftsmen responsible for editing, music, cinematography, sound—whose contributions make up the whole. Once you’ve reached the point in your career where you’re responsible for a department in a show watched by millions, you’re not likely to achieve your effects by accident: even if your work goes unnoticed by most viewers, every prop or bit of business is the end result of a train of thought. I don’t have any doubt that the costume designers for, say, Revenge or The Vampire Diaries would, if asked, have as much to say about their craft as Janie Bryant does. But Mad Men stands alone in the current golden age of television in actually inspiring that kind of routine scrutiny for each of its aesthetic choices, all of which we’re primed to unpack for clues.

Jon Hamm and Matthew Weiner on the set of Mad Men

What sets it apart, of course, is its period setting. With a series set in the present day, we’re more likely to take elements like costume design and art direction for granted; it takes a truly exceptional creative vision, like the one we find in Hannibal, to encourage us to study those choices with a comparable degree of attention. In a period piece, by contrast, everything looks exactly as considered as it really is: we know that every lamp, every end table, every cigarette or magazine cover has been put consciously into place, and while we might appreciate this on an intellectual level with other shows, Mad Men makes us feel it. And its relatively recent timeframe makes those touches even more evident. When you go back further, as with a show like Downton Abbey, most of us are less likely to think about the decisions a show makes, simply because it’s more removed from our experience: only a specialist would take an interest in which kind of silverware Mrs. Hughes sets on the banquet table, rather than another, and we’re likely to think of it as a recreation, not a creation. (This even applies to a series like Game of Thrones, in which it’s easy to take the world it makes at face value, at least until the seams start to show.) But the sixties are still close enough that we’re able to see each element as a choice between alternatives. As a result, Mad Men seems curated in a way that neither a contemporary nor a more remote show would be.

I’m not saying this to minimize the genuine intelligence behind Mad Men’s look and atmosphere. But it’s worth admitting that if we’re more aware of it than usual, it’s partially a consequence of that canny choice of period. Just as a setting in the recent past allows for the use of historical irony and an oblique engagement with contemporary social issues, it also encourages the audience to regard mundane details as if they were charged with significance. When we see Don Draper reading Bernard Malamud’s The Fixer, for instance, we’re inclined to wonder why, and maybe even check it out for ourselves. And many of us have been influenced by the show’s choices of fashion, music, and even liquor. But its real breakthrough lay in how those surface aspects became an invitation to read more deeply into the elements that mattered. Even if we start to pay less attention to brand names or articles of set dressing, we’re still trained to watch the show as if everything meant something, from a line of throwaway dialogue to Don’s lingering glance at Megan at the end of “Hands and Knees.” Like all great works of art, Mad Men taught us how to watch it, and as artists as different as Hitchcock and Buñuel understood, it knew that it could only awaken us to its deepest resonances by enticing us first with its surfaces. It turned us all into noticers. And the best way to honor its legacy is by directing that same level of attention onto all the shows we love.

Written by nevalalee

April 3, 2015 at 9:33 am

The opening act dilemma


Ronald McDonald

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “Have you ever gone to a concert just for the opener?”

Earlier this week, I described the initial stages of creating a brand, whether commercial or artistic, as a kind of charitable enterprise: you’ve got to be willing to lose money for years to produce anything with a chance of surviving. I was speaking primarily of investors and patrons, but of course, it’s also true of artists themselves. A career in the arts requires an enormous initial investment of time, energy, and money—at least in the form of opportunity cost, as you choose not to pursue more remunerative forms of making a living—and a major factor separating those who succeed from those who don’t is the amount of pain they’re willing to endure. David Mamet famously said that everyone gets a break in show business in twenty-five years: some get it at the beginning, others at the end, and all you can really control is how willing you are to stick around after everyone else has gone home. That’s always been true, but more recently, it’s led to a growing assumption that emerging artists should be willing, even eager, to give work away for free. With media of all kinds being squeezed on both sides by increasing competition and diminishing audiences, there’s greater pressure than ever to find cheap content, and the most reliable source has always been hungry unknowns desperate for any kind of exposure.

And that last word is an insidious one. Everybody wants exposure—who wouldn’t?—but its promise is often used to justify arrangements in which artists are working for nothing, or at a net loss, for companies that aren’t in it for charity. Earlier this month, McDonald’s initially declined to pay the bands scheduled to play at its showcase at South by Southwest, saying instead that the event would be “a great opportunity for additional exposure.” (This took the form of the performers being “featured on screens throughout the event, as well as possibly mentioned on McDonald’s social media accounts.”) When pressed on this, the company replied sadly: “There isn’t a budget for an artist fee.” Ultimately, after an uproar that canceled out whatever positive attention it might have expected, it backtracked and agreed to compensate the artists. And even if this all sort of went nowhere, it serves as a reminder of how craven even the largest corporations can be when it comes to fishing for free content. McDonald’s always seeks out the cheapest labor it can, cynically passing along the hidden human costs to the rest of society, so there’s no reason to expect it to be any different when it comes to music. As Mamet says of movie producers, whenever someone talks to you about “exposure,” what they’re really saying is: “Let me take that cow to the fair for you, son.”

Ronald McDonald

That said, you can’t blame McDonald’s for seizing an opportunity when it saw one. If there are two groups of artists who have always been willing to work for free, it’s writers and musicians, and it’s a situation that has been all but institutionalized by how the industries themselves are structured. A few months ago, Billboard published a sobering breakdown of the costs of touring for various tiers of performers. For a headliner like Lady Gaga or Katy Perry, an arena performance can net something like $300,000, and even after the costs of production, crew, and transportation are deducted, it’s a profitable endeavor. But an opening act gets paid a flat fee of $15,000 or so, and when you subtract expenses and divide the rest between members of the band, you’re essentially paying for the privilege of performing. As Jamie Cheek, an entertainment business manager, is quoted as saying: “If you get signed to a major label, you’re going to make less money for the next two or three years than you’ve ever made in your life.” And it remains a gamble for everyone except the label itself. Over the years, I’ve seen countless opening acts, but I’d have trouble remembering even one, and it isn’t because they lacked talent. We’re simply less likely to take anything seriously if we haven’t explicitly paid for it.

That’s the opening act dilemma. And it’s worth remembering this if you’re a writer being bombarded with proposals to write for free, even for established publications, for the sake of the great god exposure. For freelancers, it’s created a race to the bottom, as they’re expected to work for less and less just to see their names in print. And we shouldn’t confuse this with the small presses that pay contributors in copies, if at all. These are labors of love, meant for a niche audience of devoted readers, and they’re qualitatively different from commercial sites with an eye on their margins. The best publications will always pay their writers as fairly as they can afford. Circulation for the handful of surviving print science-fiction magazines has been falling for years, for instance, but Analog and Asimov’s recently raised their rate per word by a penny or so. It may not sound like much, but it amounts to a hundred dollars or so that they didn’t need to give their authors, most of whom would gladly write for even less. Financially, it’s hard to justify, but as a sign of respect for their contributors, it speaks volumes, even as larger publications relentlessly cut their budgets for freelancers. As painful as it may be, you have to push back, unless you’re content to remain an opening act for the rest of your life. You’re going to lose money anyway, so it may as well be on your own terms. And if someone wants you to work for nothing now, you can’t expect them to pay you later.

Written by nevalalee

March 27, 2015 at 9:22 am

Altered states of conscientiousness


Bob Dylan in Don't Look Back

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What pop culture is best consumed in an altered state?”

When Bob Dylan first met the Beatles, the story goes, he was astonished to learn that they’d never used drugs. (Apparently, the confusion was all caused by a mondegreen: Dylan misheard a crucial lyric from “I Want to Hold Your Hand” as “I get high” instead of “I can’t hide.”) This was back in the early days, of course, and later, the Beatles would become part of the psychedelic culture in ways that can’t be separated from their greatest achievements. Still, it’s revealing that their initial triumphs emerged from a period of clean living. Drugs can encourage certain qualities, but musicianship and disciplined invention aren’t among them, and I find it hard to believe that Lennon and McCartney would have gained much, if anything, from controlled substances without that essential foundation—certainly not to the point where Dylan would have wanted to meet them in the first place. For artists, drugs are a kind of force multiplier, an ingredient that can enhance elements that are already there, but can’t generate something from nothing. As Norman Mailer, who was notably ambivalent about his own drug use, liked to say, drugs are a way of borrowing on the future, but those seeds can wither and die if they don’t fall on soil that has been prepared beforehand.

Over the years, I’ve read a lot written by or about figures in the drug culture, from Carlos Castaneda to Daniel Pinchbeck to The Electric Kool-Aid Acid Test, and I’m struck by a common pattern: if drugs lead to a state of perceived insight, it usually takes the form of little more than a conviction that everyone should try drugs. Drug use has been a transformative experience for exceptional individuals as different as Aldous Huxley, Robert Anton Wilson, and Steve Jobs, but it tends to be private, subjective, and uncommunicable. As such, it doesn’t have much to do with art, which is founded on its functional objectivity—that is, on its capacity to be conveyed more or less intact from one mind to the next. And it creates a lack of critical discrimination that can be dangerous to artists when extended over time. If marijuana, as South Park memorably pointed out, makes you fine with being bored, it’s the last thing artists need, since art boils down to nothing but a series of deliberate strategies for dealing with, confronting, or eradicating boredom. When you’re high, you’re easily amused, which makes you less likely to produce anything that can sustain the interest of someone who isn’t in the same state of chemical receptivity.

2001: A Space Odyssey

And the same principle applies to the artistic experience from the opposite direction. When someone says that 2001 is better on pot, that isn’t saying much, since every movie seems better on pot. Again, however, this has a way of smoothing out and trivializing a movie’s real merits. Kubrick’s film comes as close as any ever made to encouraging a transcendent state without the need of mind-altering substances, and his own thoughts on the subject are worth remembering:

[Drug use] tranquilizes the creative personality, which thrives on conflict and on the clash and ferment of ideas…One of the things that’s turned me against LSD is that all the people I know who use it have a peculiar inability to distinguish between things that are really interesting and stimulating and things that appear so in the state of universal bliss the drug induces on a good trip. They seem to completely lose their critical faculties and disengage themselves from some of the most stimulating areas of life.

Which isn’t to say that a temporary relaxation of the faculties doesn’t have its place. I’ll often have a beer while watching a movie or television show, and my philosophy here is similar to that of chef David Chang, who explains his preference for “the lightest, crappiest beer”:

Let me make one ironclad argument for shitty beer: It pairs really well with food. All food. Think about how well champagne pairs with almost anything. Champagne is not a flavor bomb! It’s bubbly and has a little hint of acid and is cool and crisp and refreshing. Cheap beer is, no joke, the champagne of beers.

And a Miller Lite—which I’m not embarrassed to proclaim as my beer of choice—pairs well with almost any kind of entertainment, since it both gives and demands so little. At minimum, it makes me the tiniest bit more receptive to whatever I’m being shown, not enough to forgive its flaws, but enough to encourage me to meet it halfway. For much the same reason, I no longer drink while working: even that little extra nudge can be fatal when it comes to evaluating whether something I’ve written is any good. Because Kubrick, as usual, deserves the last word: “Perhaps when everything is beautiful, nothing is beautiful.”

Written by nevalalee

March 20, 2015 at 9:16 am
