Today everything comes to an end virtually as soon as it begins, and vanishes almost as soon as it appears. But everything repeats itself and starts over again…As interest in it gets progressively weaker, so news becomes more rapid and concentrated, until finally, at the end of a shorter and shorter period, it wears itself out…News shrinks to the size of the socially instantaneous, and the immediate instant tends to disappear in an instant which has already passed.
These words might have appeared the other day, or been posted online a few seconds ago, but in fact, they’re over half a century old, written by the French philosopher Henri Lefebvre. Nowadays, it seems that we’re all concerned by the problem of distraction—by a world that carves our attention spans into increasingly tiny increments—but it’s an issue that thoughtful people have worried about for a long time. The technology behind our present situation may be new, but the dilemma is as old as mass media itself: we’re bombarded by information on all sides and have the option of switching easily between countless streams of content, to the point where it feels as if our own thoughts are being driven out of our heads. Creative or meaningful thinking needs quiet time, circles of solitude and disconnection, and there are moments, as I pointed out yesterday, when those spaces seem to be dwindling down to the size of a shower stall.
The quote from Lefebvre comes from a recent New Yorker piece by the writer Evgeny Morozov, who surveys some of the latest works on distraction—including three nonfiction books and a new novel by Dave Eggers—to see if they provide any insights into dealing with our ongoing deluge of stimulation. Morozov, who is in his late twenties, reveals that he owns a safe with a timer that he uses to lock away his Internet cable and cell phone, and he approvingly discusses the Dutch scholar Christoph Lindner’s proposal that cities create “slow spots,” areas where visitors have no choice but to disconnect. Reading the literature of disconnection, I’m struck by how often it falls back on religious language or imagery: proponents talk of technological sabbaths, of tech-free retreats, of digital fasting. It’s a natural human impulse to want to withdraw from the world, and these days, the world of the flesh seems to have taken a digital form, as if it were an act of spiritual virtue in itself to turn off one’s phone and confront, for once, who we really are when we aren’t online. And like any form of renunciation, this runs the risk of confusing external trappings with the real inward changes it longs to create.
Because the real challenge isn’t learning to live without distraction, but learning to live with it, just as it’s more difficult to live a life of simplicity and renunciation within the city than in the desert. R.H. Blyth, in his great Zen in English Literature and Oriental Classics, quotes the Saikontan, the Japanese translation of the aphorisms of the Chinese philosopher Hong Zicheng: “The mark of nobility is to have nothing to do with power, reputation, wealth, and rank; but the noblest thing of all is to have these and yet be unaffected by them.” That’s how I tend to feel about distraction. People have been wasting time forever; a hundred years ago, we’d still have found ways of procrastinating, of avoiding extended thought, or of putting off the tasks that really matter. The means by which we’re able to avoid these things may have changed, but the underlying impulse remains constant, and it has more to do with human nature than with the particular form it takes. If anything, this means that learning to live a productive life in the face of constant distraction doesn’t necessarily require unplugging altogether—although for some people it wouldn’t hurt—but learning to integrate the forces competing for our attention more thoughtfully into our everyday lives. Renunciation, in itself, isn’t a bad thing, but it’s all too easy to return from that digital sabbath, or from any pilgrimage, to discover that we’re still the same as before.
And the first step is to acknowledge how wonderful connectivity, and even distraction, can be. My online life has informed my offline existence in countless ways: I’ve been exposed to ideas, writers, music, and media that I never would have discovered otherwise, and my creative life has been enriched accordingly. (As I’ve mentioned elsewhere, the germ of the idea for my novel Eternal Empire was the result of a chance encounter with a book excerpt on a blog, and I never could have researched my novels and short stories as efficiently as I have without access to online materials.) At times, like everyone, I worry about my exposure to so much constantly refreshed content, which is when I pick up a book or a ukulele or a baby instead. But just as what we think about in the shower tells us a lot about what really matters to us, the things that distract us provide a glimpse into who we actually are. If we don’t like what we see there, there are ways of addressing it, but turning off the computer only addresses the behavior, not the actual cause. It isn’t easy to become a person who can absorb all these distractions without being consumed by them, and I’m far from being there myself. But if the first step is to occasionally unplug, the next is to plug back in, check your browser history, and ask how you want it to look tomorrow.
Treat nature in terms of the cylinder, the sphere, and the cone…Everything I am telling you about—the sphere, the cone, cylinder, concave shadow—on mornings when I’m tired these notions of mine get me going, they stimulate me. I soon forget them once I start using my eyes.
I realized recently that what one thinks about in the shower in the morning is more important than I’d thought. I knew it was a good time to have ideas. Now I’d go further: now I’d say it’s hard to do a really good job on anything you don’t think about in the shower.
I know what he means. For as long as I can remember, my morning shower has been my best thinking time, the protected space in which I can most comfortably work through whatever problems I’m trying to solve. And while it’s easy to let your mind wander, which, as Graham points out, is a good way of discovering what really matters to you at the moment, I’ve decided that this time is too precious to be left entirely to chance. When I’m writing a novel, I try to look over my notes for the day just before I turn on the water, and I usually find that I’ve come up with a number of new ideas before it shuts off. If I’m stuck for a topic for a blog post, I’ll take whatever sliver of inspiration I can—often in the form of one of Brian Eno’s Oblique Strategies—and mull it over for five minutes as the shower runs. More often than not, I’ll emerge with something useful. It works so consistently, in fact, that I’ve come to see it as an essential part of my writing routine, an extension of my office or brain. And I’m far from alone in this. Woody Allen, for instance, takes his showers very seriously:
I’ve found over the years that any momentary change stimulates a fresh burst of mental energy…The shower is particularly good in cold weather. This sounds so silly, but I’ll be working dressed as I am and I’ll want to get into the shower for a creative stint. So I’ll take off some of my clothes and make myself an English muffin or something and try to give myself a little chill so I want to get in the shower. I’ll stand there with steaming hot water coming down for thirty minutes, forty-five minutes, just thinking out ideas and working on plot. Then I get out and dry myself and dress and then flop down on the bed and think there.
Allen here is as insightful as always—if you haven’t checked out Eric Lax’s Conversations With Woody Allen, from which this quote is taken, you really should—but he’s particularly shrewd in identifying the shower as a moment of change. In the shower, we’re taken out of our usual environment; we become semiaquatic creatures, in a humid little cube, and it’s at such points of transition that our minds are likely to move in promising directions.
There are other ways of encouraging this kind of mental and physical shift, most of them linked to relaxing, unconscious activities: taking a walk, doing routine chores, shaving. But there’s also something about the shower itself that seems especially conducive to mental activity. Alone, unclothed, we’re in a particularly vulnerable state, which is what makes the shower’s most famous cinematic appearance so effective. At the same time, we’re in a state of relaxation, but also standing, and although I know that a lot of writers have done good thinking in the bathtub, I don’t think it’s quite as conducive to the kind of focused mental trip that the shower provides. You can read in the bathtub, after all, as long as you’re careful with the pages, while the shower is an enforced citadel of quiet. Hanging a radio or, worse, an iPad on the tile robs us of one of our last remaining fortresses of solitude. It’s best just to stand there in the cone of white noise that the cascade of water creates, as removed from the world as we can be while still remaining awake, and it’s the best time I know for uninterrupted, right-brained, intuitive thought.
And keeping an eye on your thoughts in the shower isn’t just a way of working through problems, but of clarifying which problems really matter. To close on Paul Graham once again:
I suspect a lot of people aren’t sure what’s the top idea in their mind at any given time. I’m often mistaken about it. I tend to think it’s the idea I’d want to be the top one, rather than the one that is. But it’s easy to figure this out: just take a shower. What topic do your thoughts keep returning to? If it’s not what you want to be thinking about, you may want to change something.
In the shower, we come as close as we can to who we really are when all the masks are gone, and we can learn a lot about ourselves by seeing where our minds wander. My own shower has a little window that looks out on my backyard, and I’ll often catch myself looking out at the square of lawn behind my house, thinking over my life, what I’ve accomplished, and what still remains to be done. It’s something like the state we enter as we’re drifting off to sleep, but with our eyes wide open. When we emerge, we’re refreshed and at peace, with a new perspective on the tasks ahead. If this were a new invention, it would seem like magic. And it is.
A few strong instincts and a few plain rules.
As I’ve mentioned elsewhere, I’m at a point in my life—it’s called “fatherhood”—in which I can see maybe three or four films in theaters every year. My wife and I saw The Hobbit the week before our daughter was born, and since then, our moviegoing has been restricted to a handful of big event movies: Star Trek Into Darkness, Man of Steel, Gravity. In general, my criteria for whether a movie is worth catching on the big screen are fairly simple. It needs to be something that would be considerably reduced on television, which applies particularly to a film like Gravity: I loved it, and I plan to watch it again and again, but its impact won’t be nearly the same at home. Reviews count, as well as my own intangible excitement over a franchise, and beyond that, I tend to go with directors whose work has impressed me in the past, which is why I know that the one movie I’ll definitely be seeing next year is Chris Nolan’s Interstellar. In other words, after a lifetime of seeking out strange and challenging movies in theaters, I’ve turned into something like a studio’s idea of the mainstream moviegoer, who tends to prefer known quantities to interesting gambles, and is happy to catch the rest on video. You can complain all you like about Hollywood’s reliance on sequels, remakes, and established properties, but when I look at my own choices as a movie lover with a limited amount of time, I can’t say it’s entirely wrong.
But if there’s a bright side to all this, it’s that it allows me to treat myself as a kind of guinea pig: I can take a hard look at my newfound conservatism as a moviegoer with what remains of my old analytical eye. So much of how Hollywood operates is based on a few basic premises about what audiences want, and as I’ve become less adventurous as a viewer, I’ve gotten a better sense of how accurate those assumptions—presumably based on endless focus group testing and box office analysis—really are. And I’ve come to some surprising conclusions. I’ve found, for instance, that star power alone isn’t enough to get me out of the house: I’m an unabashed Tom Cruise fan, but I still waited for Oblivion to arrive at Redbox. I don’t need a happy ending to feel that I’ve gotten my money’s worth, as long as a darker conclusion is honestly earned. And the one that I can’t repeat often enough is this: I’m not worried about whether I’m going to “like” the characters. Studios are famously concerned about how likable their characters are, and they get nervous about any project in which the lead comes off as unsympathetic. Industry observers tend to think in the same way. As a writer for Time Out recently said of the trailer for The Wolf of Wall Street: “Why should we give a damn about these self-absorbed, money-grubbing Armani-clad cretins and spend our money and time learning about their lives?”
Well, to put it mildly, I can think of a few reasons why, and they’re strong enough that The Wolf of Wall Street is the next, and probably last, movie this year that I expect will get me into theaters. Spending three hours in the company of an Armani-clad cretin seen through the eyes of Martin Scorsese strikes me as a great use of my money and time, and while I can’t speak for the rest of the world, the movie we’ve glimpsed so far looks sensational. Part of this, of course, is because Scorsese has proven himself so capable of engaging us in the lives of unlikable characters. I don’t think there’s a sympathetic face to be seen throughout all of Casino, one of the most compulsively watchable movies of all time, and Scorsese has always seemed more comfortable in the heads of the flawed and unredeemable: it’s the difference between Goodfellas and Kundun, or Raging Bull and Hugo, and even a sleek machine like Cape Fear comes off as an experiment in how thoroughly he can grip us without a likable figure in sight. But there’s a larger principle at work here, too. Scorsese, by consensus, operates at a consistently higher level than any other filmmaker of his generation, and if he’s drawn to such flawed characters, this probably tells us less about him personally than about the fact that his craft is powerful enough to get away with it. Likability wouldn’t be a factor if all movies were this good.
In other words, any fears over the protagonist’s likability are really an admission that something else is going wrong, either in story or execution: the audience doesn’t care about the characters not because they aren’t sympathetic enough, but because it hasn’t been given a reason to be invested on a deeper level. Trying to imbue the hero in a meaningless story with more likable qualities is like changing the drapes while the house is on fire, but unfortunately, it’s often all the studio can understand. As Shane Black notes in the excellent interview collection Tales From the Script:
Movie stars are gonna give you your best ideas, because they’re the opposite of development people. Development people are always saying, “How can the character be more likable?” Meanwhile, the actor’s saying, “I don’t want to be likable.” You know, they give you crazy things like, “I wanna eat spaghetti with my hands.” Crazy’s great. Anything but this sort of likable guy that everyone at the studio insists they should play.
“Make him more likable,” like “raising the stakes,” is a development executive’s dream note: it doesn’t require any knowledge of the craft of storytelling, and you won’t get fired for suggesting it. But let’s not mistake it for anything more. I don’t want my characters to be likable; I want them to be interesting. And if the characters, or the story around them, are interesting enough, it might even get me out of the house.
Industry in art is a necessity—not a virtue—and any evidence of the same, in the production, is a blemish, not a quality; a proof, not of achievement, but of absolutely insufficient work, for work alone will efface the footsteps of work.
A few weeks ago, I broke a longstanding promise: I picked up the ukulele again. Earlier this year, I wrote a long post on how I learned not to play the ukulele, or, more generally, why my attempts at developing interesting hobbies have tended to fall apart. Writing consumes so much of my life that there isn’t room for much more, aside from family, friends, and books, but recently, I found myself taking an unaccustomed break. I had two or three projects winding their way through various stages that were out of my hands, so I was doing little more than playing the waiting game. Under most circumstances, I’d have filled the gap with an interim writing project, like a short story, but the break happened to coincide with a period when my daughter, now crawling like a champ, was demanding more of my time: I’d open a book or start writing up some notes only to jump up seconds later to stop her from chewing an extension cord. What I really needed was something to occupy my time while leaving me free to drop it at a moment’s notice, and the ukulele, which had been lying in my office closet for years, seemed like a pretty good candidate. So I dusted it off and set out, armed with an instruction book and a bunch of online tutorials, to see how much I could teach myself in a little over two weeks.
It helped that my ambitions were modest. At the most, I wanted to learn how to noodle around with it well enough to amuse myself and, ideally, my daughter. Over the last year, I’ve found myself with a lot of odd corners of time, too short to do anything meaningful but too long to spend just refreshing my web browser, and learning an instrument felt like a good way to fill up those orphaned minutes. (I may also have been inspired by Lin Yutang’s description of the life of half and half: “[A man] who plays the piano, but only well enough for his most intimate friends to hear, and chiefly to please himself.”) And the nice thing about aiming only to noodle is that you’re pleased by even the most incremental signs of progress. C, G, F, and A minor, held together with some common chord progressions and a good strumming pattern, are enough to occupy a beginner for hours, and in the meantime, you’re developing muscle memory, a sense of rhythm, and those crucial calluses on your first three fingers. I’m nowhere near the point where I’d have any business playing for anyone but my closest friends, but two weeks into the process, I’ve picked up enough that I can see myself noodling away for a long time, acquiring new tricks as needed, and fumbling toward something like basic competence.
And because I’m the kind of person who turns everything into a metaphor for something else, I’ve been thinking a lot about what this means for learning any kind of art. I haven’t tried to teach myself a new creative skill in ages, and it reminds me of how much I take for granted: I’ve been a decent writer for as long as I can remember, and although I shudder a little when I look back at my past efforts—which include the earliest posts on this blog—it’s been a long time since I had to worry about the fundamentals. This isn’t to say that I’m not often dissatisfied with my work, but when I fall short, as I often do, it’s usually because of flawed execution at a higher level, or because the underlying premise itself is wanting. Within a broad range between those extremes, I move comfortably, as I have for a long time. Learning to play an instrument, even one as accessible as the ukulele, takes me back to a time when even the simplest building blocks refuse to come together, and it’s hard to do something as simple as switching from an A minor to an E minor chord. You know the sounds you want to make, but your fingers refuse to cooperate, and when you look at the chasm that separates you from the masters of the craft, it feels as if you’ll never get even halfway to what you want to become.
Which is where the noodling comes in. Noodling alone won’t make you an artist, and there inevitably comes a time when you need to focus on aspects of craft that aren’t as fun in themselves—the rules, the development of discipline, the practicing of chords and scales. But when I look back at my own writing life, I’m struck by how much time was spent on the literary equivalent of noodling: bits of stories, fragments of ideas, fanfic, conceits pursued for a page or two before being abandoned. If I had left it at that, I’d never have become the writer I wanted to be, but it was an essential part of learning to live with, and love, the instrument itself. So much of writing instruction, and I include this blog in that category, is rightly obsessed with process and craft, but the rules only have a chance to take hold once you’ve had a taste of what the result will be. It helps to scale your expectations accordingly, and the ability to noodle around with your materials, whether they’re words, chords, or pigments, is as good a place to start as any. Some of us never get beyond that, and that’s fine; noodling offers plenty of pleasures of its own. But it’s reassuring to know that once our fingers have started to remember things for themselves, and we’ve had a hint of the joys to come, there’s a world of craft still waiting for us, somewhere over the rainbow.