Last week, I mentioned what I’ve come to see as the most valuable piece of writing wisdom I know, which is David Mamet’s advice in Some Freaks “to go one achievable step at a time.” You don’t try to do everything at once, which is probably impossible anyway. Instead, there are days in which you do “careful” jobs that are the artistic equivalent of housekeeping—research, making outlines of physical actions, working out the logic of the plot—and others in which you perform “inventive” tasks that rely on intuition. This seems like common sense: it’s hard enough to be clever or imaginative as it is, without factoring in the switching costs associated with moving from one frame of mind to another. The writer Colin Wilson believed that the best ideas emerge when your left and right hemispheres are moving at the same rate, which tends to occur in moments of either reverie or high excitement. This is based on an outdated model of how the brain works, but the phenomenon it describes is familiar enough, and it’s just a small step from there to acknowledging that neither ecstatic nor dreamlike mental states are particularly suited for methodical work. When you’re laying the foundations for future creative activity, you usually end up somewhere in the middle, in a state of mind that is focused but not heightened, less responsive to connections than to units, and concerned more with thoroughness than with inspiration. It’s an important stage, but it’s also the last place where you’d expect real insights to appear.
Clearly, a writer should strive to work with, rather than against, this natural division of labor. It’s also easy to agree with Mamet’s advice that it’s best to tackle one kind of thinking per day. (Mental switching costs of any kind are usually minimized when you’ve had a good night’s sleep in the meantime.) The real question is how to figure out what sort of work you should be doing at any given moment, and, crucially, whether it’s possible to predict this in advance. Any writer can tell you that there’s an enormous difference between getting up in the morning without any idea of what you’re doing that day, which is the mark of an amateur, and having a concrete plan—which is why professional authors use such tools as outlines and calendars. Ideally, it would be nice to know when you woke up whether it was going to be a “careful” day or an “inventive” day, which would allow you to prepare yourself accordingly. Sometimes the organic life cycle of a writing project supplies the answer: depending on where you are in the process, you engage in varying proportions of careful or inventive thought. But every stage requires some degree of both. As Mamet implies, you’ll often alternate between them, although not as neatly as in his hypothetical example. And while it might seem pointless to allocate time for inspiration, which appears according to no fixed schedule, you can certainly create the conditions in which it’s more likely to appear. But how do you know when?
I’ve come up with a simple test to answer this question: I ask myself how much time I expect to spend sitting down. Usually, before a day begins, I have a pretty good sense of how much sitting or standing I’ll be doing, and that’s really all I need to make informed decisions about how to use my time. There are some kinds of creative work that demand sustained concentration at a desk or in a seated position. This includes most of the “careful” tasks that Mamet describes, but also certain forms of intuitive, nonlinear thinking, like making a mind map. By contrast, there are other sorts of work that not only don’t require you to be at your desk, but are actively stifled by it: daydreaming, brooding over problems, trying to sketch out large blocks of the action. You often do a better job of it when you’re out taking a walk, or on the bus, in the bath, or in bed. When scheduling creative work, then, you should start by figuring out what your body is likely to be doing that day, and then use this to plan what to do with your mind. Your brain has no choice but to tag along with your body when it’s running errands or standing in line at the bank, but if you structure your time appropriately, those moments won’t go to waste. And it’s often such external factors, rather than the internal logic of where you should be in the process, that determine what you should be doing.
At first glance, this doesn’t seem that much different from the stock advice that you should utilize whatever time you have available, whether you’re washing the dishes or taking a shower. But I think it’s a bit more nuanced than this, and that it’s more about matching the work to be done to the kind of time you have. If you try to think systematically and carefully while taking a walk in the park, you’ll feel frustrated when your mind wanders to other subjects. Conversely, if you try to daydream at your desk, not only are you likely to feel boxed in by your surroundings, but you’re also wasting valuable time that would be better spent on work that only requires the Napoleonic virtues of thoroughness and patience. Inspiration can’t be forced, and you don’t know in advance if you’re better off being careful or inventive on any given day—but the amount of time that you’ll be seated provides an important clue. (You can also reverse the process, and arrange to be seated as little as possible on days when you hope to get some inventive thinking done. For most of us, unfortunately, this isn’t entirely under our control, which makes it all the more sensible to take advantage of such moments when they present themselves.) And it doesn’t need to be planned beforehand. If you’re at work on a problem and you’re not sure what kind of thinking you should be doing, you can look at yourself and ask: Am I sitting down right now? And that’s all the information you need.
“If you want to understand function, study structure,” I was supposed to have said in my molecular biology days…I think one should approach these problems at all levels…In nature hybrid species are usually sterile, but in science the reverse is often true. Hybrid subjects are often astonishingly fertile, whereas if a scientific discipline remains too pure it usually wilts.
A few days ago, Jordan Crucchiola of Vulture wrote a think piece titled “The Best Place for Women in Action Movies Is Next to Tom Cruise.” The article makes the argument, which strikes me as indisputable, that the women in films like the Mission: Impossible series have made such consistently strong impressions that it can’t all be an accident. I’ve written here before at possibly excessive length about Rebecca Ferguson in Rogue Nation, who was arguably the best part of one of my favorite recent action movies, and Emily Blunt in Edge of Tomorrow speaks for herself. And it’s only after multiple viewings of Ghost Protocol, which is a movie that I’m happy to watch again on any given night, that I’ve come to realize the extent to which Paula Patton is its true star and emotional center: Cruise is content to slip into the background, like a producer paying a visit to the set, while the real interest of the scene unfolds elsewhere. For an actor who has often been accused of playing the same role in every movie—although it’s more accurate to say that he emphasizes different aspects of his core persona, and with greater success and variety than most leading men—he’s notably willing to defer to the strong women with whom he shares the screen. As Crucchiola concludes: “You get the sense that, as he approaches sixty, Cruise is more than happy to share the responsibility of anchoring a blockbuster action movie. It’s almost as if he’s creating a kind of hero apprentice program.”
This is all true, as far as it goes, but it also hints at an even larger insight that the article glimpses but never quite articulates. You can start by widening the scope a bit and noting that the best place for a man in a movie is next to Cruise, too. Actors as different as Cuba Gooding Jr., Colin Farrell, and Ken Watanabe have gotten big assists from providing reliable support in a Cruise vehicle, and his filmography is littered with fascinating but abortive experiments, like Dougray Scott, that never quite got off the ground. As a movie star, Cruise has shown an unusual interest—and again, it’s so consistent that it can’t be accidental—in providing meaningful secondary parts for both men and women, some of which are really the lead in disguise. (Eyes Wide Shut is essentially a series of short films in which Cruise cedes the focus to another performer for ten minutes or so, and each one feels like the beginning of a career.) And when you pull back even further, you notice that he’s performed much the same function for directors. At the height of his power, Cruise made a notable effort to work with most of the world’s best filmmakers, but after Kubrick and Spielberg, there were no more worlds to conquer. Instead, he began to seek out directors who were on the rise or on the rebound: J.J. Abrams, Brad Bird, Christopher McQuarrie. Not every effort along those lines paid off, and it can be hard to discern what he saw in, say, Joseph Kosinski. But you could make a strong case that Cruise has launched more players on both sides of the camera than any other major star.
In other words, his track record with actresses is just a subset, although a very important one, of a more expansive program for developing talent. Elsewhere, I’ve spoken of Cruise as a great producer who happens to inhabit the body of a movie star, but this doesn’t go far enough: he’s more like a one-man studio. A decade ago, he and Paula Wagner did an undeniably bad job of running the creative end of United Artists, but it’s noteworthy that his shift toward working with emerging directors occurred at around the same time. It’s as if, after failing to turn around a conventional studio, Cruise saw that he could put together a leaner, nimbler version on his own, and that it required no permanent infrastructure apart from his stardom and ability to raise money. It would be a studio like Pixar, which, instead of scattering its attention across multiple projects, devoted most of its resources to releasing a single big movie every year. When you look at his recent career through that lens, it clarifies one of its less explicable trends: Cruise’s apparent decision, well into his fifties, to redefine himself as an action hero, at a point when most actors are easing themselves into less physically challenging parts. If you remember how versatile a dramatic lead he used to be, it feels like a loss, but it makes sense when you imagine him as the head of a studio with only one asset. Cruise has chosen to focus on tentpole pictures, just like the rest of the industry, and what makes it unique is how relentlessly he relies on himself alone to drive that enormous machine.
Which only reinforces my conviction, which I’ve held for years, that this is the most interesting career in the movies. Even its compromises are instructive, when taken as part of the larger strategy. (The Jack Reacher franchise, for instance, which the world wasn’t exactly clamoring to see, is a conscious attempt to create a series of midrange movies that allow Cruise to hit a double at the box office, rather than going for a home run every time. They’re the breathing spaces between Mission: Impossible installments. Similarly, his upcoming involvement in the reboot of The Mummy feels like a test case in partnering with someone else’s franchise, in a kind of joint venture.) If Tom Cruise is a secret studio, he’s done a better job of it than most corporations. At a time when the industry is struggling to come to terms with the problem of diversity, Cruise has launched the careers of a lot of attractive, talented performers of diverse backgrounds without ever making a point of it, and he’s done it in plain sight. Outside the echo chamber of Hollywood, and with the significant exception of Disney, audiences aren’t interested in studios as brands. Development executives are nonentities whose anonymity allows them to associate themselves with success, distance themselves from failure, and conceal the fact that they don’t know what they’re doing. Cruise doesn’t have that luxury. He’s made smart, pragmatic decisions for thirty years—and in public. And he makes the rest of the industry seem smaller by comparison.
In [the] contrast between wholeness and sum lies the tragical tension in any biological, psychological, and sociological evolution. Progress is only possible by passing from a state of undifferentiated wholeness to differentiation of parts…Every evolution, by unfolding some potentiality, nips in the bud many other possibilities.
In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:
This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.
I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it’s a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I’ve drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris’s movies can be read as commentaries on themselves, whether it’s his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie’s treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It’s an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they’re aware of their own limitations.
And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.
Which brings us back to Westworld. In her critique, Nussbaum writes: “Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same.” She implies that this is a bug, but it’s really a feature. Westworld wouldn’t be nearly as interesting if it weren’t being produced with this cast, on this network, and on this scale. We’re supposed to be impressed by the time and money that have gone into the park—they’ve spared no expense, as John Hammond might say—but it isn’t all that different from the resources that go into a big-budget drama like this. In the most recent episode, “Dissonance Theory,” the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you’ve won. But it’s more like what Douglas R. Hofstadter describes in I Am a Strange Loop:
What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.
This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.
In scientific thought we adopt the simplest theory which will explain all the facts under consideration and enable us to predict new facts of the same kind. The catch in this criterion lies in the word “simplest.” It is really an aesthetic canon, such as we find implicit in our criticisms of poetry or painting. The layman finds such a law as dx/dt = K(d²x/dy²) much less simple than “it oozes,” of which it is the mathematical statement. The physicist reverses this judgment, and his statement is certainly the more fruitful of the two, so far as prediction is concerned.
To know only one thing well is to have a barbaric mind: civilization implies the graceful relation of all varieties of experience to a central humane system of thought. The present age is peculiarly barbaric: introduce, say, a Hebrew scholar to an ichthyologist or an authority on Danish place names and the pair of them would have no single topic in common but the weather or the war…But that so many scholars are barbarians does not matter so long as a few of them are ready to help with their specialized knowledge the few independent thinkers, that is to say the poets, who try to keep civilization alive. The scholar is a quarryman, not a builder, and all that is required of him is that he should quarry cleanly. He is the poet’s insurance against factual error. It is easy enough for the poet in this hopelessly muddled and inaccurate modern world to be misled into false etymology, anachronism, and mathematical absurdity by trying to be what he is not. His function is truth, whereas the scholar’s is fact. Fact is not to be gainsaid; one may put it in this way, that fact is a Tribune of the People with no legislative right, but only the right of veto. Fact is not truth, but a poet who willfully defies fact cannot achieve truth.