A few days ago, Jordan Crucchiola of Vulture wrote a think piece titled “The Best Place for Women in Action Movies is Next to Tom Cruise.” The article makes the argument, which strikes me as indisputable, that the women in films like the Mission: Impossible series have made such consistently strong impressions that it can’t all be an accident. I’ve written here before at possibly excessive length about Rebecca Ferguson in Rogue Nation, who was arguably the best part of one of my favorite recent action movies, and Emily Blunt in Edge of Tomorrow speaks for herself. And it’s only after multiple viewings of Ghost Protocol, which is a movie that I’m happy to watch again on any given night, that I’ve come to realize the extent to which Paula Patton is its true star and emotional center: Cruise is content to slip into the background, like a producer paying a visit to the set, while the real interest of the scene unfolds elsewhere. For an actor who has often been accused of playing the same role in every movie—although it’s more accurate to say that he emphasizes different aspects of his core persona, and with greater success and variety than most leading men—he’s notably willing to defer to the strong women with whom he shares the screen. As Crucchiola concludes: “You get the sense that, as he approaches sixty, Cruise is more than happy to share the responsibility of anchoring a blockbuster action movie. It’s almost as if he’s creating a kind of hero apprentice program.”
This is all true, as far as it goes, but it also hints at an even larger insight that the article glimpses but never quite articulates. You can start by widening the scope a bit and noting that the best place for a man in a movie is next to Cruise, too. Actors as different as Cuba Gooding Jr., Colin Farrell, and Ken Watanabe have gotten big assists from providing reliable support in a Cruise vehicle, and his filmography is littered with fascinating but abortive experiments, like Dougray Scott, that never quite got off the ground. As a movie star, Cruise has shown an unusual interest—and again, it’s so consistent that it can’t be accidental—in providing meaningful secondary parts for both men and women, some of which are really the lead in disguise. (Eyes Wide Shut is essentially a series of short films in which Cruise cedes the focus to another performer for ten minutes or so, and each one feels like the beginning of a career.) And when you pull back even further, you notice that he’s performed much the same function for directors. At the height of his power, Cruise made a notable effort to work with most of the world’s best filmmakers, but after Kubrick and Spielberg, there were no more worlds to conquer. Instead, he began to seek out directors who were on the rise or on the rebound: J.J. Abrams, Brad Bird, Christopher McQuarrie. Not every effort along those lines paid off, and it can be hard to discern what he saw in, say, Joseph Kosinski. But you could make a strong case that Cruise has launched more players on both sides of the camera than any other major star.
In other words, his track record with actresses is just a subset, although a very important one, of a more expansive program for developing talent. Elsewhere, I’ve spoken of Cruise as a great producer who happens to inhabit the body of a movie star, but this doesn’t go far enough: he’s more like a one-man studio. A decade ago, he and Paula Wagner did an undeniably bad job of running the creative end of United Artists, but it’s noteworthy that his shift toward working with emerging directors occurred at around the same time. It’s as if after failing to turn around a conventional studio, Cruise saw that he could put together a leaner, nimbler version on his own, and that it required no permanent infrastructure apart from his stardom and ability to raise money. It would be a studio like Pixar, which, instead of scattering its attention across multiple projects, devoted most of its resources to releasing a single big movie every year. When you look at his recent career through that lens, it clarifies one of its less explicable trends: Cruise’s apparent decision, well into his fifties, to redefine himself as an action hero, at a point when most actors are easing themselves into less physically challenging parts. If you remember how versatile a dramatic lead he used to be, it feels like a loss, but it makes sense when you imagine him as the head of a studio with only one asset. Cruise has chosen to focus on tentpole pictures, just like the rest of the industry, and what makes his approach unique is how relentlessly he relies on himself alone to drive that enormous machine.
Which only reinforces my conviction, which I’ve held for years, that this is the most interesting career in the movies. Even its compromises are instructive, when taken as part of the larger strategy. (The Jack Reacher franchise, for instance, which the world wasn’t exactly clamoring to see, is a conscious attempt to create a series of midrange movies that allow Cruise to hit a double at the box office, rather than going for a home run every time. They’re the breathing spaces between Mission: Impossible installments. Similarly, his upcoming involvement in the reboot of The Mummy feels like a test case in partnering with someone else’s franchise, in a kind of joint venture.) If Tom Cruise is a secret studio, he’s done a better job of it than most corporations. At a time when the industry is struggling to come to terms with the problem of diversity, Cruise has launched the careers of a lot of attractive, talented performers of diverse backgrounds without ever making a point of it, and he’s done it in plain sight. Outside the echo chamber of Hollywood, and with the significant exception of Disney, audiences aren’t interested in studios as brands. Development executives are nonentities whose anonymity allows them to associate themselves with success, distance themselves from failure, and conceal the fact that they don’t know what they’re doing. Cruise doesn’t have that luxury. He’s made smart, pragmatic decisions for thirty years—and in public. And he makes the rest of the industry seem smaller by comparison.
In [the] contrast between wholeness and sum lies the tragical tension in any biological, psychological, and sociological evolution. Progress is only possible by passing from a state of undifferentiated wholeness to differentiation of parts…Every evolution, by unfolding some potentiality, nips in the bud many other possibilities.
In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:
This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.
I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it’s a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I’ve drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris’s movies can be read as commentaries on themselves, whether it’s his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie’s treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It’s an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they’re aware of their own limitations.
And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.
Which brings us back to Westworld. In her critique, Nussbaum writes: “Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same.” She implies that this is a bug, but it’s really a feature. Westworld wouldn’t be nearly as interesting if it weren’t being produced with this cast, on this network, and on this scale. We’re supposed to be impressed by the time and money that have gone into the park—they’ve spared no expense, as John Hammond might say—but it isn’t all that different from the resources that go into a big-budget drama like this. In the most recent episode, “Dissonance Theory,” the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you’ve won. But it’s more like what Douglas R. Hofstadter describes in I Am a Strange Loop:
What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.
This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.
In scientific thought we adopt the simplest theory which will explain all the facts under consideration and enable us to predict new facts of the same kind. The catch in this criterion lies in the word “simplest.” It is really an aesthetic canon, such as we find implicit in our criticisms of poetry or painting. The layman finds such a law as dx/dt = K(d²x/dy²) much less simple than “it oozes,” of which it is the mathematical statement. The physicist reverses this judgment, and his statement is certainly the more fruitful of the two, so far as prediction is concerned.
To know only one thing well is to have a barbaric mind: civilization implies the graceful relation of all varieties of experience to a central humane system of thought. The present age is peculiarly barbaric: introduce, say, a Hebrew scholar to an ichthyologist or an authority on Danish place names and the pair of them would have no single topic in common but the weather or the war…But that so many scholars are barbarians does not matter so long as a few of them are ready to help with their specialized knowledge the few independent thinkers, that is to say the poets, who try to keep civilization alive. The scholar is a quarryman, not a builder, and all that is required of him is that he should quarry cleanly. He is the poet’s insurance against factual error. It is easy enough for the poet in this hopelessly muddled and inaccurate modern world to be misled into false etymology, anachronism, and mathematical absurdity by trying to be what he is not. His function is truth, whereas the scholar’s is fact. Fact is not to be gainsaid; one may put it in this way, that fact is a Tribune of the People with no legislative right, but only the right of veto. Fact is not truth, but a poet who willfully defies fact cannot achieve truth.
It can be said with complete confidence that any scientist of any age who wants to make important discoveries must study important problems. Dull or piffling problems yield dull or piffling answers. It is not enough that a problem be “interesting”—almost any problem is interesting if it is studied in sufficient depth.
As an example of research work not worth doing, Lord Zuckerman invented the cruelly apt but not ridiculously farfetched example of a young zoology graduate who has decided to try to find out why thirty-six percent of sea urchin eggs have a tiny little black spot on them. This is not an important problem; such a graduate student will be lucky if his work commands the attention or interest of anyone except perhaps the poor fellow next door who is trying to find out why sixty-four percent of sea urchin eggs do not have a little black spot on them. Such a student has committed a kind of scientific suicide, and his supervisors are very much to blame. The example is purely imaginary, of course, for Lord Zuckerman knows very well that no sea urchin eggs are spotted.
It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.
Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)
Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:
As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.
It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?
The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”