Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


The final problem


In 1966, Howard L. Applegate, an administrator for the science fiction manuscript collection at Syracuse University, wrote to the editor John W. Campbell to ask if he would be interested in donating his papers. Campbell replied that he no longer possessed most of the original files, and he concluded: “Sorry, but any scholarly would-be biographers are going to have a tough time finding any useful documentation on me! I just didn’t keep the records!” Fortunately for me, this statement wasn’t totally true—I’ve spent the last two years combing through thousands of pages of letters, magazines, and other documents to assemble a picture of Campbell’s life, and if anything, there’s more here than any one person can absorb. I haven’t read it all, but I feel confident that I’ve looked at more of it than anyone else alive, and I often relate to what Robin W. Winks writes in his introduction to the anthology The Historian as Detective:

Historians pose to themselves difficult, even impossibly difficult, questions. Since they are reasonably intelligent and inquiring and since they do not wish to spend their lives upon a single question or line of investigation, they normally impose a time limit upon a given project or book (or the time limit is imposed for them by a “publish or perish” environment). They will invariably encounter numerous unforeseen difficulties because of missing papers, closed collections, new questions, and tangential problems; and the search through the archive, the chase after the single hoped-to-be-vital manuscript, has an excitement of its own, for that dénouement, the discovery, an answer may—one always hopes—lie in the next folio, in the next collection, in the next archive.

My work is more modest in scale than that of most academic historians, but I can understand the importance of a deadline, the hope that the next page that I read will contain a crucial piece of information, and the need for impossible questions. When I first got my hands on the microfilm reels of Campbell’s letters, I felt as if I’d stumbled across a treasure trove, and I found a lot of fascinating material that I never would have discovered otherwise. As I worked my way through the images, one inch at a time, I kept an eye on how much I had left, and as it dwindled, I felt a sinking feeling at the thought that I might never find certain answers. In fact, I never did resolve a few important issues to my satisfaction—although perhaps that wasn’t the right way to approach this particular Nachlass. In his introduction, Winks draws a telling contrast between the American and the European schools of history:

With sufficient diligence American historians can expect to find the answer—or at least an answer—to most factual or non-value questions they may choose to put to themselves. As a result, American researchers tend to begin with the questions they wish to entertain first (Did failed farmers truly move West to begin life anew in the eighteen-forties? Did immigrants reinforce older patterns of life or create new ones?), confident that the data can be found. European historians, on the other hand, are likely to begin with the available source materials first, and then look to see what legitimate questions they might ask of those sources. (Here are the private papers of Joseph Chamberlain, or of Gladstone, or of Disraeli. What do they tell me of British politics? Of Queen Victoria? Of the Jameson Raid? Of the development of British tariff policy? Of Colonial affairs? Of Ireland?)

Winks’s point is that American scholars have the advantage when it comes to sources, since there are vast archives available for every state with materials dating back to their founding. In writing about the history of science fiction, which is its own country of the mind, I’ve found that the situation is closer to what he says about European historiography. I’m far from the first person to explore this material, and I’m astounded by the diligence, depth of experience, and mastery of the facts of the fans I’ve met along the way, who have saved me from countless mistakes. In some areas, I’ve also been fortunate enough to build on the efforts of previous scholars, like Sam Moskowitz, whose book The Immortal Storm was accurately described by the fan historian Harry Warner, Jr.: “If read directly after a history of World War II, it does not seem like an anticlimax.” (I’m similarly grateful for the work of the late William H. Patterson, who did for Heinlein what I’m hoping to do for Campbell, thereby relieving me of much of the necessity of going over the same ground twice.) But there were also times at which I had to start with the available resources and see what they had to offer me. A lot of it was tedious and unrewarding, as detective work undoubtedly is in the real world. As Winks writes:

Much of the historian’s work, then, like that of the insurance investigator, the fingerprint man, or the coroner, may to the outsider seem to consist of deadening routine. Many miles of intellectual shoe leather will be used, for many metaphorical laundry lists, uninformative diaries, blank checkbooks, old telephone directories, and other trivia will stand between the researcher and his answer. Yet the routine must be pursued or the clue may be missed; the apparently false trail must be followed in order to be certain that it is false; the mute witnesses must be asked the reasons for their silence, for the piece of evidence that is missing from where one might reasonably expect to find it is, after all, a form of evidence in itself.

And the real point of asking a question is less the possibility of an answer than the motivation that it provides for you to keep digging. Winks nicely evokes the world in which the historian lives:

Precisely because the historian must turn to all possible witnesses, he is the most bookish of men. For him, no printed statement is without its interest. For him, the destruction of old cookbooks, gazetteers, road maps, Sears Roebuck catalogues, children’s books, railway timetables, or drafts of printed manuscripts, is the loss of potential evidence. Does one wish to know how the mail-order business was operated or how a Nebraska farmer might have dressed in 1930? Look to those catalogues. Does one wish to know whether a man from Washington just might have been in New York on a day in 1861 when it can be proved that he was in the capital on the day before and the day after? The timetables will help tell us of the opportunity.

But it’s only with a specific question in mind that the historian—or biographer—will bother to seek out such arcana at all, and you’re often rewarded with something that has nothing to do with the reasons why you originally looked. (Sometimes you find it on the other side of the page.) Every setback that I’ve encountered in search of a specific piece of information has opened new doors, and a question is simply the story that we tell ourselves to justify the search. The image that I like to use isn’t a private eye, but the anonymous reporter Thompson in Citizen Kane, whose boss, the shadowy Mr. Rawlston, tells him to solve the mystery of Kane’s last words: “See ‘em all! Get in touch with everybody that ever worked for him, whoever loved him, whoever hated his guts. I don’t mean go through the city directory, of course.” But that’s what you wind up doing. And as I near the end of this book, I’m haunted by what Rawlston says just before we cut to the lightning flash that illuminates the face of Susan Alexander: “It’ll probably turn out to be a very simple thing.”

The greatest trick


In the essay collection Candor and Perversion, the late critic Roger Shattuck writes: “The world scoffs at old ideas. It distrusts new ideas. It loves tricks.” He never explains what he means by “trick,” but toward the end of the book, in a chapter on Marcel Duchamp, he quotes a few lines by the poet Charles Baudelaire, from the unpublished preface to Flowers of Evil:

Does one show to a now giddy, now indifferent public the working of one’s devices? Does one explain all the revision and improvised variations, right down to the way one’s sincerest impulses are mixed in with tricks and with the charlatanism indispensable to the work’s amalgamation?

Baudelaire is indulging here in an analogy from the theater—he speaks elsewhere of “the dresser’s and the decorator’s studio,” “the actor’s box,” and “the wrecks, makeup, pulleys, chains.” A trick, in this sense, is a device that the artist uses to convey an idea that also draws attention to itself, in the same way that we can simultaneously notice and accept certain conventions when we’re watching a play. In a theatrical performance, the action and its presentation are so intermingled that we can’t always say where one leaves off and the other begins, and we’re always aware, on some level, that we’re looking at actors on a stage behaving in a fashion that is necessarily stylized and artificial. In other art forms, we’re conscious of these tricks to a greater or lesser extent, and while artists are usually advised that such technical elements should be subordinated to the story, in practice, we often delight in them for their own sake.

For an illustration of the kind of trick that I mean, I can’t think of any better example than the climax of The Godfather, in which Michael Corleone attends the baptism of his godson—played by the infant Sofia Coppola—as his enemies are executed on his orders. This sequence seems as inevitable now as any scene in the history of cinema, but it came about almost by accident. The director Francis Ford Coppola had the idea to combine the christening with the killings after all of the constituent parts had already been shot, which left him with the problem of assembling footage that hadn’t been designed to fit together. As Michael Sragow recounts in The New Yorker:

[Editor Peter] Zinner, too, made a signal contribution. In a climactic sequence, Coppola had the stroke of genius (confirmed by Puzo) to intercut Michael’s serving as godfather at the christening of Connie’s baby with his minions’ savagely executing the Corleone family’s enemies. But, Zinner says, Coppola left him with thousands of feet of the baptism, shot from four or five angles as the priest delivered his litany, and relatively few shots of the assassins doing their dirty work. Zinner’s solution was to run the litany in its entirety on the soundtrack along with escalating organ music, allowing different angles of the service to dominate the first minutes, and then to build to an audiovisual crescendo with the wave of killings, the blaring organ, the priest asking Michael if he renounces Satan and all his works—and Michael’s response that he does renounce them. The effect sealed the movie’s inspired depiction of the Corleones’ simultaneous, duelling rituals—the sacraments of church and family, and the murders in the street.

Coppola has since described Zinner’s contribution as “the inspiration to add the organ music,” but as this account makes clear, the editor seems to have figured out the structure and rhythm of the entire sequence, building unforgettably on the director’s initial brainstorm.

The result speaks for itself. It’s hard to think of a more powerful instance in movies of the form of a scene, created by cuts and juxtaposition, merging with the power of its storytelling. As we watch it, consciously or otherwise, we respond both to its formal audacity and to the ideas and emotions that it expresses. It’s the ultimate trick, as Baudelaire defines it, and it also inspired one of my favorite passages of criticism, in David Thomson’s entry on Coppola in The Biographical Dictionary of Film:

When The Godfather measured its grand finale of murder against the liturgy of baptism, Coppola seemed mesmerized by the trick, and its nihilism. A Buñuel, by contrast, might have made that sequence ironic and hilarious. But Coppola is not long on those qualities, and he could not extricate himself from the engineering of scenes. The identification with Michael was complete and stricken.

Before reading these lines, I had never considered the possibility that the baptism scene could be “ironic and hilarious,” or indeed anything other than how it so overwhelmingly presents itself, although it might easily have played that way without the music. And I’ve never forgotten Thomson’s assertion that Coppola was mesmerized by his own trick, as if it had arisen from somewhere outside of himself. (It might be even more accurate to say that coming up with the notion that the sequences ought to be cut together is something altogether different from actually witnessing the result, after Zinner assembled all the pieces and added Bach’s Passacaglia and Fugue in C minor—which, notably, entwines three different themes.) Coppola was so taken by the effect that he reused it, years later, for a similar sequence in Bram Stoker’s Dracula, admitting cheerfully on the commentary track that he was stealing from himself.

It was a turning point both for Coppola and for the industry as a whole. Before The Godfather, Coppola had been a novelistic director of small, quirky stories, and afterward, like Michael coming into his true inheritance, he became the engineer of vast projects, following up on the clues that he had planted here for himself. (It’s typical of the contradictions of his career that he placed his own baby daughter at the heart of this sequence, which means that he could hardly keep from viewing the most technically nihilistic scene in all his work as something like a home movie.) And while this wasn’t the earliest movie to invite the audience to revel in its structural devices—half of Citizen Kane consists of moments like this—it may have been the first since The Birth of a Nation to do so while also becoming the most commercially successful film of all time. Along the way, it subtly changed us. In our movies, as in our politics, we’ve become used to thinking as much about how our stories are presented as about what they say in themselves. We can even come to prefer trickery, as Shattuck warns us, to true ideas. This doesn’t mean that we should renounce genuine artistic facility of the kind that we see here, as opposed to its imitation or its absence, any more than Michael can renounce Satan. But the consequences of this confusion can be profound. Coppola, the orchestrator of scenes, came to identify with the mafioso who executed his enemies with ruthless efficiency, and the beauty of Michael’s moment of damnation went a long way toward turning him into an attractive, even heroic figure, an impression that Coppola spent most of The Godfather Parts II and III trying in vain to correct. Pacino’s career was shaped by this moment as well. And we have to learn to distinguish between tricks and the truth, especially when they take pains to conceal themselves. As Baudelaire says somewhere else: “The greatest trick the devil ever pulled was convincing the world he didn’t exist.”

The space between us all


In an interview published in the July 12, 1970 issue of Rolling Stone, the rock star David Crosby said: “My time has gotta be devoted to my highest priority projects, which starts with tryin’ to save the human race and then works its way down from there.” The journalist Ben Fong-Torres prompted him gently: “But through your music, if you affect the people you come in contact with in public, that’s your way of saving the human race.” And I’ve never forgotten Crosby’s response:

But somehow operating on that premise for the last couple of years hasn’t done it, see? Somehow Sgt. Pepper’s did not stop the Vietnam War. Somehow it didn’t work. Somebody isn’t listening. I ain’t saying stop trying; I know we’re doing the right thing to live, full on. Get it on and do it good. But the inertia we’re up against, I think everybody’s kind of underestimated it. I would’ve thought Sgt. Pepper’s could’ve stopped the war just by putting too many good vibes in the air for anybody to have a war around.

He was right about one thing—the Beatles didn’t stop the war. And while it might seem as if there’s nothing new left to say about Sgt. Pepper’s Lonely Hearts Club Band, which celebrates its fiftieth anniversary today, it’s worth asking what it tells us about the inability of even our greatest works of art to inspire lasting change. It’s probably ridiculous to ask this of any album. But if a test case exists, it’s here.

It seems fair to say that if any piece of music could have changed the world, it would have been Sgt. Pepper. As the academic Langdon Winner famously wrote:

The closest Western Civilization has come to unity since the Congress of Vienna in 1815 was the week the Sgt. Pepper album was released…At the time I happened to be driving across the country on Interstate 80. In each city where I stopped for gas or food—Laramie, Ogallala, Moline, South Bend—the melodies wafted in from some far-off transistor radio or portable hi-fi. It was the most amazing thing I’ve ever heard. For a brief while, the irreparably fragmented consciousness of the West was unified, at least in the minds of the young.

The crucial qualifier, of course, is “at least in the minds of the young,” which we’ll revisit later. To the critic Michael Bérubé, it was nothing less than the one week in which there was “a common culture of widely shared values and knowledge in the United States at any point between 1956 and 1976,” which seems to undervalue the moon landing, but never mind. Yet even this transient unity is more apparent than real. By the end of the sixties, the album had sold about three million copies in America alone. It’s a huge number, but even if you multiply it by ten to include those who were profoundly affected by it on the radio or on a friend’s record player, you end up with a tiny fraction of the population. To put it another way, three times as many people voted for George Wallace for president as bought a copy of Sgt. Pepper in those years.
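The back-of-the-envelope arithmetic here can be made explicit. The figures below are commonly cited approximations that I’m supplying for illustration (roughly 203 million Americans in the 1970 census, and roughly 9.9 million popular votes for Wallace in 1968), not numbers from the original sources:

```python
# Rough check of the Sgt. Pepper thought experiment.
# All figures are approximate, commonly cited values (assumptions).
album_copies = 3_000_000          # U.S. sales by the end of the sixties
generous_multiplier = 10          # listeners per copy, per the essay's premise
us_population_1970 = 203_000_000  # approximate 1970 census population
wallace_votes_1968 = 9_900_000    # approximate 1968 popular vote for Wallace

reached = album_copies * generous_multiplier
share = reached / us_population_1970
ratio = wallace_votes_1968 / album_copies

print(f"Even generously, the album reached about {share:.0%} of the population.")
print(f"Wallace drew about {ratio:.1f}x as many votes as copies sold.")
```

Even under the most generous assumptions, the album directly touched something like fifteen percent of the country, which is the essay’s point in miniature.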

But that’s just how it is. Even our most inescapable works of art seem to fade into insignificance when you consider the sheer number of human lives involved, in which even an apparently ubiquitous phenomenon fails to reach a majority of adults. (Fewer than one in three Americans paid to see The Force Awakens in theaters, which is as close as we’ve come in recent memory to total cultural saturation.) The art that feels axiomatic to us barely touches the lives of others, and it may leave only the faintest of marks on those who listen to it closely. The Beatles undoubtedly changed lives, but they were more likely to catalyze impulses that were already there, providing a shape and direction for what might otherwise have remained unexpressed. As Roger Ebert wrote in his retrospective review of A Hard Day’s Night:

The film was so influential in its androgynous imagery that untold thousands of young men walked into the theater with short haircuts, and their hair started growing during the movie and didn’t get cut again until the 1970s.

We shouldn’t underestimate this. But if you were eighteen when A Hard Day’s Night came out, it also means that you were born the same year as Donald Trump, who decisively won voters who were old enough to buy Sgt. Pepper on its initial release. Even if you took its message to heart, there’s a difference between the kind of change that marshals you the way that you were going and the sort that realigns society as a whole. It just isn’t what art is built to do. As David Thomson writes in Rosebud, alluding to Trump’s favorite movie: “The world is very large and the greatest films so small.”

If Sgt. Pepper failed to get us out of Vietnam, it was partially because those who were most deeply moved by it were more likely to be drafted and shipped overseas than to affect the policies of their own country. As Winner says, it united our consciousness, “at least in the minds of the young,” but all the while, the old men, as George McGovern put it, were dreaming up wars for young men to die in. But it may not have mattered. Wars are the result of forces that care nothing for what art has to say, and their operations are often indistinguishable from random chance. Sgt. Pepper may well have been “a decisive moment in the history of Western civilization,” as Kenneth Tynan hyperbolically claimed, but as Harold Bloom reminds us in The Western Canon:

Reading the very best writers—let us say Homer, Dante, Shakespeare, Tolstoy—is not going to make us better citizens. Art is perfectly useless, according to the sublime Oscar Wilde, who was right about everything.

Great works of art exist despite, not because of, the impersonal machine of history. It’s only fitting that the anniversary of Sgt. Pepper happens to coincide with a day on which our civilization’s response to climate change will be decided in a public ceremony with overtones of reality television—a more authentic reflection of our culture, as well as a more profound moment of global unity, willing or otherwise. If the opinions of rock stars or novelists counted for anything, we’d be in a very different situation right now. In “Within You Without You,” George Harrison laments “the people who gain the world and lose their soul,” which neatly elides the accurate observation that they, not the artists, are the ones who do in fact tend to gain the world. (They’re also “the people who hide themselves behind a wall.”) All that art can provide is private consolation, and joy, and the reminder that there are times when we just have to laugh, even when the news is rather sad.

The A/B Test


In this week’s issue of The New York Times Magazine, there’s a profile of Mark Zuckerberg by Farhad Manjoo, who describes how the founder of Facebook is coming to terms with his role in the world in the aftermath of last year’s election. I find myself thinking about Zuckerberg a lot these days, arguably even more than I use Facebook itself. We just missed overlapping in college, and with one possible exception, which I’ll mention later, he’s the most influential figure to emerge from those ranks in the last two decades. Manjoo depicts him as an intensely private man obliged to walk a fine line in public, leading him to be absurdly cautious about what he says: “When I asked if he had chatted with Obama about the former president’s critique of Facebook, Zuckerberg paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Zuckerberg is trying to figure out what he believes—and how to act—under conditions of enormous scrutiny, but he also has more resources at his disposal than just about anyone else in history. Here’s the passage in the article that stuck with me the most:

The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition, or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth…This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?”

Reading this, I began to reflect on how rarely we actually test our intuitions. I’ve spoken a lot on this blog about the role of intuitive thinking in the arts and sciences, mostly because it doesn’t get the emphasis it deserves, but there’s also no guarantee that intuition will steer us in the right direction. The psychologist Daniel Kahneman has devoted his career to showing how we tend to overvalue our gut reactions, particularly if we’ve been fortunate enough to be right in the past, and the study of human irrationality has become a rich avenue of research in the social sciences, which are often undermined by poor hunches of their own. It may not even be a matter of right or wrong. An intuitive choice may be better or worse than the alternative, but for the most part, we’ll never know. One of the quirks of Silicon Valley culture is that it claims to base everything on raw data, but it’s often in the service of notions that are outlandish, untested, and easy to misrepresent. Facebook comes closer than any company in existence to the ideal of an endless A/B test, in which the user base is randomly divided into two or more groups to see which approaches are the most effective. It’s the best lab ever developed for testing our hunches about human behavior. (Most controversially, Facebook modified the news feeds of hundreds of thousands of users to adjust the number of positive or negative posts, in order to gauge the emotional impact, and it has conducted similar tests on voter turnout.) And it shouldn’t surprise us if many of our intuitions turn out to be mistaken. If anything, we should expect them to be right about half the time—and if we can nudge that percentage just a little bit upward, in theory, it should give us a significant competitive advantage.
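The mechanics of the randomized split described above can be sketched in a few lines. Everything here—the hash-based bucketing, the experiment name, the click metric, the simulated conversion rates—is an illustrative assumption of how such systems are commonly built, not a description of Facebook’s actual infrastructure:

```python
import hashlib
import random

def assign_bucket(user_id: str, experiment: str, arms=("A", "B")) -> str:
    """Deterministically assign a user to an arm by hashing id + experiment.
    The same user always lands in the same arm of the same experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

# Simulate an experiment in which variant B truly performs slightly better.
random.seed(7)
true_rate = {"A": 0.10, "B": 0.12}  # hypothetical click-through rates
clicks = {"A": 0, "B": 0}
users = {"A": 0, "B": 0}
for i in range(100_000):
    arm = assign_bucket(str(i), "new_feed_ranking")
    users[arm] += 1
    if random.random() < true_rate[arm]:
        clicks[arm] += 1

for arm in ("A", "B"):
    print(arm, users[arm], f"click rate: {clicks[arm] / users[arm]:.3f}")
```

The deterministic hash matters: it lets an experiment run for weeks without storing any per-user assignment table, and it keeps a user’s experience consistent across sessions—one reason A/B testing scales to a user base the size of Facebook’s.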

So what good is intuition, anyway? I like to start with William Goldman’s story about the Broadway producer George Abbott, who once passed a choreographer holding his head in his hands while the dancers stood around doing nothing. When Abbott asked what was wrong, the choreographer said that he couldn’t figure out what to do next. Abbott shot back: “Well, have them do something! That way we’ll have something to change.” Intuition, as I’ve argued before, is mostly about taking you from zero ideas to one idea, which you can then start to refine. John W. Campbell makes much the same argument in what might be his single best editorial, “The Value of Panic,” which begins with a maxim from the Harvard professor Wayne Batteau: “In total ignorance, try anything. Then you won’t be so ignorant.” Campbell argues that this provides an evolutionary rationale for panic, in which an animal acts “in a manner entirely different from the normal behavior patterns of the organism.” He continues:

Given: An organism with N characteristic behavior modes available. Given: An environmental situation which cannot be solved by any of the N available behavior modes, but which must be solved immediately if the organism is to survive. Logical conclusion: The organism will inevitably die. But…if we introduce Panic, allowing the organism to generate a purely random behavior mode not a member of the N modes characteristically available?

Campbell concludes: “When the probability of survival is zero on the basis of all known factors—it’s time to throw in an unknown.” In extreme situations, the result is panic; under less intense circumstances, it’s a blind hunch. You can even see them as points on a spectrum, the purpose of which is to provide us with a random action or idea that can then be revised into something better, assuming that we survive for long enough. But sometimes the animal just gets eaten.
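Campbell’s schema has the shape of a familiar idea in search and optimization—random restarts when every known move fails. A toy sketch, with all names and the survival condition invented purely for illustration:

```python
import random

def solve(situation, known_modes, invent_random_mode, max_panics=10_000):
    """Campbell's schema: try each of the N characteristic behavior modes;
    if none works, panic -- generate random new modes until one succeeds."""
    for mode in known_modes:            # the N characteristic modes
        if mode(situation):
            return mode
    for _ in range(max_panics):         # purely random behavior, per Campbell
        mode = invent_random_mode()
        if mode(situation):
            return mode
    return None                         # sometimes the animal just gets eaten

# Toy environment: survival means producing the number 17, which none of
# the organism's ten stock responses (guessing 0 through 9) can ever do.
situation = 17
known = [lambda s, k=k: k == s for k in range(10)]
random.seed(1)
result = solve(situation, known,
               lambda: (lambda s, g=random.randrange(100): g == s))
print("survived" if result else "eaten")
```

The known modes fail by construction, so only the random generator can save the organism—which is exactly Campbell’s point about throwing in an unknown when the probability of survival is otherwise zero.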

The idea of refinement, revision, or testing is inseparable from intuition, and Zuckerberg has been granted the most powerful tool imaginable for asking hard questions and getting quantifiable answers. What he does with it is another matter entirely. But it’s also worth looking at his only peer from college who could conceivably challenge him in terms of global influence. On paper, Mark Zuckerberg and Jared Kushner have remarkable similarities. Both are young Jewish men—although Kushner is more observant—who were born less than four years and sixty miles apart. Kushner, whose acceptance to Harvard was so manifestly the result of his family’s wealth that it became a case study in a book on the subject, was a member of the final clubs that Zuckerberg badly wanted to join, or so Aaron Sorkin would have us believe. Both ended up as unlikely media magnates of a very different kind: Kushner, like Charles Foster Kane, took over a New York newspaper from a man named Carter. Yet their approaches to their newfound positions couldn’t be more different. Kushner has been called “a shadow secretary of state” whose portfolio includes Mexico, China, the Middle East, and the reorganization of the federal government, but it feels like one long improvisation, on the apparent assumption that he can wing it and succeed where so many others have failed. As Bruce Bartlett writes in the New York Times, without a staff, Kushner “is just a dilettante meddling in matters he lacks the depth or the resources to grasp,” and we may not have a chance to recover if his intuitions are wrong. In other words, he resembles his father-in-law, as Frank Bruni notes:

I’m told by insiders that when Trump’s long-shot campaign led to victory, he and Kushner became convinced not only that they’d tapped into something that everybody was missing about America, but that they’d tapped into something that everybody was missing about the two of them.

Zuckerberg and Kushner’s lives ran roughly in parallel for a long time, but now they’re diverging at a point at which they almost seem to be offering us two alternate versions of the future, like an A/B test with only one possible outcome. Neither is wholly positive, but that doesn’t make the choice any less stark. And if you think this sounds farfetched, bookmark this post, and read it again in about six years.

From Xenu to Xanadu



I do know that I could form a political platform, for instance, which would encompass the support of the unemployed, the industrialist and the clerk and day laborer all at one and the same time. And enthusiastic support it would be.

L. Ron Hubbard, in a letter to his wife Polly, October 1938

Yesterday, my article “Xenu’s Paradox: The Fiction of L. Ron Hubbard and the Making of Scientology” was published on Longreads. I’d been working on this piece, off and on, for the better part of a year, almost from the moment I knew that I was going to be writing the book Astounding. As part of my research, I had to read just about everything Hubbard ever wrote in the genres of science fiction and fantasy, and I ended up working my way through well over a million words of his prose. The essay that emerged from this process was inspired by a simple question. Hubbard clearly didn’t much care for science fiction, and he wrote it primarily for the money. Yet when the time came to invent a founding myth for Scientology, he turned to the conventions of space opera, which had previously played a minimal role in his work. Both his critics and his followers have looked hard at his published stories to find hints of the ideas to come, and there are a few that seem to point toward later developments. (One that frequently gets mentioned is “One Was Stubborn,” in which a fake religious messiah convinces people to believe in the nonexistence of matter so that he can rule the universe. There’s circumstantial evidence, however, that the premise came mostly from John W. Campbell, and that Hubbard wrote it up on the train ride home from New York to Puget Sound.) Still, it’s a tiny fraction of the whole. And such stories by other writers as “The Double Minds” by Campbell, “Lost Legacy” by Robert A. Heinlein, and The World of Null-A by A.E. van Vogt make for more compelling precursors to dianetics than anything Hubbard ever wrote.

The solution to the mystery, as I discuss at length in the article, is that Hubbard tailored his teachings to the small circle of followers he had available after his blowup with Campbell, many of whom were science fiction fans who owed their first exposure to his ideas to magazines like Astounding. And this was only the most dramatic and decisive instance of a pattern that is visible throughout his life. Hubbard is often called a fabulist who compulsively embellished his own accomplishments and turned himself into something more than he really was. But it would be even more accurate to say that Hubbard transformed himself into whatever he thought the people around him wanted him to be. When he was hanging out with members of the Explorers Club, he became a barnstormer, world traveler, and intrepid explorer of the Caribbean and Alaska. Around his fellow authors, he presented himself as the most productive pulp writer of all time, inflating his already impressive word count to a ridiculous extent. During the war, he spun stories about his exploits in battle, claiming to have been repeatedly sunk and wounded, and even a former naval officer as intelligent and experienced as Heinlein evidently took him at his word. Hubbard simply became whatever seemed necessary at the time—as long as he was the most impressive man in the room. It wasn’t until he found himself surrounded by science fiction fans, whom he had mostly avoided until then, that he assumed the form that he would take for the rest of his career. He had never been interested in past lives, but many of his followers were, and the memories that they were “recovering” in their auditing sessions were often colored by the imagery of the stories they had read. And Hubbard responded by coming up with the grandest, most unbelievable space opera saga of them all.


This leaves us with a few important takeaways. The first is that Hubbard, in the early days, was basically harmless. He had invented a colorful background for himself, but he wasn’t alone: Lester del Rey, among others, seems to have engaged in the same kind of self-mythologizing. His first marriage wasn’t a happy one, and he was always something of a blowhard, determined to outshine everyone he met. Yet he also genuinely impressed John and Doña Campbell, Heinlein, Asimov, and many other perceptive men and women. It wasn’t until after the unexpected success of dianetics that he grew convinced of his own infallibility, casting off such inconvenient collaborators as Campbell and Joseph Winter as obstacles to his power. Even after he went off to Wichita with his remaining disciples, he might have become little more than a harmless crank. As he began to feel persecuted by the government and professional organizations, however, his mood curdled into something poisonous, and it happened at a time in which he had undisputed authority over the people around him. It wasn’t a huge kingdom, but because of its isolation—particularly when he was at sea—he was able to exercise a terrifying amount of control over his closest followers. Hubbard didn’t even enjoy it. He had wealth, fame, and the adulation of a handful of true believers, but he grew increasingly paranoid and miserable. At the time of his death, his wrath was restricted to his critics and to anyone within arm’s reach, but he created a culture of oppression that his successor cheerfully extended against current and former members in faraway places, until no one inside or outside the Church of Scientology was safe.

I wrote the first draft of this essay in May of last year, but it's hard to read it now without thinking of Donald Trump. Like Hubbard, Trump spent much of his life as an annoying but harmless windbag: a relentless self-promoter who constantly inflated his own achievements. As with Hubbard, everything that he did had to be the biggest and best, and until recently, he was too conscious of the value of his own brand to risk alienating too many people at once. After a lifetime of random grabs for attention, however, he latched onto a cause—the birther movement—that was more powerful than anything he had encountered before, and, like Hubbard, he began to focus on the small number of passionate followers he had attracted. His presidential campaign seems to have been conceived as yet another form of brand extension, perhaps culminating in the establishment of a Trump Television network. He shaped his message in response to the crowds who came to his rallies, and before long, he was caught in the same kind of cycle: a man who had once believed in nothing but himself gradually came to believe his own words. (Hubbard and Trump have both been described as con men, but the former spent countless hours auditing himself, and the latter no longer seems conscious of his own lies.) Both fell upward into positions of power that exceeded their wildest expectations, and it's frightening to think of what might come next, when we consider how Hubbard was transformed. During his lifetime, Hubbard had a small handful of active followers; the Church of Scientology has perhaps 30,000, although, like Trump, they're prone to exaggerate such numbers; Trump has millions. It's especially telling that both Hubbard and Trump loved Citizen Kane. I love it, too. But both men ended up in their own personal Xanadu. And as I've noted before, the only problem with that movie is that our affection for Orson Welles distracts us from the fact that Kane ultimately went crazy.

Land of the giants



Earlier this morning, I found myself thinking about two of my favorite movie scenes of the year. One is the sequence in Zootopia in which Judy Hopps chases a thief into the neighborhood of Little Rodentia, where she suddenly seems gigantic by comparison, tiptoeing gingerly past buildings the size of dollhouses. The other is the epic fight between the superheroes in Captain America: Civil War, in which Ant-Man reverses his usual shrinking power to transform himself into Giant Man. Both are standout moments in very good movies, and they have a lot in common. In each one, a normally meek and physically vulnerable character is abruptly blown up to gargantuan proportions, a situation that offers up more natural comedy than if it had involved a more conventional hero. (It’s a lot of fun to see Hank Pym treating the rest of the Avengers as his personal action figures, when it wouldn’t mean much of anything to see a giant Hulk.) Both are bright daytime scenes that allow us to scrutinize every detail of their huge central figure, which is logically satisfying in a way that a movie like the Godzilla remake isn’t: the latter is so weirdly loyal to the notion that you shouldn’t show the monster that it keeps cutting away nervously even when Godzilla ought to be the biggest thing in sight.

Most of all, of course, these scenes play with scale in ways that remind us of how satisfying that basic trick can be. A contrast in scale, properly handled, can be delightful, and it’s even more instructive to see it here, in a pair of mainstream studio movies, than it might be in more refined contexts. As the architect Christopher Alexander writes in The Nature of Order:

The first thing I noticed, when I began to study objects which have life, was that they all contain different scales. In my new language, I would now say that the centers these objects are made of tend to have a beautiful range of sizes, and that these sizes exist at a series of well-marked levels, with definite jumps between them. In short, there are big centers, middle-sized centers, small centers, and very small centers…[Scale] provides a way in which one center can be helped in its intensity by other smaller centers.

It might seem like a leap from the harmonious gradation of scale that Alexander is describing here to the goofy appearance of Giant Man, but both draw on the same underlying fact, which is that contrasts of size provide a standard of measurement. When Giant Man shows up, it feels like we're seeing him and the rest of the Avengers for the first time.


The movies have always taken pleasure in toying with our sense of proportion: there's a reason why a new version of King Kong seems to pop up every few decades. If film is naturally drawn to massive contrasts of scale, it's in part because it's so good at it. It's hard to imagine another medium that could pull it off so well, aside from our own imaginations, and movies like The Thief of Bagdad have reveled in bringing the giants and ogres of folklore—who are like a small child's impression of the adult world—to life. Every movie that we see in theaters becomes a confrontation with giants. When we watch Bogart and Bergman on the big screen in Casablanca, their faces are the size of billboards, and you could argue that we respond to giants in the movies because they force the other characters to experience what the rest of us feel in the auditorium. Hollywood has always seen itself as a land of giants, even if it's populated by moral pygmies, as Gloria Swanson reminds us in Sunset Boulevard: "I am big. It's the pictures that got small." And I've always been struck by the fact that the classic posters for King Kong and Citizen Kane are so similar, with the title character looming over smaller figures who stand terrified at the level of his ankles. Kane and Kong, whose names go together so well, are both monsters who came out of RKO Pictures, and perhaps it isn't surprising that Orson Welles, like Brando, grew so large toward the end of his life.

The idea that a giant might symbolize the gigantic qualities of the work of art in which it appears isn't a new one. In his great essay "Gravity's Encyclopedia," which I seem to think about all the time, the scholar Edward Mendelson lists what he calls "encyclopedic narratives"—The Divine Comedy, Gargantua and Pantagruel, Don Quixote, Faust, Moby-Dick, Ulysses, and Gravity's Rainbow—and observes that they all have one thing in common:

All encyclopedias metastasize their monstrousness by including giants or gigantism: the giants who guard the pit of hell in Dante, the eponymous heroes of Rabelais, the windmills that Don Quixote takes for giants, the mighty men whom Faust sends into battle, Moby-Dick himself, the stylistic gigantism of Joyce’s “Cyclops,” and, in Gravity’s Rainbow, the titans under the earth and the angel over Lübeck whose eyes go “towering for miles.”

Your average blockbuster is even more gargantuan, in its way, than a great novel, since it involves the collaboration of hundreds of artisans and the backing of an enormous corporation that can start to seem vaguely monstrous itself. Like most adult moviegoers, I hope that Hollywood gives us more intimate human stories, too. But we can also allow it a few giants.

Written by nevalalee

December 15, 2016 at 9:01 am

The strange loop of Westworld



In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:

This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.

I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it's a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I've drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris's movies can be read as commentaries on themselves, whether it's his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie's treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It's an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they're aware of their own limitations.


And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.

Which brings us back to Westworld. In her critique, Nussbaum writes: "Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same." She implies that this is a bug, but it's really a feature. Westworld wouldn't be nearly as interesting if it weren't being produced with this cast, on this network, and on this scale. We're supposed to be impressed by the time and money that have gone into the park—they've spared no expense, as John Hammond might say—but it isn't all that different from the resources that go into a big-budget drama like this. In the most recent episode, "Dissonance Theory," the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you've won. But it's more like what Douglas R. Hofstadter describes in I Am a Strange Loop:

What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.

This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.
