Note: I’m counting down my ten favorite works of nonfiction, in order of the publication dates of their first editions, and with an emphasis on books that deserve a wider readership. You can find the earlier installments here.
When it comes to giving advice on something as inherently unteachable as writing, books on the subject tend to fall into one of three categories. The first treats the writing manual as an extension of the self-help genre, offering what amounts to an extended pep talk that is long on encouragement but short on specifics. A second, more useful approach is to consolidate material on a variety of potential strategies, either through the voices of multiple writers—as George Plimpton did so wonderfully in The Writer’s Chapbook, which assembles the best of the legendary interviews given to The Paris Review—or through the perspective of a writer and teacher, like John Gardner, generous enough to consider the full range of what the art of fiction can be. And the third, exemplified by David Mamet’s On Directing Film, is to lay out a single, highly prescriptive recipe for constructing stories. This last approach might seem unduly severe. Yet after a lifetime of reading what other writers have to say on the subject, Mamet’s little book is still the best I’ve ever found, not just for film, but for fiction and narrative nonfiction as well. On one level, it can serve as a starting point for your own thoughts about how the writing process should look: Mamet provides a strict, almost mathematical set of tools for building a plot from first principles, and even if you disagree with his methods, they clarify your thinking in a way that a more generalized treatment might not. But even if you just take it at face value, it’s still the closest thing I know to a foolproof formula for generating rock-solid first drafts. (If Mamet himself has a flaw as a director, it’s that he often stops there.) In fact, it’s so useful, so lucid, and so reliable that I sometimes feel reluctant to recommend it, as if I were giving away an industrial secret to my competitors.
Mamet’s principles are easy to grasp, but endlessly challenging to follow. You start by figuring out what every scene is about, mostly by asking one question: “What does the protagonist want?” You then divide each scene up into a sequence of beats, consisting of an immediate objective and a logical action that the protagonist takes to achieve it, ideally in a form that can be told in visual terms, without the need for expository dialogue. And you repeat the process until the protagonist succeeds or fails at his or her ultimate objective, at which point the story is over. This may sound straightforward, but as soon as you start forcing yourself to think this way consistently, you discover how tough it can be. Mamet’s book consists of a few simple examples, teased out in a series of discussions at a class he taught at Columbia, and it’s studded with insights that once heard are never forgotten: “We don’t want our protagonist to do things that are interesting. We want him to do things that are logical.” “Here is a tool—choose your shots, beats, scenes, objectives, and always refer to them by the names you chose.” “Keep it simple, stupid, and don’t violate those rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.” “The audience doesn’t want to read a sign; they want to watch a motion picture.” “A good writer gets better only by learning to cut, to remove the ornamental, the descriptive, the narrative, and especially the deeply felt and meaningful.” “Now, why did all those Olympic skaters fall down? The only answer I know is that they hadn’t practiced enough.” And my own personal favorite: “The nail doesn’t have to look like a house; it is not a house. It is a nail. If the house is going to stand, the nail must do the work of a nail. To do the work of the nail, it has to look like a nail.”
David Thomson’s Biographical Dictionary of Film is one of the weirdest books in all of literature, and more than the work of any other critic, it has subtly changed the way I think about both life and the movies. His central theme—which is stated everywhere and nowhere—is the essential strangeness of turning shadows on a screen into men and women who can seem more real to us than the people in our own lives. His writing isn’t conventional criticism so much as a single huge work of fiction, with Thomson himself as both protagonist and nemesis. It isn’t a coincidence that one of his earliest books was a biography of Laurence Sterne, author of Tristram Shandy: his entire career can be read as one long Shandean exercise, in which Thomson, as a fictional character in his own work, is cheerfully willing to come off as something of a creep, as long as it illuminates our reasons for going to the movies. And his looniness is part of his charm. Edmund Wilson once playfully speculated that George Saintsbury, the great English critic, invented his own Toryism “in the same way that a dramatist or novelist arranges contrasting elements,” and there are times when I suspect that Thomson is doing much the same thing. (If his work is a secret novel, its real precursor is Pale Fire, in which Thomson plays the role of Kinbote, and every article seems to hint darkly at some monstrous underlying truth. A recent, bewildered review of his latest book on The A.V. Club is a good example of the reaction he gets from readers who aren’t in on the joke.)
Perversity and obsessiveness alone, though, would leave you with Armond White; Thomson succeeds because he’s also lucid, encyclopedically informed, and ultimately sane, although he does his best to hide it. The various editions of The Biographical Dictionary of Film haven’t been revised so much as they’ve accumulated: Thomson rarely goes back to rewrite earlier entries, but tacks on new thoughts to the end of each article, so that it grows by a process of accretion, like a coral reef. The result can be confusing, but when I go back to his earlier articles, I remember at once why this is still the essential book on film. I’ll look at Thomson on Coppola (“He is Sonny and Michael Corleone for sure, but there are traces of Fredo, too”); on Sydney Greenstreet (“Indeed, there were several men trapped in his grossness: the conventional thin man; a young man; an aesthete; a romantic”); or on Eleanor Powell’s dance with Astaire in Broadway Melody of 1940 (“Maybe the loveliest moment in films is the last second or so, as the dancers finish, and Powell’s alive frock has another half-turn, like a spirit embracing the person”). Or, perhaps most memorably of all, his thoughts on Citizen Kane, which, lest we forget, is about the futile search of a reporter named Thompson:
As if Welles knew that Kane would hang over his own future, regularly being used to denigrate his later works, the film is shot through with his vast, melancholy nostalgia for self-destructive talent…Kane is Welles, just as every apparent point of view in the film is warmed by Kane’s own memories, as if the entire film were his dream in the instant before death.
It’s a strange, seductive, indispensable book, and to paraphrase Thomson’s own musings on Welles, it’s the greatest career in film criticism, the most tragic, and the one with the most warnings for the rest of us.
Note: Since I’m taking a deserved break for the holidays, I’m reposting a couple of my favorite entries from early in this blog’s run. This post was originally published, in a slightly different form, on January 13, 2011. Visual spoilers follow. Cover your eyes!
As I’ve noted before, the last line of a novel is almost always of interest, but the last line of a movie generally isn’t. It isn’t hard to understand why: movies are primarily a visual medium, and there’s a sense in which even the most brilliant dialogue can often seem beside the point. And as much as the writer in me wants to believe otherwise, audiences don’t go to the movies to listen to words: they go to look at pictures.
Perhaps inevitably, then, there are significantly more great closing shots in film than there are great curtain lines. Indeed, the last shot of nearly every great film is memorable, so the list of finalists can easily expand into the dozens. Here, though, in no particular order, are twelve of my favorites.
Working to timings and synchronising your musical thoughts with the film can be stimulating rather than restrictive. Scoring is a limitation but like any limitation it can be made to work for you. Verdi, except for a handful of pieces, worked best when he was “turned on” by a libretto. The most difficult problem in music is form, and in a film you already have this problem solved for you. You are presented with a basic structure, a blueprint, and provided the film has been well put together, well edited, it often suggests its own rhythms and tempo. The quality of the music is strictly up to the composer. Many people seem to assume that because film music serves the visual it must be something of secondary value. Well, the function of any art is to serve a purpose in society. For many years, music and painting served religion. The thing to bear in mind is that film is the youngest of the arts, and that scoring is the youngest of the music arts. We have a great deal of development ahead of us.
To me, the most useful experience in working in “the film industry” has been watching and learning the editing process. You can write whatever you want and try to film whatever you want, but the whole thing really happens in that editing room. How do you edit comics? If you do them in a certain way, the standard way, it’s basically impossible. That’s what led me to this approach of breaking my stories into segments that all have a beginning and end on one, two, three pages. This makes it much easier to shift things around, to rearrange parts of the story sequence. It’s something that I’m really interested in trying to figure out, but there are pluses and minuses to every approach. For instance, I think if you did all your panels exactly the same size and left a certain amount of “breathing room” throughout the story, you could make fairly extensive after-the-fact changes, but you’d sacrifice a lot by doing that…
It’s a very mysterious process: you put together a cut of the film and at the first viewing it always seems just terrible, then you work on it for two weeks and you can’t imagine what else you could do with it; then six months later, you’re still working on it and making significant changes every day. It’s very odd, but you kind of know when it’s there.
—Daniel Clowes, quoted by Todd Hignite in In the Studio: Visits with Contemporary Cartoonists
Of all the movies I’ve ever seen, Curtis Hanson’s adaptation of James Ellroy’s L.A. Confidential has influenced my own work the most. This isn’t to say that it’s my favorite movie of all time—although it’s certainly in the top ten—or even that I find its themes especially resonant: I have huge admiration for Ellroy’s talents, but it’s safe to say that he and I are operating under a different set of obsessions. Rather, it’s the structure of the film that I find so compelling: three protagonists, with three main stories, that interweave and overlap in unexpected ways until they finally converge at the climax. It’s a narrative structure that has influenced just about every novel I’ve ever written, or tried to write—and the result, ironically, has made my own work less adaptable for the movies.
Movies, you see, aren’t especially good at multiple plots and protagonists. Most screenplays center, with good reason, on a single character, the star part, whose personal story is the story of the movie. Anything that departs from this form is seen as inherently problematic, which is why L.A. Confidential’s example is so singular, so seductive, and so misleading. As epic and layered as the movie is, Ellroy’s novel is infinitely larger: it covers a longer span of time, with more characters and subplots, to the point where entire storylines—like that of a particularly gruesome serial killer—were jettisoned completely for the movie version. Originally it was optioned as a possible miniseries, which would have made a lot of sense, but to the eternal credit of Hanson and screenwriter Brian Helgeland, they decided that there might also be a movie here.
To narrow things down, they started with my own favorite creative tool: they made a list. As the excellent bonus materials for the film make clear, Hanson and Helgeland began with a list of characters or plot points they wanted to keep: Bloody Christmas, the Nite Owl massacre, Bud White’s romance with Lynn Bracken, and so on. Then they ruthlessly pared away the rest of the novel, keeping the strands they liked, finding ways to link them together, and writing new material when necessary, to the point where some of the film’s most memorable moments—including the valediction of Jack Vincennes and the final showdown at the Victory Motel, which repurposes elements of the book’s prologue—are entirely invented. And the result, as Ellroy says, was a kind of “alternate life” for the characters he had envisioned.
So what are the lessons here? For aspiring screenwriters, surprisingly few: a film like L.A. Confidential appears only a couple of times each decade, and the fact that it was made at all, without visible compromise, is one of the unheralded miracles of modern movies. If nothing else, though, it’s a reminder that adaptation is less about literal faithfulness than fidelity of spirit. L.A. Confidential may keep less than half of Ellroy’s original material, but it feels as turbulent and teeming with possibility, and gives us the sense that some of the missing stories may still be happening here, only slightly offscreen. Any attempt to adapt similarly complex material without that kind of winnowing process, as in the unfortunate Watchmen, usually leaves audiences bewildered. The key is to find the material’s alternate life. And no other movie has done it so well.
It’s been just over twenty years now since The Silence of the Lambs was released in theaters, and the passage of time—and its undisputed status as a classic—sometimes threatens to blind us to the fact that it’s such a peculiar movie. At the time, it certainly seemed like a dubious prospect: it had a director known better for comedy than suspense, an exceptional cast but no real stars, and a story whose violence verged on outright kinkiness. If it emphatically overcame those doubts, it was with its mastery of tone and style, a pair of iconic performances, and, not incidentally, the best movie poster of the modern era. And the fact that it not only became a financial success but took home the Academy Award for Best Picture, as well as the four other major Oscars, remains genre filmmaking’s single most unqualified triumph.
It also had the benefit of some extraordinary source material. I’ve written at length about Thomas Harris elsewhere, but what’s worth emphasizing about his original novel is that it’s the product of several diverse temperaments. Harris began his career as a journalist, and there’s a reportorial streak running through all his best early books, with their fascination with the technical language, tools, and arcana of various esoteric professions, from forensic profiling to brain tanning. He also has a Gothic sensibility that has only grown more pronounced with time, a love of language fed by the poetry of William Blake and John Donne, and, in a quality that is sometimes undervalued, the instincts of a great pulp novelist. The result is an endlessly fascinating book poised halfway between calculated bestseller and major novel, and all the better for that underlying tension.
Which is why it pains me as a writer to say that as good as the book is, the movie is better. Part of this is due to the inherent differences in the way we experience movies and popular fiction: for detailed character studies, novels have the edge, but for a character who is seen mostly from the outside, as an enigma, nothing in Harris prepares us for what Anthony Hopkins does with Hannibal Lecter, even if it amounts to nothing more than a few careful acting decisions for his eyes and voice. It’s also an example of how a popular novel can benefit from an intelligent, respectful adaptation. Over time, Ted Tally’s fine screenplay has come to seem less like a variation on Harris’s novel than a superlative second draft: Tally keeps all that is good in the book, pares away the excesses, and even improves the dialogue. (It’s the difference between eating a census taker’s liver with “a big Amarone” and “a nice Chianti.”)
And while the movie is a sleeker, more streamlined animal, it still benefits from the novel’s strangeness. For better or worse, The Silence of the Lambs created an entire genre—the sleek, modern serial killer movie—but like most founding works, it has a fundamental oddity that leaves it out of place among its own successors. The details of its crimes are horrible, but what lingers are its elegance, its dry humor, and the curious rhythms of its central relationship, which feels like a love story in ways that Hannibal made unfortunately explicit. It’s genuinely concerned with women, even as it subjects them to horrible fates, and in its look and mood, it’s a work of stark realism shading inexorably into a fairy tale. That ability to combine strangeness with ruthless efficiency is the greatest thing a thriller in any medium can do. Few movies, or books, have managed it since, even after twenty years of trying.
A few months ago, after greatly enjoying The Conversations, Michael Ondaatje’s delightful book-length interview with Walter Murch, I decided to read Ondaatje’s The English Patient for the first time. I went through it very slowly, only a handful of pages each day, in parallel with my own work on the sequel to The Icon Thief. Upon finishing it last week, I was deeply impressed, not just by the writing, which had drawn me to the book in the first place, but also by the novel’s structural ingenuity—derived, Ondaatje says, from a long process of rewriting and revision—and the richness of its research. This is one of the few novels where detailed historical background has been integrated seamlessly into the poetry of the story itself, and it reflects a real, uniquely novelistic curiosity about other times and places. It’s a great book.
Reading The English Patient also made me want to check out the movie, which I hadn’t seen since watching it, more than a decade earlier, as part of a special screening for a college course. I recalled admiring it, although in a rather detached way, and found that I didn’t remember much about the story, aside from a few moments and images (and the phrase “suprasternal notch”). But I sensed it would be worth revisiting, both because I’d just finished the book and because I’ve become deeply interested, over the past few years, in the career of editor Walter Murch. Murch is one of film’s last true polymaths, an enormously intelligent man who just happened to settle into editing and sound design, and The English Patient, for which he won two Oscars (including the first ever awarded for a digitally edited movie), is a landmark in his career. It was with a great deal of interest, then, that I watched the film again last night.
First, the good news. The adaptation, by director Anthony Minghella, is very intelligently done. It was probably impossible to film Ondaatje’s full story, with its impressionistic collage of lives and memories, in any kind of commercially viable way, so the decision was wisely made to focus on the central romantic episode, the doomed love affair between Almásy (Ralph Fiennes) and Katherine Clifton (Kristin Scott Thomas). Doing so involved inventing a lot of new, explicitly cinematic material, some satisfying (the car crash and sandstorm in the desert), some less so (Almásy’s melodramatic escape from the prison train). The film also makes the stakes more personal: the mission of Caravaggio (Willem Dafoe) is less about simple fact-finding, as it was in the book, than about revenge. And the new ending, with Almásy silently asking Hana (Juliette Binoche) to end his life, gives the film a sense of resolution that the book deliberately lacks.
These changes, while extensive, are smartly done, and they respect the book while acknowledging its limitations as source material. As Roger Ebert points out in his review of Apocalypse Now, another milestone in Murch’s career, movies aren’t very good at conveying abstract ideas, but they’re great for showing us “the look of a battle, the expression on a face, the mood of a country.” On this level, The English Patient sustains comparison with the works of David Lean, with a greater interest in women, and remains, as David Thomson says, “one of the most deeply textured of films.” Murch’s work, in particular, is astonishing, and the level of craft on display here is very impressive.
Yet the pieces don’t quite come together. The novel’s tentative, intellectual nature, which no adaptation could entirely escape, infects the movie as well. It feels like an art film that has willed itself into being an epic romance, when in fact the great epic romances need to be a little vulgar—just look at Gone With the Wind. Doomed romances may obsess their participants in real life, but in fiction, seen from the outside, they can seem silly or absurd. The English Patient understands a great deal about the craft of the romantic epic, the genre in which it has chosen to plant itself, but nothing of its absurdity. In the end, it’s just too intelligent, too beautifully made, to move us on more than an abstract level. It’s a heroic effort; I just wish it were something a little more, or a lot less.
In March 1969, Robert A. Heinlein flew with his wife Ginny to Brazil, where he had been invited to serve as a guest of honor at a film festival in Rio de Janeiro. Another passenger on their plane was the director Roman Polanski, who introduced Heinlein to his wife, the actress Sharon Tate, at a party at the French embassy a few days after their arrival. (Tate had been in Italy filming The Thirteen Chairs, her final movie role before her death, which she had taken largely out of a desire to work with Orson Welles.) On August 8, Tate and four others were murdered in Los Angeles by members of the Manson Family. Two months later, Heinlein received a letter from a woman named “Annette or Nanette or something,” who claimed that police helicopters were chasing her and her friends. Ginny was alarmed by its incoherent tone, and she told her husband to stay out of it: “Honey, this is worse than the crazy fan mail. This is absolutely insane. Don’t have anything to do with it.” Heinlein contented himself with calling the Inyo County Sheriff’s Office, which confirmed that a police action was underway. In fact, it was a joint federal, state, and county raid of the Myers and Barker Ranches, where Charles Manson and his followers had been living, as part of an investigation into an auto theft ring—their connection to the murders had not yet been established. Manson was arrested, along with two dozen others. And the woman who wrote to Heinlein was probably Lynette “Squeaky” Fromme, another member of the Manson Family, who would be sentenced to life in prison for a botched assassination attempt on President Gerald Ford six years later.
On January 8, 1970, the San Francisco Herald-Examiner ran a story on the front page with the headline “Manson’s Blueprint? Claim Tate Suspect Used Science Fiction Plot.” Later that month, Time published an article, “A Martian Model,” that began:
In the psychotic mind, fact and fantasy mingle freely. The line between the real and the imagined easily blurs or disappears. Most madmen invent their own worlds. If the charges against Charles Manson, accused along with five members of his self-styled “family” of killing Sharon Tate and six other people, are true, Manson showed no powers of invention at all. In the weeks since his indictment, those connected with the case have discovered that he may have murdered by the book. The book is Robert A. Heinlein’s Stranger in a Strange Land, an imaginative science-fiction novel long popular among hippies…
Not surprisingly, the Heinleins were outraged by the implication, although Robert himself was in no condition to respond—he was hospitalized with a bad case of peritonitis. In any event, the parallels between the career of Charles Manson and Heinlein’s fictional character Valentine Michael Smith were tenuous at best, and the angle was investigated by the prosecutor Vincent Bugliosi, who dismissed it. A decade later, in a letter to the science fiction writer and Heinlein fan J. Neil Schulman, Manson stated, through another prisoner, that he had never read the book. Yet the novel was undeniably familiar to members of his circle, as it was throughout the countercultural community of the late sixties. The fact that Fromme wrote to Heinlein is revealing in itself, and Manson’s son, who was born on April 15, 1968, was named Valentine Michael by his mother.
Years earlier, Manson had been exposed—to a far more significant extent—to the work of another science fiction author. In Helter Skelter, his account of the case, Bugliosi writes of Manson’s arrival at McNeil Island Federal Penitentiary in 1961:
Manson gave as his claimed religion “Scientologist,” stating that he “has never settled upon a religious formula for his beliefs and is presently seeking an answer to his question in the new mental health cult known as Scientology”…Manson’s teacher, i.e. “auditor” was another convict, Lanier Rayner. Manson would later claim that while in prison he achieved Scientology’s highest level, “theta clear.”
In his own memoir, Manson writes: “A cell partner turned me on to Scientology. With him and another guy I got pretty heavy into dianetics and Scientology…There were times when I would try to sell [fellow inmate Alvin Karpis] on the things I was learning through Scientology.” In total, Manson appears to have received about one hundred and fifty hours of auditing, and his yearly progress report noted: “He appears to have developed a certain amount of insight into his problems through his study of this discipline.” The following year, another report stated: “In his effort to ‘find’ himself, Manson peruses different religious philosophies, e.g. Scientology and Buddhism; however, he never remains long enough with any given teachings to reap material benefits.” In 1968, Manson visited a branch of the Church of Scientology in Los Angeles, where he asked the receptionist: “What do you do after ‘clear’?” But Bugliosi’s summary of the matter seems accurate enough:
Although Manson remained interested in Scientology much longer than he did in any other subject except music, it appears that…he stuck with it only as long as his enthusiasm lasted, then dropped it, extracting and retaining a number of terms and phrases (“auditing,” “cease to exist,” “coming to Now”) and some concepts (karma, reincarnation, etc.) which, perhaps fittingly, Scientology had borrowed in the first place.
So what should we make of all this? I think that there are a few relevant points here. The first is that Heinlein and Hubbard’s influence on Manson—or any of his followers, including Fromme, who had been audited as well—appears to have been marginal, and only in the sense that you could say that he was “influenced” by the Beatles. Manson was a scavenger who assembled his notions out of scraps gleaned from whatever materials were currently in vogue, and science fiction had saturated the culture to such an extent that it would have been hard to avoid entirely, particularly for someone who was actively searching for such ideas. On some level, it’s a testament to the cultural position that both Hubbard and Heinlein had attained, although it also cuts deeper than this. Manson represented the psychopathic fringe of an impulse for which science fiction and its offshoots provided a convenient vocabulary. It was an urge for personal transformation in the face of what felt like apocalyptic social change, rooted in the ideals that Campbell and his authors had defined, and which underwent several mutations in the decades since its earliest incarnation. (And it would mutate yet again. The Aum Shinrikyo cult, which was responsible for the sarin gas attacks in the Japanese subway system in 1995, borrowed elements of Asimov’s Foundation trilogy for its vision of a society of the elect that would survive the coming collapse of civilization.) It’s an aspect of the genre that takes light and dark forms, and it sometimes displays both faces simultaneously, which can lead to resistance from both sides. The Manson Family murders began with the killing of a man named Gary Hinman, who was taken hostage on July 25, 1969, a day on which the newspapers were filled with accounts of the successful splashdown of Apollo 11. The week before, at the ranch where Manson’s followers were living, a woman had remarked: “There’s somebody on the moon today.” And another replied: “They’re faking it.”
Over the last year or so, I’ve found myself repeatedly struck by the parallels between the careers of John W. Campbell and Orson Welles. At first, the connection might seem tenuous. Campbell and Welles didn’t look anything alike, although they were about the same height, and their politics couldn’t have been more different—Welles was a staunch progressive and defender of civil rights, while Campbell, to put it mildly, wasn’t. Welles was a wanderer, while Campbell spent most of his life within driving distance of his birthplace in New Jersey. But they’re inextricably linked in my imagination. Welles was five years younger than Campbell, but they flourished at exactly the same time, with their careers peaking roughly between 1937 and 1942. Both owed significant creative breakthroughs to the work of H.G. Wells, who inspired Campbell’s story “Twilight” and Welles’s Mercury Theatre adaptation of The War of the Worlds. In 1938, Campbell saw Welles’s famous modern-dress production of Julius Caesar with the writer L. Sprague de Camp, of which he wrote in a letter:
It represented, in a way, what I’m trying to do in the magazine. Those humans of two thousand years ago thought and acted as we do—even if they did dress differently. Removing the funny clothes made them more real and understandable. I’m trying to get away from funny clothes and funny-looking people in the pictures of the magazine. And have more humans.
And I suspect that the performance started a train of thought in both men’s minds that led to de Camp’s novel Lest Darkness Fall, which is about a man from the present who ends up in ancient Rome.
Campbell was less pleased by Welles’s most notable venture into science fiction, which he must have seen as an incursion on his turf. He wrote to his friend Robert Swisher: “So far as sponsoring that War of [the] Worlds thing—I’m damn glad we didn’t! The thing is going to cost CBS money, what with suits, etc., and we’re better off without it.” In Astounding, he said that the ensuing panic demonstrated the need for “wider appreciation” of science fiction, in order to educate the public about what was and wasn’t real:
I have long been an exponent of the belief that, should interplanetary visitors actually arrive, no one could possibly convince the public of the fact. These stories wherein the fact is suddenly announced and widespread panic immediately ensues have always seemed to me highly improbable, simply because the average man did not seem ready to visualize and believe such a statement.
Undoubtedly, Mr. Orson Welles felt the same way.
Their most significant point of intersection was The Shadow, who was created by an advertising agency for Street & Smith, the publisher of Astounding, as a fictional narrator for the radio series Detective Story Hour. Before long, he became popular enough to star in his own stories. Welles, of course, voiced The Shadow from September 1937 to October 1938, and Campbell plotted some of the magazine installments in collaboration with the writer Walter B. Gibson and the editor John Nanovic, who worked in the office next door. And Campbell’s identification with the character seems to have run even deeper. In a profile published in the February 1946 issue of Pic magazine, the reporter Dickson Hartwell wrote of Campbell: “You will find him voluble, friendly and personally depressing only in what his friends claim is a startling physical resemblance to The Shadow.”
It isn’t clear if Welles was aware of Campbell, although it would be more surprising if he wasn’t. Welles flitted around science fiction for years, and he occasionally crossed paths with other authors in that circle. To my lasting regret, he never met L. Ron Hubbard, which would have been an epic collision of bullshitters—although Philip Seymour Hoffman claimed that he based his performance in The Master mostly on Welles, and Theodore Sturgeon once said that Welles and Hubbard were the only men he had ever met who could make a room seem crowded simply by walking through the door. In 1946, Isaac Asimov received a call from a lawyer whose client wanted to buy all rights to his robot story “Evidence” for $250. When he asked Campbell for advice, the editor said that he thought it seemed fair, but Asimov’s wife told him to hold out for more. Asimov called back to ask for a thousand dollars, adding that he wouldn’t discuss it further until he found out who the client was. When the lawyer told him that it was Welles, Asimov agreed to the sale, delighted, but nothing ever came of it. (Welles also owned the story in perpetuity, making it impossible for Asimov to sell it elsewhere, a point that Campbell, who took a notoriously casual attitude toward rights, had neglected to raise.) Twenty years later, Welles made inquiries into the rights for Heinlein’s The Puppet Masters, which were tied up at the time with Roger Corman, but never followed up. And it’s worth noting that both stories are concerned with the problem of knowing whether other people are what they claim to be, which Campbell had brilliantly explored in “Who Goes There?” It’s a theme to which Welles obsessively returned, and it’s fascinating to speculate what he might have done with it if Howard Hawks and Christian Nyby hadn’t gotten there first with The Thing From Another World. Who knows what evil lurks in the hearts of men?
But their true affinities were spiritual ones. Both Campbell and Welles were child prodigies who reinvented an art form largely by being superb organizers of other people’s talents—although Campbell always downplayed his own contributions, while Welles appears to have done the opposite. Each had a spectacular early success followed by what was perceived as decades of decline, which they seem to have seen coming. (David Thomson writes: “As if Welles knew that Kane would hang over his own future, regularly being used to denigrate his later works, the film is shot through with his vast, melancholy nostalgia for self-destructive talent.” And you could say much the same thing about “Twilight.”) Both had a habit of abandoning projects as soon as they realized that they couldn’t control them, and they both managed to seem isolated while occupying the center of attention in any crowd. They enjoyed staking out unreasonable positions in conversation, just to get a rise out of listeners, and they ultimately drove away their most valuable collaborators. What Pauline Kael writes of Welles in “Raising Kane” is equally true of Campbell:
He lost the collaborative partnerships that he needed…He was alone, trying to be “Orson Welles,” though “Orson Welles” had stood for the activities of a group. But he needed the family to hold him together on a project and to take over for him when his energies became scattered. With them, he was a prodigy of accomplishments; without them, he flew apart, became disorderly.
Both men were alone when they died, and both filled their friends, admirers, and biographers with intensely mixed feelings. I’m still coming to terms with Campbell. But I have a hunch that I’ll end up somewhere close to Kael’s ambivalence toward Welles, who, at the end of an essay that was widely seen as puncturing his myth, could only conclude: “In a less confused world, his glory would be greater than his guilt.”
If you’re a certain kind of writer, whenever you pick up a new book, instead of glancing at the beginning or opening it to a random page, you turn immediately to the acknowledgments. Once you’ve spent any amount of time trying to get published, that short section of fine print starts to read like a gossip column, a wedding announcement, and a high school yearbook all rolled into one. For most writers, it’s also the closest they’ll ever get to an Oscar speech, and many of them treat it that way, with loving tributes and inside jokes attached to every name. It’s a chance to thank their editors and agents—while the unagented reader suppresses a twinge of envy—and to express gratitude to various advisers, colonies, and fellowships. (The most impressive example I’ve seen has to be in The Lisle Letters by Muriel St. Clare Byrne, which pays tribute to the generosity of “Her Majesty Queen Elizabeth II.”) But if there’s one thing I’ve learned from the acknowledgments that I’ve been reading recently, it’s that I deserve an assistant. It seems as if half the nonfiction books I see these days thank a whole squadron of researchers, inevitably described as “indefatigable,” who live in libraries, work through archives and microfilm reels, and pass along the results to their grateful employers. If the author is particularly famous, like Bob Woodward or Kurt Eichenwald, the acknowledgment can sound like a letter of recommendation: “I was startled by his quick mind and incomparable work ethic.” Sometimes the assistants are described in such glowing terms that you start to wonder why you aren’t reading their books instead. And when I’m trying to decipher yet another illegible scan of a carbon copy of a letter written fifty years ago on a manual typewriter, I occasionally wish that I could outsource it to an intern.
But there are also good reasons for doing everything yourself, at least at the early stages of a project. In his book The Integrity of the Body, the immunologist Sir Frank Macfarlane Burnet says that there’s one piece of advice that he always gives to “ambitious young research workers”: “Do as large a proportion as possible of your experiments with your own hands.” In Discovering, Robert Scott Root-Bernstein expands on this point:
When you climb those neighboring hills make sure you do your own observing. Many scientists assign all experimental work to lab techs and postdocs. But…only the prepared mind will note and attach significance to an anomaly. Each individual possesses a specific blend of personality, codified science, science in the making, and cultural biases that will match particular observations. If you don’t do your own observing, the discovery won’t be made. Never delegate research.
Obviously, there are situations in which you can’t avoid delegating the work to some degree. But I think Root-Bernstein gets at something essential when he frames it in terms of recognizing anomalies. If you don’t sift through the raw material yourself, it’s difficult to know what is unusual or important, and even if you have a bright assistant who will flag any striking items for your attention, it’s hard to put them in perspective. As I’ve noted elsewhere, drudgery can be an indispensable precursor to insight. You’re more likely to come up with worthwhile connections if you’re the one mining the ore.
This is why the great biographers and historians often seem like monsters of energy. I never get tired of quoting the advice that Alan Hathaway gave to the young Robert Caro at Newsday: “Turn every goddamn page.” Caro took this to heart, noting proudly of one of the archives he consulted: “The number [of pages] may be in the area of forty thousand. I don’t know how many of these pages I’ve read, but I’ve read a lot of them.” And it applies to more than just what you read, as we learn from a famous story about Caro and his editor Robert Gottlieb:
Gottlieb likes to point to a passage fairly early in The Power Broker describing Moses’ parents one morning in their lodge at Camp Madison, a fresh-air charity they established for poor city kids, picking up the Times and reading that their son had been fined $22,000 for improprieties in a land takeover. “Oh, he never earned a dollar in his life, and now we’ll have to pay this,” Bella Moses says.
“How do you know that?” Gottlieb asked Caro. Caro explained that he tried to talk to all of the social workers who had worked at Camp Madison, and in the process he found one who had delivered the Moseses’ paper. “It was as if I had asked him, ‘How do you know it’s raining out?’”
This is the kind of thing that you’d normally ask your assistant to do, if it occurred to you at all, and it’s noteworthy that Caro has kept at it long after he could have hired an army of researchers. Instead, he relies entirely on his wife Ina, whom he calls “the only person besides myself who has done research on the four volumes of The Years of Lyndon Johnson or on the biography of Robert Moses that preceded them, the only person I would ever trust to do so.” And perhaps a trusted spouse is the best assistant you could ever have.
Of course, there are times when an assistant is necessary, especially if, unlike Caro, you’re hoping to finish your project in fewer than forty years. But it’s often the assistant who benefits. As one of them recalled:
I was working for [Professor] Bernhard J. Stern…and since he was writing a book on social resistance to technological change, he had me reading a great many books that might conceivably be of use to him. My orders were to take note of any passages that dealt with the subject and to copy them down.
It was a liberal education for me and I was particularly struck by a whole series of articles by astronomer Simon Newcomb, which I read at Stern’s direction. Newcomb advanced arguments that demonstrated the impossibility of heavier-than-air flying machines, and maintained that one could not be built that would carry a man. While these articles were appearing, the Wright brothers flew their plane. Newcomb countered with an article that said, essentially, “Very well, one man, but not two.”
Every significant social advance roused opposition on the part of many, it seemed. Well, then, shouldn’t space flight, which involved technological advances, arouse opposition too?
The assistant in question was Isaac Asimov, who used this idea as the basis for his short story “Trends,” which became his first sale to John W. Campbell. It launched his career, and the rest is history. And that’s part of the reason why, when I think of my own book, I say to myself: “Very well, one man, but not two.”
You can’t throw in images the way you throw in a fishhook, at random! These obedient images are, in a film constructed according to the dark and mysterious rules of the unconscious, necessary images, imperious and tyrannical images…It can be useful for a while to rediscover by methods that are unusual, excessive, arbitrary, methods that are primitive, direct, and stripped of nonessentials, polished to the bone, the laws of eternal poetry, but these laws are always the same, and the goal of poetry cannot be simply to play with the laws by which it is made…Just because with the help of psychoanalysis the rules of the game have become infinitely clear, and because the technique of poetry has revealed its secrets, the point is not to show that we are extraordinarily intelligent and that we now know how to go about it.
If you wanted to design the competent man of adventure and science fiction from first principles, you couldn’t do much better than Sir Richard Francis Burton. Elsewhere, I’ve spoken of him as “an unlikely combination of James Frazer, T.E. Lawrence, and Indiana Jones who comes as close as any real historical figure to the Most Interesting Man in the World from the Dos Equis commercials,” and, if anything, that description might be too conservative. As Jorge Luis Borges recounts in his excellent essay “The Translators of The Thousand and One Nights”:
Burton, disguised as an Afghani, made the pilgrimage to the holy cities of Arabia…Before that, in the guise of a dervish, he practiced medicine in Cairo—alternating it with prestidigitation and magic so as to gain the trust of the sick. In 1858, he commanded an expedition to the secret sources of the Nile, a mission that led him to discover Lake Tanganyika. During that undertaking he was attacked by a high fever; in 1855, the Somalis thrust a javelin through his jaws…Nine years later, he essayed the terrible hospitality of the ceremonious cannibals of Dahomey; on his return there was no scarcity of rumors (possibly spread and certainly encouraged by Burton himself) that, like Shakespeare’s omnivorous proconsul, he had “eaten strange flesh.”
Unfounded rumors about his escapades were circulating even during his lifetime, and the only witness to many of his adventures was Burton himself. But his great translation of The Book of the Thousand Nights and a Night, which is one of my most treasured possessions, is a lasting monument. As Borges observes, it is the legendary Burton who remains:
It will be observed that, from his amateur cannibal to his dreaming polyglot, I have not rejected those of Richard Burton’s personae that, without diminution of fervor, we could call legendary. My reason is clear: The Burton of the Burton legend is the translator of the Nights…To peruse The Thousand and One Nights in Sir Richard’s translation is no less incredible than to read it in ‘a plain and literal translation with explanatory notes’ by Sinbad the Sailor.
This is the Burton whom we remember, and his influence can be strongly seen in the careers of two men. The first is the English occultist Aleister Crowley, who dedicated his autobiography to Burton, “the perfect pioneer of spiritual and physical adventure,” and referred to him repeatedly as “my hero.” As the scholar Alex Owen writes in her essay “The Sorcerer and His Apprentice”: “Burton represented the kind of man Crowley most wished to be—strong, courageous, intrepid, but also a learned scholar-poet who chafed against conventional restraints.” Crowley first read Burton in college, and he said of his own efforts at disguising himself during his travels: “I thought I would see what I could do to take a leaf out of Burton’s book.” He later included Burton in the list of saints invoked in the Gnostic Mass of the Ordo Templi Orientis—the same ritual, evidently, that was performed by the rocket scientist and science fiction fan Jack Parsons in Los Angeles, at meetings that were attended by the likes of Jack Williamson, Cleve Cartmill, and Robert A. Heinlein. (Heinlein, who received a set of the Burton Club edition of The Arabian Nights as a birthday present from his wife Ginny in 1963, listed it as part of the essential library in the fallout shelter in Farnham’s Freehold, and a character refers to reading “the Burton original” in Time Enough for Love, which itself is a kind of Scheherazade story.) It can be difficult to separate both Burton and Crowley from the legends that grew up around them, and both have been accused of murder, although I suspect that Crowley, like Burton, would have “confessed rather shamefacedly that he had never killed anybody at any time.” But as Crowley strikingly said of Burton: “The best thing about him is his amazing common sense.” Many of Crowley’s admirers would probably say the same thing, which should remind us that common sense, taken to its extreme, is often indistinguishable from insanity.
Jack Parsons, of course, was notoriously associated with L. Ron Hubbard, who once referred to Crowley as “my good friend,” although the two men never actually met. And Hubbard was fascinated by Burton as well. At the age of twelve, Hubbard encountered a kind of deutero-Burton, the naval officer Joseph “Snake” Thompson, a spy, linguist, zoologist, and psychoanalyst who seemed so implausible a figure that he was once thought to be fictional, although he was very real. In the short novel Slaves of Sleep, Hubbard writes:
A very imperfect idea of the jinn is born of the insipid children’s translations of The Arabian Nights Entertainment, but in the original work…the subject is more competently treated. For the ardent researcher, Burton’s edition is recommended, though due to its being a forbidden work in these United States, it is very difficult to find. There is, however, a full set in the New York Public Library where the wise librarians have devoted an entire division to works dealing with the black arts.
Burton’s translation might have been rare, but it wasn’t exactly forbidden: a few years earlier, L. Sprague de Camp had bought a full set from his boss at the International Correspondence School for seventeen dollars, and it isn’t hard to imagine that his friend Hubbard occasionally borrowed one of its volumes. Another story by Hubbard, The Ghoul, takes place in the fictitious Hotel Burton, and Burton’s influence is visible in all of the Arabian Nights stories that he published in Unknown, as well as in the smug tone of self-deprecation that he used to talk about his accomplishments. When Burton writes that, as a young man, he was “fit for nothing but to be shot at for six pence a day,” or that “I have never been so flattered in my life than to think it would take three hundred men to kill me,” you can hear a premonitory echo both of the voice that Hubbard adopted for his heroes and of his own bluff style of insincere understatement.
And it was Burton’s presentation of himself that resonated the most with Crowley and Hubbard. Burton was the patron saint of the multihyphenates whose fans feel obliged to garland them with a long list of careers, or what Borges calls “the innumerable ways of being a man that are known to mankind.” On their official or semiofficial sites, Burton is described as a “soldier, explorer, linguist, ethnologist, and controversialist”; Crowley as a “poet, novelist, journalist, mountaineer, explorer, chess player, graphic designer, drug experimenter, prankster, lover of women, beloved of men, yogi, magician, prophet, early freedom fighter, human rights activist, philosopher, and artist”; and Hubbard as an “adventurer, explorer, master mariner, pilot, writer, filmmaker, photographer, musician, poet, botanist and philosopher.” But a man only collects so many titles when his own identity remains stubbornly undefined. All three men, notably, affected Orientalist disguises—Burton during his forbidden journey to Mecca, Crowley in Madhura, India, where he obtained “a loincloth and a begging bowl,” and Hubbard, allegedly, in Los Angeles: “I went right down in the middle of Hollywood, I rented an office, got a hold of a nurse, wrapped a towel around my head and became a swami.” (There are also obvious shades of T.E. Lawrence. Owen notes: “While the relationships of Crowley, Burton, and Lawrence to imposture and disguise are different, all three men had vested interests in masking their origins and their uncertain social positions.” And it’s worth noting that all three men were the object of persistent rumors about their sexuality.) In the end, they never removed their masks. Burton may or may not have been the ultimate competent man, but he was a shining example of an individual who became the legend that he had created for himself. Crowley and Hubbard took it even further by acquiring followers, which Burton never did, at least not during his lifetime. 
But his cult may turn out to be the most lasting of them all.
In his bestselling book The Tipping Point, Malcolm Gladwell devotes several pages to a discussion of the breakout success of the novel Divine Secrets of the Ya-Ya Sisterhood. After its initial release in 1996, it sold reasonably well in hardcover, receiving “a smattering of reviews,” but it became an explosive phenomenon in paperback, thanks primarily to what Gladwell calls “the critical role that groups play in social epidemics.” He writes:
The first bestseller list on which Ya-Ya Sisterhood appeared was the Northern California Independent Bookseller’s list. Northern California…was where seven hundred and eight hundred people first began showing up at [Rebecca Wells’s] readings. It was where the Ya-Ya epidemic began. Why? Because…the San Francisco area is home to one of the country’s strongest book club cultures, and from the beginning Ya-Ya was what publishers refer to as a “book club book.” It was the kind of emotionally sophisticated, character-driven, multilayered novel that invites reflection and discussion, and book groups were flocking to it. The groups of women who were coming to Wells’s readings were members of reading groups, and they were buying extra copies not just for family and friends but for other members of the group. And because Ya-Ya was being talked about and read in groups, the book itself became that much stickier. It’s easier to remember and appreciate something, after all, if you discuss it for two hours with your best friends. It becomes a social experience, an object of conversation. Ya-Ya’s roots in book group culture tipped it into a larger word-of-mouth epidemic.
You could say much the same thing about a very different book that became popular in California nearly five decades earlier. Scientology has exhibited an unexpected degree of staying power among a relatively small number of followers, but Dianetics: The Modern Science of Mental Health, the work that made L. Ron Hubbard famous, was a textbook case of a viral phenomenon. Just three months elapsed between the book’s publication on May 9, 1950, and Hubbard’s climactic rally at the Shrine Auditorium on August 10, and its greatest impact on the wider culture occurred over a period of less than a year. And its dramatic spread and decline had all the hallmarks of virality. In the definitive Hubbard biography Bare-Faced Messiah, Russell Miller writes:
For the first few days after publication of Dianetics: The Modern Science of Mental Health, it appeared as if the publisher’s caution about the book’s prospects had been entirely justified. Early indications were that it had aroused little interest; certainly it was ignored by most reviewers. But suddenly, towards the end of May, the line on the sales graph at the New York offices of Hermitage House took a steep upturn.
By midsummer, it was selling a thousand copies a day, and by late fall, over seven hundred dianetics clubs had been established across the country. As Miller writes: “Dianetics became, virtually overnight, a national ‘craze’ somewhat akin to the canasta marathons and pyramid clubs that had briefly flourished in the hysteria of postwar America.”
The result was a quintessential social epidemic, and I’m a little surprised that Gladwell, who is so hungry for case studies, has never mentioned it. The book itself was “sticky,” with its promise of a new science of mental health that could be used by anyone and that got results every time. Like Ya-Ya, it took root in an existing group—in this case, the science fiction community, which was the natural audience for its debut in the pages of Astounding. Just as the ideal book club selection is one that inspires conversations, dianetics was a shared experience: in order to be audited, you needed to involve at least one other person. Auditing, as the therapy was originally presented, seemed so easy that anyone could try it, and many saw it as a kind of parlor game. (In his biography of Robert A. Heinlein, William H. Patterson shrewdly compares it to the “Freuding parties” that became popular in Greenwich Village in the twenties.) Even if you didn’t want to be audited yourself, dianetics became such a topic of discussion among fans that summer that you had to read the book to be a part of it. It also benefited from the presence of what Gladwell calls mavens, connectors, and salesmen. John W. Campbell was the ultimate maven, an information broker who, as one of Gladwell’s sources puts it, “wants to solve other people’s problems, generally by solving his own.” The connectors included prominent members of the fan community, notably A.E. van Vogt, who ended up running the Los Angeles foundation, and Forrest Ackerman, Hubbard’s agent and “the number one fan.” And the salesman was Hubbard himself, who threw himself into the book’s promotion on the West Coast. As Campbell wrote admiringly to Heinlein: “When Ron wants to, he can put on a personality that would be a confidence man’s delight—persuasive, gentle, intimately friendly. The perfect bedside manner, actually.”
In all epidemics, geography plays a crucial role, and in the case of dianetics, it had profound consequences for individual careers. One of Campbell’s priorities was to sell the therapy to his top writers, much as the Church of Scientology later reached out to movie stars, and the single greatest predictor of how an author would respond was his proximity to the centers of fan culture. Two of the most important converts were van Vogt, who was in Los Angeles, and Theodore Sturgeon, who lived in New York, where he was audited by Campbell himself. Isaac Asimov, by contrast, had moved from Manhattan to Boston just the year before, and Heinlein, fascinatingly, had left Hollywood, where he had been working on the film Destination Moon, in February of 1950. Heinlein was intrigued by dianetics, but because he was in Colorado Springs with his wife Ginny, who refused to have anything to do with it, he was unable to find an auditing partner. And it’s worth wondering what might have ensued if he had remained in Southern California for another six months. (Such accidents of place and time can have significant aftereffects. Van Vogt had moved from the Ottawa area to Los Angeles in 1944, and his involvement with dianetics took him out of writing for the better part of a decade, at the very moment when science fiction was breaking into the culture as a whole. His absence during this critical period, which made celebrities out of Heinlein and Asimov, feels like a big part of the reason why van Vogt has mostly disappeared from the popular consciousness. And it might never have happened if he had stayed in Canada.) The following year, dianetics as a movement fizzled out, due largely to Hubbard’s own behavior—although he might also have sensed that it wouldn’t last. But it soon mutated into another form. And before long, Hubbard would begin to spread a few divine secrets of his own.
By now, you’re probably sick of hearing about what happened at the Oscars. I’m getting a little tired of it, too, even though it was possibly the strangest and most riveting two minutes I’ve ever seen on live television. It left me feeling sorry for everyone involved, but there are at least three bright spots. The first is that it’s going to make a great case study for somebody like Malcolm Gladwell, who is always looking for a showy anecdote to serve as a grabber opening for a book or article. So many different things had to go wrong for it to happen—on the levels of design, human error, and simple dumb luck—that you can use it to illustrate just about any point you like. A second silver lining is that it highlights the basically arbitrary nature of all such awards. As time passes, the list of Best Picture winners starts to look inevitable, as if Cimarron and Gandhi and Chariots of Fire had all been canonized by a comprehensible historical process. If anything, the cycle of inevitability is accelerating, so that within seconds of any win, the narratives are already locking into place. As soon as La La Land was announced as the winner, a story was emerging about how Hollywood always goes for the safe, predictable choice. The first thing that Dave Itzkoff, a very smart reporter, posted on the New York Times live chat was: “Of course.” Within a couple of minutes, however, that plot line had been yanked away and replaced with one for Moonlight. And the fact that the two versions were all but superimposed onscreen should warn us against reading too much into outcomes that could have gone any number of ways.
But what I want to keep in mind above all else is the example of La La Land producer Jordan Horowitz, who, at a moment of unbelievable pressure, simply said: “I’m going to be really proud to hand this to my friends from Moonlight.” It was the best thing that anybody could have uttered under those circumstances, and it tells us a lot about Horowitz himself. If you were going to design a psychological experiment to test a subject’s reaction under the most extreme conditions imaginable, it’s hard to think of a better one—although it might strike a grant committee as possibly too expensive. It takes what is undoubtedly one of the high points of someone’s life and twists it instantly into what, if perhaps not the worst moment, at least amounts to a savage correction. Everything that the participants onstage did or said, down to the facial expressions of those standing in the background, has been subjected to a level of scrutiny worthy of the Zapruder film. At the end of an event in which very little occurs that hasn’t been scripted or premeditated, a lot of people were called upon to figure out how to act in real time in front of an audience of hundreds of millions. It’s proverbial that nobody tells the truth in Hollywood, an industry that inspires insider accounts with titles like Hello, He Lied and Which Lie Did I Tell? A mixup like the one at the Oscars might have been expressly conceived as a stress test to bring out everyone’s true colors. Yet Horowitz said what he did. And I suspect that it will do more for his career than even an outright win would have accomplished.
It also reminds me of other instances over the last year in which we’ve learned exactly what someone thinks. When we get in trouble for a remark picked up on a hot mike, we often say that it doesn’t reflect who we really are—which is just another way of stating that it doesn’t live up to the versions of ourselves that we create for public consumption. It’s far crueler, but also more convincing, to argue that it’s exactly in those unguarded, unscripted moments that our true selves emerge. (Freud, whose intuition on such matters was uncanny, was onto something when he focused on verbal mistakes and slips of the tongue.) The justifications that we use are equally revealing. Maybe we dismiss it as “locker room talk,” even if it didn’t take place anywhere near a locker room. Kellyanne Conway excused her reference to the nonexistent Bowling Green Massacre by saying “I misspoke one word,” even though she misspoke it on three separate occasions. It doesn’t even need to be something said on the spur of the moment. At his confirmation hearing for the position of ambassador to Israel, David M. Friedman apologized for an opinion piece he had written before the election: “These were hurtful words, and I deeply regret them. They’re not reflective of my nature or my character.” Friedman also said that “the inflammatory rhetoric that accompanied the presidential campaign is entirely over,” as if it were an impersonal force that briefly took possession of its users and then departed. We ask to be judged on our most composed selves, not the ones that we reveal at our worst.
To some extent, that’s a reasonable request. I’ve said things in public and in private that I’ve regretted, and I wouldn’t want to be judged solely on my worst moments as a writer or parent. At a time when a life can be ruined by a single tweet, it’s often best to err on the side of forgiveness, especially when there’s any chance of misinterpretation. But there’s also a place for common sense. You don’t refer to an event as a “massacre” unless you really think of it that way or want to encourage others to do so. And we judge our public figures by what they say when they think that nobody is listening, or when they let their guard down. It might seem like an impossibly high standard, but it’s also the one that’s effectively applied in practice. You can respond by becoming inhumanly disciplined, like Obama, who in a decade of public life has said maybe five things he has reason to regret. Or you can react like Trump, who says five regrettable things every day and trusts that their sheer volume will reduce them to a kind of background noise—which has awakened us, as Trump has in so many other ways, to a political option that we didn’t even know existed. Both strategies are exhausting, and most of us don’t have the energy to pursue either path. Instead, we’re left with the practical solution of cultivating the inner voice that, as I wrote last week, allows us to act instinctively. Kant writes: “Live your life as though your every act were to become a universal law.” Which is another way of saying that we should strive to be the best version of ourselves at all times. It’s probably impossible. But it’s easier than wearing a mask.
In his book A New Theory of Urban Design, which was published thirty years ago, the architect Christopher Alexander opens with a consideration of the basic problem confronting all city planners. He draws an analogy between the process of urban design and that of creating a work of art or studying a biological organism, but he also points out their fundamental differences:
With a city, we don’t have the luxury of either of these cases. We don’t have the luxury of a single artist whose unconscious process will produce wholeness spontaneously, without having to understand it—there are simply too many people involved. And we don’t have the luxury of the patient biologist, who may still have to wait a few more decades to overcome his ignorance.
What happens in the city, happens to us. If the process fails to produce wholeness, we suffer right away. So, somehow, we must overcome our ignorance, and learn to understand the city as a product of a huge network of processes, and learn just what features might make the cooperation of these processes produce a whole.
And wherever he writes “city,” you can replace it with any complicated system—a nation, a government, an environmental crisis—that seems too daunting for any individual to affect on his or her own, and in the face of which it’s easy to despair over our own helplessness, especially since, as Alexander notes, it’s happening to us.
Alexander continues: “We must therefore learn to understand the laws which produce wholeness in the city. Since thousands of people must cooperate to produce even a small part of a city, wholeness in the city will only be created to the extent that we can make these laws explicit, and can then introduce them, openly, explicitly, into the normal process of urban development.” We can pause here to note that this is as good an explanation as any of why rules play a role in all forms of human activity. It’s easy to fetishize or dismiss the rules to the point where we overlook why they exist in the first place, but you could say that they emerge whenever we’re dealing with a process that is too complicated for us to wing it. Some degree of improvisation enters into much of what we do, and in many cases—when we’re performing a small task for the first time with minimal stakes—it’s fine to make it up as we go along. The larger, more important, or more complex the task, however, the more useful it becomes to have a few guidelines on which we can fall back whenever our intuition or conscience fails us. Rules are nice because they mean that we don’t constantly have to reason from first principles whenever we’re faced with a choice. They often need to be amended, supplemented, or repealed, and we should never stop interrogating them, but they’re unavoidable. Every time we discard a rule, we implicitly replace it with another. And it can be hard to strike the right balance between a reasonable skepticism of the existing rules and an understanding of why they’re pragmatically good to have around.
Before we can develop a set of rules for any endeavor, however, it helps to formulate what Alexander calls “a single, overriding rule” that governs the rest. It’s worth quoting him at length here, because the challenge of figuring out a rule for urban design is much the same as that for any meaningful project that involves a lot of stakeholders:
The growth of a town is made up of many processes—processes of construction of new buildings, architectural competitions, developers trying to make a living, people building additions to their houses, gardening, industrial production, the activities of the department of public works, street cleaning and maintenance…But these many activities are confusing and hard to integrate, because they are not only different in their concrete aspects—they are also guided by entirely different motives…One might say that this hodgepodge is highly democratic, and that it is precisely this hodgepodge which most beautifully reflects the richness and multiplicity of human aspirations.
But the trouble is that within this view, there is no sense of balance, no reasonable way of deciding how much weight to give the different aims within the hodgepodge…For this reason, we propose to begin entirely differently. We propose to imagine a single process…one which works at many levels, in many different ways…but still essentially a single process, in virtue of the fact that it has a single goal.
And Alexander arrives at a single, overriding rule that is so memorable that I seem to think about it all the time: “Every increment of construction must be made in such a way as to heal the city.”
But it isn’t hard to understand why this rule isn’t more widely known. It’s difficult to imagine invoking it at a city planning meeting, and it has a mystical ring to it that I suspect makes many people uncomfortable. Yet this is less a shortcoming in the rule itself than a reflection of the kind of language that we need to develop an intuition about what other rules to follow. Alexander argues that most of us have “a rather good intuitive sense” of what this rule means, and he points out: “It is, therefore, a very useful kind of inner voice, which forces people to pay attention to the balance between different goals, and to put things together in a balanced fashion.” The italics are mine. Human beings have trouble keeping all of their own rules in their heads at once, much less those that apply to others, so our best bet is to develop an inner voice that will guide us when we don’t have ready access to the rules for a specific situation. (As David Mamet says of writing: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”) Most belief systems amount to an attempt to cultivate that voice, and if Alexander’s advice has a religious overtone, it’s because we tend to associate such admonitions with the contexts in which they’ve historically arisen. “Love your enemies” is one example. “Desire is suffering” is another. Such precepts naturally give rise to other rules, which lead in turn to others, and one of the shared dangers in city planning and religion is the failure to remember the underlying purpose when faced with a mass of regulations. Ideally, the rules serve as a system of best practices, but they often have no greater goal than to perpetuate themselves. And as Alexander points out, it isn’t until you’ve taken the time to articulate the one rule that governs the rest that you can begin to tell the difference.
The mingling of object and image in collage, of given fact and conscious artifice, corresponds to the illusion-producing processes of contemporary civilization. In advertisements, news stories, films, and political campaigns, lumps of unassailable data are implanted in preconceived formats in order to make the entire fabrication credible. Documents waved at hearings by Joseph McCarthy to substantiate his fictive accusations were a version of collage, as is the corpse of Lenin, inserted by Stalin into the Moscow mausoleum to authenticate his own contrived ideology. Twentieth-century fictions are rarely made up of the whole cloth, perhaps because the public has been trained to have faith in “information.” Collage is the primary formula of the aesthetics of mystification developed in our time.