Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Archive for the ‘Movies’ Category

The bedtime story


Earlier this morning, I finally got my hands on the companion book to James Cameron’s Story of Science Fiction, which is airing this month on AMC. Naturally, I immediately looked for references to the four main subjects of Astounding, and the passage that caught my eye first was an exchange between Cameron and Steven Spielberg:

Spielberg: The working title of E.T. was Watch the Skies. Which is sort of the last line from The Thing. I just remember looking at the sky because of the influence of my father, and saying, only good should come from that. If it ain’t an ICBM coming from the Soviet Union, only good should come from beyond our gravitational hold…He was a visionary about that, yet he read all the Analog. Those paperbacks? And Amazing Stories, the paperbacks of that. I used to read that along with him. Sometimes, he’d read those books to me, those little tabloids to me at night.

Cameron: Asimov, Heinlein, all those guys were all published in those pulp magazines.

Spielberg: They were all published in those magazines, and a lot of them were optimists. They weren’t always calculating our doom. They were finding ways to open up our imagination and get us to dream and get us to discover and get us to contribute to the greater good.

The discussion quickly moves on to other subjects, but not before hinting at the solution to a mystery that I’ve been trying to figure out for years, which is why the influence of Astounding and its authors can be so hard to discern in the work of someone like Spielberg. In part, it’s a matter of timing. Spielberg was born in 1946, which means that he would have been thirteen when John W. Campbell announced that his magazine was changing its title to Analog. As a result, at a point at which he should have been primed to devour science fiction, Spielberg doesn’t seem to have found its current incarnation all that interesting, for which you can hardly blame him. Instead, his emotional associations with the pulps were evidently passed down through his father, Arnold Spielberg, an electrical engineer who worked for General Electric and RCA. The elder Spielberg, remarkably, is still active at the age of 101, and just two months ago, he said in an interview with GE Reports:

I was also influenced by science fiction. There were twins in our neighborhood who read one of the first sci-fi magazines, called Astounding Stories of Science and Fact. They gave me one copy, and when I brought it home, I was hooked. The magazine is now called Analog Science Fiction and Fact, and I still get it.

And while I don’t think that there’s any way of verifying it, if Arnold Spielberg—the father of Steven Spielberg—isn’t the oldest living subscriber to Analog, he must be close.

This sheds light on his son’s career, although perhaps not in the way that you might think. Spielberg is such a massively important figure that his very existence realigns the history of the genre, and when he speaks of his influences, we need to be wary of the shadow cast by his inescapable personality. But there’s no denying the power—and truth—of the image of Arnold Spielberg reading from the pulps aloud to his son. It feels like an image from one of Spielberg’s own movies, which has been shaped from the beginning by the tradition of oral storytelling. (It’s worth noting, though, that the father might recall things differently than the son. In his biography of the director, Joseph McBride quotes Arnold Spielberg: “I’ve been reading science fiction since I was seven years old, all the way back to the earliest Amazing Stories. Amazing, Astounding, Analog—I still subscribe. I still read ’em. My kids used to complain, ‘Dad’s in the bathroom with a science-fiction magazine. We can’t get in.'”) For Spielberg, the stories seem inextricably linked with the memory of being taken outside by his father to look at the stars:

My father was the one that introduced me to the cosmos. He’s the one who built—from a big cardboard roll that you roll rugs on—a two-inch reflecting telescope with an Edmund Scientific kit that he had sent away for. [He] put this telescope together, and then I saw the moons of Jupiter. It was the first thing he pointed out to me. I saw the rings of Saturn around Saturn. I’m six, seven years old when this all happened.

Spielberg concludes: “Those were the stories, and just looking up at the sky, that got me to realize, if I ever get a chance to make a science fiction movie, I want those guys to come in peace.”

But it also testifies to the ways in which a strong personality will take exactly what it needs from its source material. Elsewhere in the interview, there’s another intriguing reference:

Spielberg: I always go for the heart first. Of course, sometimes I go for the heart so much I get a little bit accused of sentimentality, which I’m fine [with] because…sometimes I need to push it a little further to reach a little deeper into a society that is a little less sentimental than they were when I was a young filmmaker.

Cameron: You pushed it in the same way that John W. Campbell pushed science fiction [forward] from the hard-tech nerdy guys who had to put PhD after their name to write science fiction. It was all just about the equations and the math and the physics [and evolved to become much more] human stories [about] the human heart.

I see what Cameron is trying to say here, but if you’ve read enough of the magazine that turned into Analog, this isn’t exactly the impression that it leaves. It’s true that Campbell put a greater emphasis than most of his predecessors on characterization, at least in theory, but the number of stories that were about “the human heart” can be counted on two hands, and none were exactly Spielbergian—although they might seem that way when filtered through the memory of his father’s voice. And toward the end, the nerds took over again. In Dangerous Visions, which was published in 1967, Harlan Ellison wrote of “John W. Campbell, Jr., who used to edit a magazine that ran science fiction, called Astounding, and who now edits a magazine that runs a lot of schematic drawings, called Analog.” It was the latter version of the magazine that Spielberg would have seen as a boy—which may be why, when the time came, he made a television show called Amazing Stories.

A director’s education


As for the various functions of the craft you are involved with, the best way is to practice them yourself. It’s less important to read about acting and to study the advice of great teachers than to act yourself. If you can, work in a summer theatre. Or get a part in a play or a movie and submit to the will of the director. Design a set for a production, create the costumes for another. Knuckle down and do it. Lighting is critical—you are creating another world, not a real world—and experience in this is absolutely necessary. You must know what the choices are and what the problems are and what the techniques are and what the available materials are before you can successfully guide others. Learn by doing. Fail, but try everything you can. Take subsidiary positions. Be an assistant director, a call boy, be a stage manager. Go to a dance class and learn how choreographers work and what keeps a dance moving, what the classical movements are. Everything is relevant.

Then read. What a pleasurable way of life it is that requires you to study everything! Collect books. Collect clippings. Cut out every illustration in papers or magazines that attracts you. It caught your eye for a reason. There are marvelous compositions in newspapers, especially the bad ones…Keep a diary. It will force you to articulate your observations, and it will train your eyes and ears to see and hear and notice. Carry a pocket notebook. Always have a pen with you. A half-hour’s walk can be valuable. Every street in New York can provide an encounter, and all encounters that surprise you are precious. A bus or subway ride can be a treasure trip. Don’t take taxis. You see nothing and learn nothing in a taxi—it’s a waste of time.

Elia Kazan, Kazan on Directing

Written by nevalalee

May 12, 2018 at 7:30 am

Playing the game


Yesterday, the magazine PC Gamer published an article by Alex Wiltshire on the challenges of writing for blockbuster video games. It’s an illuminating piece, especially if you haven’t given much thought to the subject before, and it’s loaded with interesting insights. What struck me the most, though, was the way in which the writers tend to talk about themselves and their craft. Walt Williams, the author of a memoir about game development that I clearly need to read, says of the trade: “As much as we like to say that video games can be a narrative medium, financially they’re really not…Writing is expendable.” Tom Bissell, who has worked on games in the Gears of War and Uncharted series, has similar views, as Wiltshire writes:

Bissell says that games have “shitty stories” because games are often simply absurd. “That’s not a criticism, it’s an acknowledgment of the reality that stares anyone working on an action game right in the face…The only way you escape the absurdity problem is through sheer force of will, and you can do that only when the prime creative force behind the game is also overseeing virtually every aspect of it…That’s not a position most game writers will ever find themselves in, obviously.”

And Williams concludes: “Our biggest mistake is that we’ve decided to consider AAA [blockbuster] games as something better than they are. We like to think our super-silly destruction derby arena is a piece of serious art that can say something meaningful.”

As I read this, I was strongly reminded of what another writer says about an art form that had been around for decades, but was still in its formative stages at the height of his career:

The movies are one of the bad habits that corrupted our century…The persistent banality of the movies is due to the “vision” of their manufacturers. I do not mean by manufacturers, writers or directors. These harassed toilers are no more than the lowest of Unteroffizieren in movieland. The orders come from the tents of a dozen invisible generals. The “vision” is theirs. They keep a visionary eye glued to the fact that the lower in class an entertainment product is, the more people will buy it…[The studio head] must examine every idea, plot, or venture submitted to him from the single point of view of whether it is trite enough to appeal to the masses.

The writer here is the screenwriter Ben Hecht, whose memoir A Child of the Century is filled with what Pauline Kael describes in “Raising Kane” as his “frivolously cynical” view of filmmaking. In 1925, Hecht, who had only seen “a few movies” at the time, said confidently to his friend Herman J. Mankiewicz: “Anybody with a good memory for clichés and unafraid to write like a child can bat out a superb movie in a few days.” A year later, Mankiewicz—who would go on to win an Oscar for Citizen Kane—took him at his word and cabled Hecht from Hollywood: “Will you accept three hundred per week to work for Paramount Pictures? All expenses paid. The three hundred is peanuts. Millions are to be grabbed out here and your only competition is idiots. Don’t let this get around.”

Hecht went on to a legendary career, much of which was spent serving as what Tomb Raider writer Rhianna Pratchett calls a “narrative paramedic” on movies like Gone With the Wind. And while I doubt that any video game writers are earning millions from their work, their attitude toward their medium seems largely the same as Hecht’s, even before you account for the intervening ninety years. Hecht writes of the films of the thirties:

One basic plot only has appeared daily in their fifteen hundred theaters—the triumph of virtue and the overthrow of wickedness…Not only was the plot the same, but the characters in it never varied. These characters must always be good or bad (and never human) in order not to confuse the plot of Virtue Triumphing. This denouement could be best achieved by stereotypes a fraction removed from those in the comic strips.

Despite their occasional stabs at moral ambiguity, most games operate under similar constraints, and the situation is only exacerbated by the money and resources at stake. Hecht writes that “millions of dollars and not mere thousands were involved,” while Bissell says that video games are “possibly the most complicated popular art form ever created,” which only decreases any tolerance for risk. Invariably, it’s the writers who lose. Hecht says that he ultimately lost every fight that he had with his producers, adding mordantly: “Months later, watching ‘my’ movie in a theater, I realize that not much damage had actually been done. A movie is basically so trite and glib that the addition of a half dozen miserable inanities does not cripple it.”

You might think that the solution would be to give the writers more control, but those on the inside seem unconvinced. Wiltshire writes:

For Bissell it’s a misconception that they’d improve if only writers were more integral with development. “Sorry, but that’s just not true in my experience. Games can go wrong in so many ways that have nothing to do with who the writer is or how well or poorly he or she or they are treated. Sometimes cleaning up the mess in a wayward game falls on level design and sometimes art and sometimes narrative, but this idea that games have ‘shitty stories’ because there aren’t good writers in the industry, or that writers aren’t listened to, is, to be perfectly frank, a deflection.”

Hecht makes much the same observation: “In a curious way, there is not much difference between the product of a good writer and a bad one. They both have to toe the same mark.” Which seems to be the real point in common. Movies and video games can both produce masterpieces, even at their most commercial, but on the blockbuster level, they tend to be the sum of a pattern of forces, with the writer serving as a kind of release valve for the rest, even if his or her contributions are usually undervalued. (“Everyone writes, whereas not everyone designs or codes, and I think people feel they have a stake in it,” says Phil Huxley, a former writer for Rocksteady.) In both cases, success or failure can be a matter of luck, and in the meantime, the game has to be its own reward, as Hecht knows well: “Making movies is a game played by a few thousand toy-minded folk. It is obsessive, exhausting, and jolly, as a good game should be. Played intently, it divorces you from life, as a good game will do.”

Thinkers of the unthinkable


At the symposium that I attended over the weekend, the figure whose name seemed to come up the most was Herman Kahn, the futurologist and military strategist best known for his book On Thermonuclear War. Kahn died in 1983, but he still looms large over futures studies, and there was a period in which he was equally inescapable in the mainstream. As Louis Menand writes in a harshly critical piece in The New Yorker: “Herman Kahn was the heavyweight of the Megadeath Intellectuals, the men who, in the early years of the Cold War, made it their business to think about the unthinkable, and to design the game plan for nuclear war—how to prevent it, or, if it could not be prevented, how to win it, or, if it could not be won, how to survive it…The message of [his] book seemed to be that thermonuclear war will be terrible but we’ll get over it.” And it isn’t surprising that Kahn engaged in a dialogue throughout his life with science fiction. In her book The Worlds of Herman Kahn, Sharon Ghamari-Tabrizi relates:

Early in life [Kahn] discovered science fiction, and he remained an avid reader throughout adulthood. While it nurtured in him a rich appreciation for plausible possibilities, [his collaborator Anthony] Wiener observed that Kahn was quite clear about the purposes to which he put his own scenarios. “Herman would say, ‘Don’t imagine that it’s an arbitrary choice as though you were writing science fiction, where every interesting idea is worth exploring.’ He would have insisted on that. The scenario must focus attention on a possibility that would be important if it occurred.” The heuristic or explanatory value of a scenario mattered more to him than its accuracy.

Yet Kahn’s thinking was inevitably informed by the genre. Ghamari-Tabrizi, who refers to nuclear strategy as an “intuitive science,” sees in On Thermonuclear War hints of “the scientist-sleuth pulp hero,” which is just another name for the competent man, and Kahn himself openly acknowledged the speculative thread in his work: “What you are doing today fundamentally is organizing a Utopian society. You are sitting down and deciding on paper how a society at war works.” On at least one occasion, he invoked psychohistory directly. In the revised edition of the book Thinking About the Unthinkable, Kahn writes of one potential trigger for a nuclear war:

Here we turn from historical fact to science fiction. Isaac Asimov’s Foundation novels describe a galaxy where there is a planet of technicians who have developed a long-term plan for the survival of civilization. The plan is devised on the basis of a scientific calculation of history. But the plan is upset and the technicians are conquered by an interplanetary adventurer named the Mule. He appears from nowhere, a biological mutant with formidable personal abilities—an exception to the normal laws of history. By definition, such mutants rarely appear but they are not impossible. In a sense, we have already seen a “mule” in this century—Hitler—and another such “mutant” could conceivably come to power in the Soviet Union.

And it’s both frightening and revealing, I think, that Kahn—even as he was thinking about the unthinkable—doesn’t take the next obvious step, and observe that such a mutant could also emerge in the United States.

Asimov wouldn’t have been favorably inclined toward the notion of a “winnable” nuclear war, but Kahn did become friendly with a writer whose attitudes were more closely aligned with his own. In the second volume of Robert A. Heinlein: In Dialogue with His Century, William H. Patterson describes the first encounter between the two men:

By September 20, 1962, [the Heinleins] were in Las Vegas…[They] met Dr. Edward Teller, who had been so supportive of the Patrick Henry campaign, as well as one of Teller’s colleagues, Herman Kahn. Heinlein’s ears pricked up when he was introduced to this jolly, bearded fat man who looked, he said, more like a young priest than one of the sharpest minds in current political thinking…Kahn was a science fiction reader and most emphatically a Heinlein fan.

Three years later, Heinlein attended a seminar, “The Next Ten Years: Scenarios and Possibilities,” that Kahn held at the Hudson Institute in New York. Heinlein—who looked like Quixote to Kahn’s Sancho Panza—was flattered by the reception:

If I attend an ordinary cocktail party, perhaps two or three out of a large crowd will know who I am. If I go to a political meeting or a church or such, I may not be spotted at all…But at Hudson Institute, over two-thirds of the staff and over half of the students button-holed me. This causes me to have a high opinion of the group—its taste, IQ, patriotism, sex appeal, charm, etc. Writers are incurably conceited and pathologically unsure of themselves; they respond to stroking the way a cat does.

And it wasn’t just the “stroking” that Heinlein liked, of course. He admired Thinking About the Unthinkable and On Thermonuclear War, both of which would be interesting to read alongside Farnham’s Freehold, which was published just a few years later. Both Heinlein and Kahn thought about the future through stories, in a pursuit that carried a slightly disreputable air, as Kahn implied in his use of the word “scenario”:

As near as I can tell, the term scenario was first used in this sense in a group I worked with at the RAND Corporation. We deliberately chose the word to deglamorize the concept. In writing the scenarios for various situations, we kept saying “Remember, it’s only a scenario,” the kind of thing that is produced by Hollywood writers, both hacks and geniuses.

You could say much the same about science fiction. And perhaps it’s appropriate that Kahn’s most lasting cultural contribution came out of Hollywood. Along with Wernher von Braun, he was one of the two most likely models for the title character in Dr. Strangelove. Stanley Kubrick immersed himself in Kahn’s work—the two men met a number of times—and Kahn’s reaction to the film was that of a writer, not a scientist. As Ghamari-Tabrizi writes:

The Doomsday Machine was Kahn’s idea. “Since Stanley lifted lines from On Thermonuclear War without change but out of context,” Kahn told reporters, he thought he was entitled to royalties from the film. He pestered him several times about it, but Kubrick held firm. “It doesn’t work that way!” he snapped, and that was that.

The crowded circle


Earlier this week, Thrillist posted a massive oral history devoted entirely to the climactic battle scene in The Avengers. It’s well over twelve thousand words, or fifty percent longer than Ronan Farrow’s Pulitzer Prize-winning investigation of Harvey Weinstein, and you can occasionally feel it straining to justify its length. In its introduction, it doesn’t shy away from the hard sell:

Scholars swore that comic-book moviemaking peaked with Christopher Nolan’s lauded vision for The Dark Knight, yet here was an alternative, propulsive, prismatic, and thoughtful…The Battle of New York wasn’t just a third-act magic trick; it was a terraforming of the blockbuster business Hollywood believed it understood.

To put it mildly, this slightly overstates the case. Yet the article is still worth reading, both for its emphasis on the contributions of such artists as storyboard artist Jane Wu and for the presence of director Joss Whedon, who casually throws shade in all directions, including at himself. For instance, at one point, Ryan Meinerding, the visual effects department supervisor, recalls of the design of the alien guns: “We tried to find something that, if Black Widow got ahold of one of their weapons, she could use it in an interesting way. Which is how we ended up with that sort of long Civil War weapons.” Whedon’s perspective is somewhat different: “I look back, and I’m like, So my idea for making the weapons look different was to give them muskets? Did I really do that? Was that the sexiest choice? Muskets? Okay. But you know, hit or miss.”

These days, I can’t listen to Whedon’s studiously candid, self-deprecating voice in quite the way that I once did, but he’s been consistently interesting—if not always convincing—on points of craft, and his insights here are as memorable as usual. My favorite moment comes when he discusses the structure of the sequence itself, which grew from an idea for what he hoped would be an iconic image:

We’re going to want to see the group together. We’re going to want to do a shot of everyone back to back. Now we are a team. This is “The Avengers.” We’d get them in a circle and all facing up. Ryan Meinerding painted the team back to back, and that’s basically what I shot. They’re so kinetic and gorgeous, and he has a way of taking comic books and really bringing them to life, even beyond Alex Ross in a way that I’ve never seen…But then it was like, okay, why are they in a circle? That’s where they’re standing, but why? Let’s assume that there are aliens all over the walls, they’re surrounding them, they’re going to shoot at them, but they haven’t started yet. Why haven’t they started yet? And I was like Oh, let’s give the aliens a war cry… Then one of the aliens takes off his mask because we need to see their faces and hear that cry. The Avengers are surrounded by guys going, “We are going to fuck you up.” But not by guys who are shooting yet.

He concludes: “So there is a very specific reason that sort of evolved more and more right before we shot it. And then it’s like, okay, we got them here, and then once they’re there, you’re like, okay, how do we get them to the next thing?”

On some level, this is the kind of thing I should love. As I’ve discussed here before, the big beats of a story can emerge from figuring out what comes before and after a single moment, and I always enjoy watching a writer work through such problems in the most pragmatic way possible. In this case, though, I’m not sure about the result. The third act of The Avengers has always suffered a little, at least for me, from its geographic constraints. A handful of heroes have to credibly fend off an attack from an alien army, which naturally limits how big or dispersed the threat can be, and it seems strange that an invasion of the entire planet could be contained within a few blocks, even if they happen to include the photogenic Park Avenue Viaduct. The entire conception is undermined by the need to keep most of the characters in one place. You could imagine other possible climaxes—a chase, an assault on the enemy stronghold, a battle raging simultaneously at different locations around the world—that would have involved all the major players while still preserving a sense of plausibility and scale. But then you wouldn’t have gotten that circle shot. (Elsewhere in the article, Whedon offers a weirdly condescending aside about Zak Penn’s original draft of the script: “I read it one time, and I’ve never seen it since. I was like, ‘Nope. There’s nothing here.’ There was no character connection. There was a line in the stage directions that said, apropos of nothing, ‘And then they all walk towards the camera in slow motion because you have to have that.’ Yeah, well, no: You have to earn that.” Which sounds more to me like Whedon defensively dismissing the kind of joke that he might have made himself. And you could make much the same criticism of the circle shot that he had in mind.)

And the whole anecdote sums up my mixed feelings toward the Marvel Universe in general and The Avengers in particular. On its initial release, I wrote that “a lot of the film, probably too much, is spent slotting all the components into place.” That certainly seems to have been true of the climax, which also set a dangerous precedent in which otherwise good movies, like The Winter Soldier, felt obliged to end in a blur of computer effects. And it’s even more clear now that Whedon’s tastes and personality were only occasionally allowed to shine through, often in the face of active opposition from the studio. (Of one of the few moments from the entire movie that I still recall fondly, Whedon remembers: “There were objections to Hulk tossing Loki. I mean, strong objections. But they were not from Kevin [Feige] and Jeremy [Latcham], so I didn’t have to worry.”) Marvel has since moved on to movies like Captain America: Civil War, Thor: Ragnarok, and Black Panther, many of which are authentically idiosyncratic, fun, and powerful in a way that the studio’s defining effort managed only intermittently. But it’s revealing that the last two films were mostly allowed to stand on their own, which is starting to seem like a luxury. Marvel is always trying to get to that circle shot, and now the numbers have been multiplied by five. It reflects what I’ve described as the poster problem, which turns graphic design—or storytelling—into an exercise in crowd control. I’m looking forward to Avengers: Infinity War, but my expectations have been tempered in ways for which The Avengers itself, and specifically its climactic battle, was largely responsible. As Whedon concedes: “Sometimes you have to do the shorthand version, and again, that’s sort of against how I like to view people, but it’s necessary when you already have twenty major characters.”

One breath, one blink


Gene Hackman in The Conversation

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on February 14, 2017.

A while back, my wife, who is a professional podcaster, introduced me to the concept of the “breath” in audio editing. When you’re putting together an episode for a medium like radio, you often find yourself condensing an interview or splicing together two segments, and you can run into trouble when those edits interfere with the speaker’s natural breathing rhythms. As an excellent tutorial from NPR explains it:

Breaths are a problem when they are upcut or clipped. An upcut breath is one that is edited so it’s incomplete (or “chopped”)—only the first or last part is audible…Missing breaths are just that—breaths that have been removed or silenced. They sound unnatural and can cause some listeners to feel tense…Breaths are also problematic when they don’t match the cadence of the speech (i.e. a short, quick breath appears in the middle of a slower passage)…

When editing breaths, listen closely to the beginning and end. If replacing a breath, choose one that matches the cadence and tone of the words around it.

For example, a short, quick breath is useful during an interruption or an excited, quick-paced reply. A longer breath is appropriate for a relaxed, measured response…As a rule of thumb, do not remove breaths—it sounds unnatural.

I’m particularly interested in the idea that a poorly edited breath can make the listener feel anxious without knowing it, which reminds me of something that the film editor Walter Murch says in his book In the Blink of an Eye. Murch writes that when he was editing Francis Ford Coppola’s The Conversation, he noticed that Harry Caul, the character played by Gene Hackman, would frequently blink around the point where Murch himself had decided to make a cut. “It was interesting,” Murch says, “but I didn’t know what to make of it.” Then he happened to read an interview with the director John Huston that shed an unexpected light on the subject:

To me, the perfect film is as though it were unwinding behind your eyes…Look at that lamp across the room. Now look back at me. Look back at that lamp. Now look back at me again. Do you see what you did? You blinked. Those are cuts. After the first look, you know that there’s no reason to pan continuously from me to the lamp because you know what’s in between. Your mind cut the scene. First you behold the lamp. Cut. Then you behold me.

Murch was fascinated by this, and he began to pay closer attention to blinking’s relationship to emotional or cognitive states. He concluded that blinks tend to occur at instants in which an internal separation of thought has taken place, either to help it along or as an involuntary reflex that coincides with a moment of transition. (It also reminds me a little of the work of the philosopher Andy Clark, who notes, as Huston did, that the mind only processes a scene when something changes.)

Walter Murch

As Murch writes in In the Blink of an Eye: “Start a conversation with somebody and watch when they blink. I believe you will find that your listener will blink at the precise moment he or she ‘gets’ the idea of what you are saying, not an instant earlier or later…And that blink will occur where a cut could have happened, had the conversation been filmed.” This doesn’t necessarily mean that an editor should worry about when the actors are blinking, but that if he or she is making the cut in the right spot, as a kind of visual punctuation, the blinks and the cuts will coincide anyway. Apart from Murch’s anecdotal observations, I don’t know if this phenomenon has ever been studied in detail, but it’s intriguing. For instance, it suggests that breathing in audio and blinking in film are two aspects of the same thing. Both are physiological phenomena, but they’re also connected with cognition in profound ways, especially when we’re trying to communicate with others. When we’re talking to someone else, we don’t stop to breathe in arbitrary places, but at moments when the sense of what we’re saying has reached a natural break. Hence the function of the comma, which is a visual marker that sets apart clauses or units of information on the page, as well as a vestigial trace of the pause that would have occurred in conversation—even if we usually don’t stop when we’re reading it silently to ourselves. And I’ve spoken elsewhere of the relationship between breathing and the length of sentences or lines of poetry, in which the need to breathe is inseparable from the necessity of pausing for consolidation or comprehension.

Editors care about these issues because they’re essentially playing a confidence trick. They’re trying to create an impression of continuity while assembling many discrete pieces, and if they fail to honor the logic of the breath or the blink, the listener or viewer will subconsciously sense it. This is the definition of a thankless task, because you’ll never notice it when it works, and when it doesn’t, you probably won’t even be able to articulate the problem. I suspect that the uneasiness created by a poorly edited stretch of audio or film comes from the rhythms of one’s own body falling out of sync with the story: when a work of art is flowing properly, we naturally adjust ourselves to its rhythms, and a dropped or doubled breath can shake us out of that sense of harmony. After a while, addressing this becomes a matter of instinct, and a skilled editor will unconsciously take these factors into account, much as an author eventually learns to write smoothly without worrying about it too much. We only become aware of it when something feels wrong. (It’s also worth paying close attention to it during the revision phase. The NPR tutorial notes that problems with breaths can occur when the editor tries to “nickel and dime” an interview to make it fit within a certain length. And when James Cameron tried to cut Terminator 2 down to its contractual length by removing just a single frame per second from the whole movie, he found that the result was unwatchable.) When we’re awake, no matter what else we might be doing, we’re breathing and blinking. And it’s a testament to the challenges that all editors face that they can’t even take breathing for granted.

Written by nevalalee

April 17, 2018 at 8:23 am

The men who sold the movies


Yesterday, I noted that although Isaac Asimov achieved worldwide fame as a science fiction writer, his stories have inspired surprisingly few cinematic adaptations, despite the endless attempts to do something with the Foundation series. But there’s a more general point to be made here, which is the relative dearth of movies based on the works of the four writers whom I discuss in Astounding. Asimov has a cheap version of Nightfall, Bicentennial Man, and I, Robot. John W. Campbell has three versions of The Thing and nothing else. L. Ron Hubbard, who admittedly is a special case, just has Battlefield Earth, while Robert A. Heinlein has The Puppet Masters, Starship Troopers and its sequels, and the recent Predestination. Obviously, this isn’t bad, and most writers, even successful ones, never see their work onscreen at all. But when you look at someone like Philip K. Dick, whose stories have been adapted into something like three television series and ten feature films, this scarcity starts to seem odd, even when you account for other factors. Hubbard is presumably off the table, and the value of Campbell’s estate, to be honest, consists entirely of “Who Goes There?” It’s also possible that much of Asimov’s work just isn’t very cinematic. But if you’re a Heinlein fan, it’s easy to imagine an alternate reality in which we can watch adaptations of “If This Goes On—,” “The Roads Must Roll,” “Universe,” “Gulf,” Tunnel in the Sky, Have Space Suit—Will Travel, Glory Road, The Moon is a Harsh Mistress, and three different versions of Stranger in a Strange Land—the corny one from the seventies, the slick but empty remake from the late nineties, and the prestige television adaptation that at least looked great on Netflix.

That isn’t how it turned out, but it wasn’t for lack of trying. Various works by Heinlein and Asimov have been continuously under option for decades, and three out of these four authors made repeated efforts to break into movies or television. Hubbard, notably, was the first, with a sale to Columbia Pictures of an unpublished story that he adapted into the serial The Secret of Treasure Island in 1938. He spent ten weeks on the studio lot, doing uncredited rewrites on the likes of The Adventures of the Mysterious Pilot and The Great Adventures of Wild Bill Hickok, and he would later claim, without any evidence, that he had worked on the scripts for Stagecoach, Dive Bomber, and The Plainsman. Decades later, Hubbard actively shopped around the screenplay for Revolt in the Stars, an obvious Star Wars knockoff, and among his last works were the scripts Ai! Pedrito! and A Very Strange Trip. Campbell, in turn, hosted the radio series Exploring Tomorrow; corresponded with the producer Clement Fuller about the television series The Unknown, with an eye to adapting his own stories or writing originals; and worked briefly as a freelance story editor for the syndicated radio series The Planet Man. Heinlein had by far the most success—he wrote Rocket Ship Galileo with one eye toward the movies, and he developed a related project with Fritz Lang before partnering with George Pal on Destination Moon. As I mentioned last week, he worked on the film Project Moon Base and an unproduced teleplay for a television show called Century XXII, and he even had the dubious privilege of suing Roger Corman for plagiarism over The Brain Eaters. And Asimov seethed with jealousy:

[Destination Moon] was the first motion picture involving one of us, and while I said not a word, I was secretly unhappy. Bob had left our group and become famous in the land of the infidels…I don’t know whether I simply mourned his loss, because I thought that now he would never come back to us; or whether I was simply and greenly envious. All I knew was that I felt more and more uncomfortable. It was like having a stomachache in the mind, and it seemed to spoil all my fun in being a science fiction writer.

But Asimov remained outwardly uninterested in the movies, writing of one mildly unpleasant experience: “It showed me again what Hollywood was like and how fortunate I was to steer as clear of it as possible.” It’s also hard to imagine him moving to Los Angeles. Yet he was at least open to the possibility of writing a story for Paul McCartney, and his work was often in development. In Nat Segaloff’s recent biography A Lit Fuse: The Provocative Life of Harlan Ellison, we learn that the television producer John Mantley had held an option on I, Robot “for some twenty years” when Ellison was brought on board in 1978. (This isn’t exactly right—Asimov states in his memoirs that Mantley first contacted him on August 11, 1967, and it took a while for a contract to be signed. But it was still a long time.) Asimov expressed hope that the adaptation would be “the first really adult, complex, worthwhile science fiction movie ever made,” which incidentally sheds light on his opinion of 2001, but it wasn’t meant to be. As Segaloff writes:

For a year from December 1977 Ellison was, as he has put it, “consumed with the project.” He used Asimov’s framework of a reporter, Robert Bratenahl, doing a story about Susan Calvin’s former lover, Stephen Byerly, and presented four of Calvin’s stories as flashbacks, making her the central figure, even in events that she could not have witnessed. It was a bold and admittedly expensive adaptation…When no response was forthcoming, Ellison arranged an in-person meeting with [Warner executive Bob] Shapiro on October 25, 1978, during which he realized that the executive had not read the script.

Ellison allegedly told Shapiro: “You’ve got the intellectual capacity of an artichoke.” He was fired from the project a few months later.

And the case of I, Robot hints at why these authors have had only limited success in Hollywood. As Segaloff notes, the burst of interest in such properties was due mostly to the success of Star Wars, and after Ellison left, a few familiar names showed up:

Around June 1980, director Irvin Kershner, who had made a success with The Empire Strikes Back, expressed interest, but when he was told that Ellison would not be rehired to make changes, according to Ellison his interest vanished…In 1985, Gary Kurtz, who produced the Star Wars films, made inquiries but was told that the project would cost too much to shoot, both because of its actual budget and the past expenses that had been charged against it.

At various points, in other words, many of the same pieces were lined up for I, Robot that had been there just a few years earlier for Star Wars. (It’s worth noting that less time separates Star Wars from these abortive attempts than lies between us and Inception, which testifies to how vivid its impact still was.) But it didn’t happen a second time, and I can think of at least one good reason. In conceiving his masterpiece, George Lucas effectively skipped the golden age entirely to go back to an earlier pulp era, which spoke more urgently to him and his contemporaries—which may be why we had a television show called Amazing Stories and not Astounding. Science fiction in the movies often comes down to an attempt to recreate Star Wars, and if that’s your objective, these writers might as well not exist.
