Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Archive for the ‘Television’ Category

A comedian reads the newspaper

A few days ago, I was leafing through Ladies and Gentlemen—Lenny Bruce, the monumental biography of the legendary standup comic by Albert Goldman and Lawrence Schiller. My eye was caught by a description of a typical performance by Bruce, who died in 1966:

When Lenny starts to spritz, interspersed with the hip jargon, riding along the bops and beats of his Broadway-Brooklyn tachycardic speech pattern, are allusions to big sounds like Stravinsky, Picasso, Charlie Parker, José Limon and James Joyce. Jazz, existentialism, analysis, peyote cults, and California. He’s concerned about the racial scene and the man in the White House and the economy, the way the country is changing. Speaks from experience, done an awful lot of reading.

These days, we may not expect our comedians to drop allusions to Stravinsky or José Limon, but we’re still interested in what they have to say about “the racial scene and the man in the White House and the economy, the way the country is changing.” It’s part of a tradition of turning to standup comics for wisdom—or truth—that can largely be traced back to Bruce himself. And here’s the punchline, as Goldman delivers it: “The image is a bitch to sustain. Lenny isn’t that knowledgeable about jazz. He’s never been to Europe since the Navy. Most everything he knows, he picks up from the movies.”

This pressure to seem informed about current events is one to which most of us can relate, and it must be particularly challenging to those figures who find themselves at the forefront of the culture, where we expect them to be inhumanly knowledgeable about everything while making the result seem effortless. As Goldman points out, though, there are ways of getting around it: “Mort Sahl found the solution before Lenny. It’s called osmosis.” He continues:

The way Sahl worked? Wherever he was, at home or on the road, he would have his room lined with magazines and books. He never read anything. A voracious skimmer. By flipping through this and staring at that, reading a sentence here and picking up a word there, he got a very good idea of where everything was. When he went into his monologue, you would swear that he had digested the whole world for that week. Charles de Gaulle, Dwight Eisenhower, segregation, Shelley Berman, trade unions, Marty, Dave Brubeck, New York, Berkeley, Beckett, newspapers, coffeehouses, sandals, J.D. Salinger, filter-tip cigarettes, the State Department, Dick Clark, German radios, birth control, Charles Van Doren, Adlai Stevenson, natural-shoulder suits, Cuba, Israel, Dave Garroway, the Diners’ Club, Billy Graham, sports cars, the Strategic Air Command—wow! A barrage!

And if you replace that catalog of topics with one that seems more current—Red Hen, zero tolerance, “This is America,” Harley-Davidson, and that’s just this week—it still captures something of what we expect from our late-night hosts and talking heads on a daily basis.

The ability to skim a newspaper and turn it into a monologue for an audience every night is a valuable skill, and it can earn millions for those who possess it. But there’s no particular reason that comedians or pundits need to do the skimming themselves. In the period about which Goldman is writing, Bruce’s solution centered on the unlikely figure of Terry Lane, his assistant and a former burlesque drummer:

Lenny doesn’t need all this crap. He has an imagination and he’s really funny, not just nervous, like Sahl. But the trick is the same. Neither a reader nor a skimmer, what’s he supposed to do? Just accept it? Be a schmuck? Oh, no! There are always people who can help you. You don’t have to take a lot of shit from them either. Just sit a guy like Terry down and say: “Now look man, here’s the gig. I need an intellectual seeing-eye dog. Somebody who can check out the papers every day, read Time and Newsweek, do a little research for me, and just set me up nice so when I go out on the floor tonight, I’m the best-informed person in the city. Dig?”

What Goldman is describing here is basically the relationship between a star comic and his head writer, as enacted in a seedy hotel room in Times Square instead of backstage at The Tonight Show. And while Terry Lane’s résumé may no longer be typical—his equivalent today would be more likely to have gone to Harvard—his personal qualifications are much the same: “What grabbed Lenny was the fact that Terry was a reader…Lenny hadn’t got the patience, the concentration, the sitzfleisch. When pushed too hard he got terrible headaches. But Terry there, at the table between shows, would sit, riddling off titles like a college English professor…Lenny was impressed.”

But the real takeaway here is how this approach to current events has expanded outward from the nightclubs to radio and cable news, which is where Bruce’s true successors can be found. Goldman nicely describes the skill in question:

And the system works fine. Terry or Richey or Benny or whoever is traveling with Lenny is always a smart, studious sort of cat, who can feed him facts and help him learn big new words out of the dictionary. After all, what is literacy? Words. How do you learn words? Hear them. If you have a good ear and a tongue that can mimic anything you hear, you can learn whole languages by rote. Lenny is a mind-mouth man. His brain is located somewhere between his ears and his tongue. All he has to do is get the hang of a word, and he finds a place to slip it into his act.

These days, many of us get our news from exactly such “mind-mouth” men or women, whose gift consists of taking a few headlines and spinning them into thirty minutes of daily content. On the left, they’ve traditionally come from the ranks of improv, standup, and sketch comedy; on the right, which has trouble coming up with funny people, from talk radio. (Rush Limbaugh got his start as a disc jockey, which points to the fact that his true power is the ability to talk into a microphone for hours.) I’m not denigrating this talent, which is so rare that only a handful of people seem capable of doing it for large audiences at any one time. And we could do worse than to take our political cues from the writers at The Daily Show. But it’s still a simulacrum of insight, rather than the real thing. And we need to think hard about what happens when so many people turn to it for their information—including the man in the White House.

Written by nevalalee

June 26, 2018 at 8:20 am

The ghost in the machine

Note: Spoilers follow for the season finale of Westworld.

When you’re being told a story, you want to believe that the characters have free will. Deep down, you know that they’ve been manipulated by a higher power that can make them do whatever it likes, and occasionally, it can even be fun to see the wires. For the most part, though, our enjoyment of narrative art is predicated on postponing that realization for as long as possible. The longer the work continues, the harder this becomes, and it can amount to a real problem for a heavily serialized television series, which can start to seem strained and artificial as the hours of plot developments accumulate. These tensions have a way of becoming the most visible in the protagonist, whose basic purpose is to keep the action clocking along. As I’ve noted here before, there’s a reason why the main character is often the least interesting person in sight. The show’s lead is under such pressure to advance the plot that he or she becomes reduced to the diagram of a pattern of forces, like one of the fish in D’Arcy Wentworth Thompson’s On Growth and Form, in which the animal’s physical shape is determined by the outside stresses to which it has been subjected. Every action exists to fulfill some larger purpose, which often results in leads who are boringly singleminded, with no room for the tangents that can bring supporting players to life. The characters at the center have to constantly triangulate between action, motivation, and relatability, which can drain them of all surprise. And if the story ever relaxes its hold, they burst, like sea creatures brought up from a crevasse to the surface.

This is true of most shows that rely heavily on plot twists and momentum—it became a huge problem for The Vampire Diaries—but it’s even more of an issue when a series is also trying to play tricks with structure and time. Westworld has done more than any other television drama that I can remember to push against the constraints of chronology, and the results are often ingenious. Yet they come at a price. (As the screenwriter Robert Towne put it in a slightly different context: “You end up paying for it with an almost mathematical certainty.”) And the victim, not surprisingly, has been the ostensible lead. Over a year and a half ago, when the first season was still unfolding, I wrote that Dolores, for all her problems, was the engine that drove the story, and that her gradual movement toward awareness was what gave the series its narrative thrust. I continued:

This is why I’m wary of the popular fan theory, which has been exhaustively discussed online, that the show is taking place in different timelines…Dolores’s story is the heart of the series, and placing her scenes with William three decades earlier makes nonsense of the show’s central conceit: that Dolores is slowly edging her way toward greater self-awareness because she’s been growing all this time. The flashback theory implies that she was already experiencing flashes of deeper consciousness almost from the beginning, which requires us to throw out most of what we know about her so far…It has the advantage of turning William, who has been kind of a bore, into a vastly more interesting figure, but only at the cost of making Dolores considerably less interesting—a puppet of the plot, rather than a character who can drive the narrative forward in her own right.

As it turned out, of course, that theory was totally on the mark, and I felt a little foolish for having doubted it for so long. But on a deeper level, I have to give myself credit for anticipating the effect that it would have on the series as a whole. At the time, I concluded: “Dolores is such a load-bearing character that I’m worried that the show would lose more than it gained by the reveal…The multiple timeline theory, as described, would remove the Dolores we know from the story forever. It would be a fantastic twist. But I’m not sure the show could survive it.” And that’s pretty much what happened, although it took another season to clarify the extent of the damage. On paper, Dolores was still the most important character, and Evan Rachel Wood deservedly came first in the credits. But in order to preserve yet another surprise, the show had to be maddeningly coy about what exactly she was doing, even as she humorlessly pursued her undefined mission. Every line was a cryptic hint about what was coming, and the payoff was reasonably satisfying. But I don’t know if it was worth it. Offhand, I can’t recall another series in which an initially engaging protagonist was reduced so abruptly to a plot device, and it’s hard not to blame the show’s conceptual and structural pretensions, which used Dolores as a valve for the pressure that was occurring everywhere else but at its center. It’s frankly impossible for me to imagine what Dolores would even look like if she were relaxing or joking around or doing literally anything except persisting grimly in her roaring rampage of revenge. Because of the nature of its ambitions, Westworld can’t give her—or any of its characters—the freedom to act outside the demands of the story. It’s willing to let its hosts be reprogrammed in any way that the plot requires. Which you’ve got to admit is kind of ironic.

None of this would really matter if the payoffs were there, and there’s no question that last night’s big reveal about Charlotte is an effective one. (Unfortunately, it comes at the expense of Tessa Thompson, who, like Wood, has seemed wasted throughout the entire season for reasons that have become evident only now.) But the more I think about it, the more I feel that this approach might be inherently unsuited for a season of television that runs close to twelve hours. When a conventional movie surprises us with a twist at the end, part of the pleasure is mentally rewinding the film to see how it plays in light of the closing revelation—and much of the genius of Memento, which was based on Jonathan Nolan’s original story, was that it allowed us to do this every ten minutes. Yet as Westworld itself repeatedly points out, there’s only so much information or complexity that the human mind can handle. I’m a reasonably attentive viewer, but I often struggled to recall what happened seven episodes ago, and the volume of data that the show presents makes it difficult to check up on any one point. Now that the season is over, I’m sure that if I revisited the earlier episodes, many scenes would take on an additional meaning, but I just don’t have the time. And twelve hours may be too long to make viewers wait for the missing piece that will lock the rest into place, especially when it comes at the expense of narrative interest in the meantime, and when anything truly definitive will need to be withheld for the sake of later seasons. It’s to the credit of Westworld and its creators that there’s little doubt that they have a master plan. They aren’t making it up as they go along. But this also makes it hard for the characters to make anything of themselves. None of us, the show implies, is truly in control of our actions, which may well be the case. But a work of art, like life itself, doesn’t seem worth the trouble if it can’t convince us otherwise.

Written by nevalalee

June 25, 2018 at 8:42 am

The president is collaborating

Last week, Bill Clinton and James Patterson released their collaborative novel The President Is Missing, which has already sold something like a quarter of a million copies. Its publication was heralded by a lavish two-page spread in The New Yorker, with effusive blurbs from just about everyone whom a former president and the world’s bestselling author might be expected to get on the phone. (Lee Child: “The political thriller of the decade.” Ron Chernow: “A fabulously entertaining thriller.”) If you want proof that the magazine’s advertising department is fully insulated from its editorial side, however, you can just point to the fact that the task of reviewing the book itself was given to Anthony Lane, who doesn’t tend to look favorably on much of anything. Lane’s style—he has evidently never met a smug pun or young starlet he didn’t like—can occasionally turn me off from his movie reviews, but I’ve always admired his literary takedowns. I don’t think a month goes by that I don’t remember his writeup of the New York Times bestseller list of May 15, 1994, which allowed him to tackle the likes of The Bridges of Madison County, The Celestine Prophecy, and especially The Day After Tomorrow by Allan Folsom, from which he quoted a sentence that permanently changed my view of such novels: “Two hundred European cities have bus links with Frankfurt.” But he seems to have grudgingly liked The President Is Missing. If nothing else, he furnishes a backhanded compliment that has already been posted, hilariously out of context, on Amazon: “If you want to make the most of your late-capitalist leisure-time, hit the couch, crack a Bud, punch the book open, focus your squint, and enjoy.”

The words “hit the couch, crack a Bud, punch the book open, [and] focus your squint,” are all callbacks to samples of Patterson’s prose that Lane quotes in the review, but the phrase “late-capitalist leisure-time” might require some additional explanation. It’s a reference to the paper “Structure over Style: Collaborative Authorship and the Revival of Literary Capitalism,” which appeared last year in Digital Humanities Review, and I’m grateful to Lane for bringing it to my attention. The authors, Simon Fuller and James O’Sullivan, focus on the factory model of novelists who employ ghostwriters to boost their productivity, and their star exhibit is Patterson, to whom they devote the same kind of computational scrutiny that has previously uncovered traces of collaboration in Shakespeare. Not surprisingly, it turns out that Patterson doesn’t write most of the books that he ostensibly coauthors. (He may not even have done much of the writing on First to Die, which credits him as the sole writer.) But the paper is less interesting for its quantitative analysis than for its qualitative evaluation of what Patterson tells us about how we consume and enjoy fiction. For instance:

The form of [Patterson’s] novels also appears to be molded by contemporary experience. In particular, his work is perhaps best described as “commuter fiction.” Nicholas Paumgarten describes how the average time for a commute has significantly increased. As a result, reading has increasingly become one of those pursuits that can pass the time of a commute. For example, a truck driver describes how “he had never read any of Patterson’s books but that he had listened to every single one of them on the road.” A number of online reader reviews also describe Patterson’s writing in terms of their commutes…With large print, and chapters of two or three pages, Patterson’s works are constructed to fit between the stops on a metro line.

Of course, you could say much the same of many thrillers, particularly the kind known as the airport novel, which wasn’t just a book that you read on planes—at its peak, it was one in which many scenes took place in airports, which were still associated with glamor and escape. What sets Patterson apart from his peers is his ability to maintain a viable brand while publishing a dozen books every year. His productivity is inseparable from his use of coauthors, but he wasn’t the first. Fuller and O’Sullivan cite the case of Alexandre Dumas, who allegedly boasted of having written four hundred novels and thirty-five plays that had created jobs for over eight thousand people. And they dig up a remarkable quote from The German Ideology by Karl Marx and Friedrich Engels, who “favorably compare French popular fiction to the German, paying particular attention to the latter’s appropriation of the division of labor”:

In proclaiming the uniqueness of work in science and art, [Max] Stirner adopts a position far inferior to that of the bourgeoisie. At the present time it has already been found necessary to organize this “unique” activity. Horace Vernet would not have had time to paint even a tenth of his pictures if he regarded them as works which “only this Unique person is capable of producing.” In Paris, the great demand for vaudevilles and novels brought about the organization of work for their production, organization which at any rate yields something better than its “unique” competitors in Germany.

These days, you could easily imagine Marx and Engels making a similar case about film, by arguing that the products of collaboration in Hollywood have often been more interesting, or at least more entertaining, than movies made by artists working outside the system. And they might be right.

The analogy to movies and television seems especially appropriate in the case of Patterson, who has often drawn such comparisons himself, as he once did to The Guardian: “There is a lot to be said for collaboration, and it should be seen as just another way to do things, as it is in other forms of writing, such as for television, where it is standard practice.” Fuller and O’Sullivan compare Patterson’s brand to that of Alfred Hitchcock, whose name was attached to everything from Dell anthologies to The Three Investigators to Alfred Hitchcock’s Mystery Magazine. It’s a good parallel, but an even better one might be hiding in plain sight. In her recent profile of the television producer Ryan Murphy, Emily Nussbaum evokes an ability to repackage the ideas of others that puts even Patterson to shame:

Murphy is also a collector, with an eye for the timeliest idea, the best story to option. Many of his shows originate as a spec script or as some other source material. (Murphy owned the rights to the memoir Orange Is the New Black before Jenji Kohan did, if you want to imagine an alternative history of television.) Glee grew out of a script by Ian Brennan; Feud began as a screenplay by Jaffe Cohen and Michael Zam. These scripts then get their DNA radically altered and replicated in Murphy’s lab, retooled with his themes and his knack for idiosyncratic casting.

Murphy’s approach of retooling existing material in his own image might be even smarter than Patterson’s method of writing outlines for others to expand, and he’s going to need it. Two months ago, he signed an unprecedented $300 million contract with Netflix to produce content of all kinds: television shows, movies, documentaries. And another former president was watching. While Bill Clinton was working with Patterson, Barack Obama was finalizing a Netflix deal of his own—and if he needs a collaborator, he doesn’t have far to look.

The Prime of Miss Elizabeth Hoover

Yesterday, as I was working on my post for this blog, I found myself thinking about the first time that I ever heard of Lyme disease, which, naturally, was on The Simpsons. In the episode “Lisa’s Substitute,” which first aired on April 25, 1991, Lisa’s teacher, Miss Hoover, tells the class: “Children, I won’t be staying long. I just came from the doctor, and I have Lyme disease.” As Principal Skinner cheerfully explains: “Lyme disease is spread by small parasites called ‘ticks.’ When a diseased tick attaches itself to you, it begins sucking your blood. Malignant spirochetes infect your bloodstream, eventually spreading to your spinal fluid and on into the brain.” At the end of the second act, however, Miss Hoover unexpectedly returns, and I’ve never forgotten her explanation for her sudden recovery:

Miss Hoover: You see, class, my Lyme disease turned out to be psychosomatic.
Ralph: Does that mean you’re crazy?
Janie: It means she was faking it.
Miss Hoover: No, actually, it was a little of both. Sometimes, when a disease is in all the magazines and on all the news shows, it’s only natural that you think you have it.

And while it might seem excessive to criticize a television episode that first aired over a quarter of a century ago, it’s hard to read these lines after Porochista Khakpour’s memoir Sick without wishing that this particular joke didn’t exist.

In its chronic form, Lyme disease remains controversial, but like chronic fatigue syndrome and fibromyalgia, it’s an important element in the long, complicated history of women having trouble finding doctors who will take their pain seriously. As Lidija Haas writes in The New Yorker:

There’s a class of illnesses—multi-symptomatic, chronic, hard to diagnose—that remain associated with suffering women and disbelieving experts. Lyme disease, symptoms of which can afflict patients years after the initial tick bite, appears to be one…[The musician Kathleen Hanna] describes an experience common to many sufferers from chronic illness—that of being dismissed as an unreliable witness to what is happening inside her. Since no single medical condition, a doctor once told her, could plausibly affect so many different systems—neurological, respiratory, gastrointestinal—she must be having a panic attack…As in so many other areas of American life, women of color often endure the most extreme versions of this problem.

It goes without saying that when “Lisa’s Substitute” was written, there weren’t any women on the writing staff of The Simpsons, although even if there were, it might not have made a difference. In her recent memoir Just the Funny Parts, Nell Scovell, who worked as a freelance television writer in the early nineties, memorably describes the feeling of walking into the “all-male” Simpsons writing room, which was “welcoming, but also intimidating.” It’s hard to imagine these writers, so many of them undeniably brilliant, thinking twice about making a joke like this—and it’s frankly hard to see them rejecting it now, when it might only lead to attacks from people who, in Matt Groening’s words, “love to pretend they’re offended.”

I’m not saying that there are any subjects that should be excluded from comedic consideration, or that The Simpsons can’t joke about Lyme disease. But as I look back at the classic years of my favorite television show of all time, I’m starting to see a pattern that troubles me, and it goes far beyond Apu. I’m tempted to call it “punching down,” but it’s worse. It’s a tendency to pick what seem at the time like safe targets, and to focus with uncanny precision on comic gray areas that allow for certain forms of transgression. I know that I quoted this statement just a couple of months ago, but I can’t resist repeating what producer Bill Oakley says of Greg Daniels’s pitch about an episode on racism in Springfield:

Do you remember this? Something about Homer and Dr. Hibbert? Well, you pitched it several times and I think we were just…It was some exploration of the concept of race in Springfield, and we just said, you know, we don’t think this is the forum. The Simpsons can’t be the right forum to deal with racism.

He was probably right. But when you look at the few racially charged jokes that the show actually made, the characters involved weren’t black, but quite specifically “brown,” or members of groups that occupy a liminal space in our cultural understanding of race: Apu, Akira, Bumblebee Man. (I know that Akira was technically whiter than anybody else, but you get my drift.) By contrast, the show was very cautious when it came to its black characters. Apart from Dr. Hibbert, who was derived from Bill Cosby, the show’s only recurring black faces were Carl and Officer Lou, the latter of whom is so unmemorable that I had to look him up to make sure that he wasn’t Officer Eddie. And both Carl and Lou were given effectively the same voice by Hank Azaria, the defining feature of which was that it was as nondescript as humanly possible.

I’m not necessarily criticizing the show’s treatment of race, but the unconscious conservatism that carefully avoided potentially controversial areas while lavishing attention on targets that seemed unobjectionable. It’s hard to imagine a version of the show that would have dared to employ such stereotypes, even ironically, on Carl, Lou, or even Judge Snyder, who was so racially undefined that he was occasionally colored as white. (The show’s most transgressive black figures, Drederick Tatum and Lucius Sweet, were so transparently modeled on real people that they barely even qualified as characters. As Homer once said: “You know Lucius Sweet? He’s one of the biggest names in boxing! He’s exactly as rich and as famous as Don King, and he looks just like him, too!” And I’m not even going to talk about “Bleeding Gums” Murphy.) That joke about Miss Hoover is starting to feel much the same way, and if it took two decades for my own sensibilities to catch up with that fact, it’s for the same reasons that we’re finally taking a harder look at Apu. And if I speak as a fan, it isn’t to qualify these thoughts, but to get at the heart of why I feel obliged to write about them at all. We’re all shaped by popular culture, and I can honestly say of The Simpsons, as Jack Kerouac writes in On the Road: “All my actions since then have been dictated automatically to my subconscious by this horrible osmotic experience.” The show’s later seasons are reflexively dismissed as lazy, derivative, and reliant on easy jokes, but we still venerate its golden years. Yet if The Simpsons has gradually degraded under the watch of many of its original writers and producers, this implies that we’re only seeing the logical culmination—or eruption—of something that was there all along, afflicting its viewers years after the original bite. We all believed that The Simpsons, in its prime, was making us smarter. But what if it was just psychosomatic?

A season of disenchantment

A few days ago, Matt Groening announced that his new animated series, Disenchantment, will premiere in August on Netflix. Under other circumstances, I might have been pleased by the prospect of another show from the creator of The Simpsons and Futurama—not to mention producers Bill Oakley and Josh Weinstein—and I expect that I’ll probably watch it. At the moment, however, it’s hard for me to think about Groening at all without recalling his recent reaction to the long overdue conversation around the character of Apu. When Bill Keveney of USA Today asked earlier this month if he had any thoughts on the subject, Groening replied: “Not really. I’m proud of what we do on the show. And I think it’s a time in our culture where people love to pretend they’re offended.” It was a profoundly disappointing statement, particularly after Hank Azaria himself had expressed his willingness to step aside from the role, and it was all the more disillusioning coming from a man whose work has been a part of my life for as long as I can remember. As I noted in my earlier post, the show’s unfeeling response to this issue is painful because it contradicts everything that The Simpsons was once supposed to represent. It was the smartest show on television; it was simply right about everything; it offered its fans an entire metaphorical language. And as the passage of time reveals that it suffered from its own set of blinders, it doesn’t just cast doubt on the series and its creators, but on the viewers, like me, who used it for so long as an intellectual benchmark.

And it’s still an inescapable part of my personal lexicon. Last year, for instance, when Elon Musk defended his decision to serve on Trump’s economic advisory council, I thought immediately of what Homer says to Marge in “Whacking Day”: “Maybe if I’m part of that mob, I can help steer it in wise directions.” Yet it turns out that I might have been too quick to give Musk—who, revealingly, was the subject of an adulatory episode of The Simpsons—the benefit of the doubt. A few months later, in response to reports of discrimination at Tesla, he wrote an email to employees that included this remarkable paragraph:

If someone is a jerk to you, but sincerely apologizes, it is important to be thick-skinned and accept that apology. If you are part of a lesser represented group, you don’t get a free pass on being a jerk yourself. We have had a few cases at Tesla where someone in a less represented group was actually given a job or promoted over more qualified highly represented candidates and then decided to sue Tesla for millions of dollars because they felt they weren’t promoted enough. That is obviously not cool.

The last two lines, which were a clear reference to the case of A.J. Vandermeyden, tell us more about Musk’s idea of a “sincere apology” than he probably intended. And when Musk responded this week to criticism of Tesla’s safety and labor practices by accusing the nonprofit Center for Investigative Reporting of bias and proposing a site where users could provide a “credibility score” for individual journalists, he sounded a lot like the president whose circle of advisers he only reluctantly left.

Musk, who benefited from years of uncritical coverage from people who will forgive anything as long as you talk about space travel, seems genuinely wounded by any form of criticism or scrutiny, and he lashes out just as Trump does—by questioning the motives of ordinary reporters or sources, whom he accuses of being in the pocket of unions or oil companies. Yet he’s also right to be worried. We’re living in a time when public figures and institutions are going to be judged by their responses to questions that they would rather avoid, which isn’t likely to change. And the media itself is hardly exempt. For the last two weeks, I’ve been waiting for The New Yorker to respond to stories about the actions of two of its most prominent contributors, Junot Díaz and the late David Foster Wallace. I’m not even sure what I want the magazine to do, exactly, except make an honest effort to grapple with the situation, and maybe even offer a valuable perspective, which is why I read it in the first place. (In all honesty, it fills much the same role in my life these days as The Simpsons did in my teens. As Norman Mailer wrote back in the sixties: “Hundreds of thousands, perhaps millions of people in the most established parts of the middle class kill their quickest impulses before they dare to act in such a way as to look ridiculous to the private eye of their taste whose style has been keyed by the eye of The New Yorker.”) As the days passed without any comment, I assumed that it was figuring out how to tackle an admittedly uncomfortable topic, and I didn’t expect it to rush. Now that we’ve reached the end of the month without any public engagement at all, however, I can only conclude that it’s deliberately ignoring the matter in hopes that it will go away. I hope that I’m wrong. But so far, it’s a discouraging omission from a magazine whose stories on Harvey Weinstein and Eric Schneiderman implicitly put it at the head of an entire movement.

The New Yorker has evidently discovered that it’s harder to take such stands when they affect people whom we know or care about—which only means that it can get in line. Our historical moment has forced some of our smartest individuals and organizations to learn how to take criticism as well as to give it, and it’s often those whose observations about others have been the sharpest who turn out to be singularly incapable, as Clarice Starling once put it, when it comes to pointing that high-powered perception at themselves. (In this list, which is constantly being updated, I include Groening, Musk, The New Yorker, and about half the cast of Arrested Development.) But I can sympathize with their predicament, because I feel it nearly every day. My opinion of Musk has always been rather mixed, but nothing can dislodge the affection and gratitude that I feel toward the first eight seasons of The Simpsons, and I expect to approvingly link to an article in The New Yorker this time next week. But if our disenchantment forces us to question the icons whose influence is fundamental to our conception of ourselves, then perhaps it will have been worth the pain. Separating our affection for the product from those who produced it is a problem that we all have to confront, and it isn’t going to get any easier. As I was thinking about this post yesterday, the news broke that Morgan Freeman had been accused by multiple women of inappropriate behavior. In response, he issued a statement that read in part: “I apologize to anyone who felt uncomfortable or disrespected.” It reminded me a little of another man who once grudgingly said of some remarks that were caught on tape: “I apologize if anyone was offended.” But it sounds a lot better when you imagine it in Morgan Freeman’s voice.

Written by nevalalee

May 25, 2018 at 9:21 am

The bedtime story

Earlier this morning, I finally got my hands on the companion book to James Cameron’s Story of Science Fiction, which is airing this month on AMC. Naturally, I immediately looked for references to the four main subjects of Astounding, and the passage that caught my eye first was an exchange between Cameron and Steven Spielberg:

Spielberg: The working title of E.T. was Watch the Skies. Which is sort of the last line from The Thing. I just remember looking at the sky because of the influence of my father, and saying, only good should come from that. If it ain’t an ICBM coming from the Soviet Union, only good should come from beyond our gravitational hold…He was a visionary about that, yet he read all the Analog. Those paperbacks? And Amazing Stories, the paperbacks of that. I used to read that along with him. Sometimes, he’d read those books to me, those little tabloids to me at night.

Cameron: Asimov, Heinlein, all those guys were all published in those pulp magazines.

Spielberg: They were all published in those magazines, and a lot of them were optimists. They weren’t always calculating our doom. They were finding ways to open up our imagination and get us to dream and get us to discover and get us to contribute to the greater good.

The discussion quickly moves on to other subjects, but not before hinting at the solution to a mystery that I’ve been trying to figure out for years, which is why the influence of Astounding and its authors can be so hard to discern in the work of someone like Spielberg. In part, it’s a matter of timing. Spielberg was born in 1946, which means that he would have been thirteen when John W. Campbell announced that his magazine was changing its title to Analog. As a result, at a point at which he should have been primed to devour science fiction, Spielberg doesn’t seem to have found its current incarnation all that interesting, for which you can hardly blame him. Instead, his emotional associations with the pulps were evidently passed down through his father, Arnold Spielberg, an electrical engineer who worked for General Electric and RCA. The elder Spielberg, remarkably, is still active at the age of 101, and just two months ago, he said in an interview with GE Reports:

I was also influenced by science fiction. There were twins in our neighborhood who read one of the first sci-fi magazines, called Astounding Stories of Science and Fact. They gave me one copy, and when I brought it home, I was hooked. The magazine is now called Analog Science Fiction and Fact, and I still get it.

And while I don’t think that there’s any way of verifying it, if Arnold Spielberg—the father of Steven Spielberg—isn’t the oldest living subscriber to Analog, he must be close.

This sheds light on his son’s career, although perhaps not in the way that you might think. Spielberg is such a massively important figure that his very existence realigns the history of the genre, and when he speaks of his influences, we need to be wary of the shadow cast by his inescapable personality. But there’s no denying the power—and truth—of the image of Arnold Spielberg reading from the pulps aloud to his son. It feels like an image from one of Spielberg’s own movies, which has been shaped from the beginning by the tradition of oral storytelling. (It’s worth noting, though, that the father might recall things differently than the son. In his biography of the director, Joseph McBride quotes Arnold Spielberg: “I’ve been reading science fiction since I was seven years old, all the way back to the earliest Amazing Stories. Amazing, Astounding, Analog—I still subscribe. I still read ’em. My kids used to complain, ‘Dad’s in the bathroom with a science-fiction magazine. We can’t get in.'”) For Spielberg, the stories seem inextricably linked with the memory of being taken outside by his father to look at the stars:

My father was the one that introduced me to the cosmos. He’s the one who built—from a big cardboard roll that you roll rugs on—a two-inch reflecting telescope with an Edmund Scientific kit that he had sent away for. [He] put this telescope together, and then I saw the moons of Jupiter. It was the first thing he pointed out to me. I saw the rings of Saturn around Saturn. I’m six, seven years old when this all happened.

Spielberg concludes: “Those were the stories, and just looking up at the sky, that got me to realize, if I ever get a chance to make a science fiction movie, I want those guys to come in peace.”

But it also testifies to the ways in which a strong personality will take exactly what it needs from its source material. Elsewhere in the interview, there’s another intriguing reference:

Spielberg: I always go for the heart first. Of course, sometimes I go for the heart so much I get a little bit accused of sentimentality, which I’m fine [with] because…sometimes I need to push it a little further to reach a little deeper into a society that is a little less sentimental than they were when I was a young filmmaker.

Cameron: You pushed it in the same way that John W. Campbell pushed science fiction [forward] from the hard-tech nerdy guys who had to put PhD after their name to write science fiction. It was all just about the equations and the math and the physics [and evolved to become much more] human stories [about] the human heart.

I see what Cameron is trying to say here, but if you’ve read enough of the magazine that turned into Analog, this isn’t exactly the impression that it leaves. It’s true that Campbell put a greater emphasis than most of his predecessors on characterization, at least in theory, but the number of stories that were about “the human heart” can be counted on two hands, and none were exactly Spielbergian—although they might seem that way when filtered through the memory of his father’s voice. And toward the end, the nerds took over again. In Dangerous Visions, which was published in 1967, Harlan Ellison wrote of “John W. Campbell, Jr., who used to edit a magazine that ran science fiction, called Astounding, and who now edits a magazine that runs a lot of schematic drawings, called Analog.” It was the latter version of the magazine that Spielberg would have seen as a boy—which may be why, when the time came, he made a television show called Amazing Stories.

The multiverse theory

Yesterday, I flew back from the Grappling with the Futures symposium, which was held over the course of two days at Harvard and Boston University. I’d heard about the conference from my friend Emanuelle Burton, a scholar at the University of Illinois at Chicago, whom I met two years ago through the academic track at the World Science Fiction Convention in Kansas City. Mandy proposed that we collaborate on a presentation at this event, which was centered on the discipline of futures studies, a subject about which I knew nothing. For reasons of my own, though, I was interested in making the trip, and we put together a talk titled Fictional Futures, which included a short history of the concept of psychohistory. The session went fine, even if we ended up with more material than we could reasonably cover in twenty minutes. But I was equally interested in studying the people around me, who were uniformly smart, intense, quirky, and a little mysterious. Futures studies is an established academic field that draws on many of the tools and concepts of science fiction, but it uses a markedly different vocabulary. (One of the scheduled keynote speakers has written and published a climate change novella, just like me, except that she describes it as a “non-numerical simulation model.”) It left me with the sense of a closed world that evolved in response to the same problems and pressures that shaped science fiction, but along divergent lines, and I still wonder what might come of a closer relationship between the two communities.

As it happened, I had to duck out after the first day, because I had something else to do in Boston. Ever since I started work on Astounding, I’ve been meaning to pay a visit to the Isaac Asimov collection at the Howard Gotlieb Archival Research Center at Boston University, which houses the majority of Asimov’s surviving papers, but which can only be viewed in person. Since I was going to be in town anyway, I left the symposium early and headed over to the library, where I spent five hours yesterday going through what I could. When you arrive at the reading room, you sign in, check your bag and cell phone, and are handed a massive finding aid, an inventory of the Asimov collection that runs to more than three hundred pages. (The entire archive, which consists mostly of work that dates from after the early sixties, fills four hundred boxes.) After marking off the items that you want, you’re rewarded with a cart loaded with archival cartons and a pair of white gloves. At the back of my mind, I wasn’t expecting to find much—I’ve been gathering material for this book for years. As it turned out, there were well over a hundred letters between Asimov, Campbell, and Heinlein alone that I hadn’t seen before. You aren’t allowed to take pictures or make photocopies, so I typed up as many notes as I could before I had to run to catch my plane. For the most part, they fill out parts of the story that I already have, and they won’t fundamentally change the book. But in an age of digital research, I was struck by the fact that all this paper, of which I just scratched the surface, is only accessible to scholars who can physically set foot in the reading room at the Mugar Library.

After two frantic days, I finally made it home, where my wife and I watched last night’s premiere of James Cameron’s Story of Science Fiction on AMC. At first glance, this series might seem like the opposite of my experiences in Boston. Instead of being set apart from the wider world, it’s an ambitious attempt to appeal to the largest audience possible, with interviews with the likes of Steven Spielberg and Christopher Nolan and discussions of such works as Close Encounters and Alien. I’ve been looking forward to this show for a long time, not least because I was hoping that it would lead to a spike in interest in science fiction that would benefit my book, and the results were more or less what I expected. In the opening sequence, you briefly glimpse Heinlein and Asimov, and there’s even a nod to The Thing From Another World, although no mention of John W. Campbell himself. For the most part, though, the series treats the literary side as a precursor to its incarnations in the movies and television, which is absolutely the right call. You want to tell this story as much as possible through images, and the medium lends itself better to H.R. Giger than to H.P. Lovecraft. But when I saw a brief clip of archival footage of Ray Bradbury, in his role in the late seventies as an ambassador for the genre, I found myself thinking of the Bradbury whom I know best—the eager, unpublished teenager in the Great Depression who wrote fan letters to the pulps, clung to the edges of the Heinlein circle, and never quite managed to break into Astounding. It’s a story that this series can’t tell, and I can’t blame it, because I didn’t really do it justice, either.

Over the last few days, I’ve been left with a greater sense than ever before of the vast scope and apparently irreconcilable aspects of science fiction, which consists of many worlds that only occasionally intersect. It’s a realization, or a recollection, that might seem to come at a particularly inopportune time. The day before I left for the symposium, I received the page proofs for Astounding, which normally marks the point at which a book can truly be said to be finished. I still have time to make a few corrections and additions, and I plan to fix as much of it as I can without driving my publisher up the wall. (There are a few misplaced commas that have been haunting my dreams.) I’m proud of the result, but when I look at the proofs, which present the text as an elegant and self-contained unit, it seems like an optical illusion. Even if I don’t take into account what I learned when it was too late, I’m keenly aware of everything and everyone that this book had to omit. I’d love to talk more about futures studies, or the letters that I dug up in the Asimov archives, or the practical effects in John Carpenter’s remake of The Thing, but there just wasn’t room or time. As it stands, the book tries to strike a balance between speaking to obsessive fans and appealing to a wide audience, which meant excluding a lot of fascinating material that might have survived if it were being published by a university press. It can’t possibly do everything, and the events of the weekend have only reminded me that there are worlds that I’ve barely even explored. But if that isn’t the whole point of science fiction—well, what is?
