Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


Statues of limitations


Over half a century ago, the British historian Michael Howard published an influential essay titled “The Use and Abuse of Military History.” It opens with a consideration of the ends to which historical studies—particularly on the military side—can legitimately be turned, with a particular emphasis on “myth-making,” which Howard defines with deliberate precision:

When I use the phrase “myth-making,” I mean the creation of the image of the past, through careful selection and interpretation, in order to create or sustain certain emotions or beliefs. Historians have been expected to do this almost since history began to be written at all, in order to encourage patriotic or religious feeling, or to create support for a dynasty or for a political regime. They usually have done so with no sense of professional dishonesty, and much splendid work they have produced in the process…In totalitarian regimes it is difficult and sometimes impossible to write any other kind of history. Even in mature democracies, subject to very careful qualifications, the “myth,” this selective and heroic view of the past, has its uses…Like Plato I believe that the myth does have a useful social function. I do not consider it to be an “abuse” of military history at all, but something quite different, to be judged by different standards. It is “nursery history,” and I use the phrase without any disparaging implications. Breaking children in properly to the facts of life is a highly skilled affair, as most of you know; and the realities of war are among the most disagreeable facts of life that we are ever called upon to face.

It seems obvious that the equestrian statue of a soldier in a park can hardly be anything else than “nursery history.” Historical myths tend to divide the world into heroes and villains, but there isn’t even a villain here, just a hero on horseback fighting on behalf of a cause that necessarily must remain undefined. There’s no room for nuance or complexity. A great work of public art like the Vietnam Veterans Memorial encourages reflection, but the vast majority of such statues weren’t erected for their aesthetic merits. You could even make a good case that this kind of statue gains meaning from only two moments in its existence: when it goes up and when it comes down. (As this excellent infographic created by the Southern Poverty Law Center reminds us, the timing of the dedication of Confederate monuments supplies a piece of information in itself. These symbols went up to send messages to very specific communities at particular times, and in many cases, I have a hunch that they were commissioned mostly as an excuse to hold a ceremony. They were installed with great fanfare and then promptly faded into the background, unseen by most of the people in the park who pass them every day—although they’re arguably more visible to members of certain groups who were around to receive that message in the first place.) The real irony of the current debate over Confederate statues is that by taking them down, or even by raising the question of whether we should do so, we’ve made them visible to a huge swath of the population by whom they had ceased to be seen. Far from erasing history, as the most historically incurious president of modern times has argued, the discussion has made it alive again, even if it means blowing up the nursery myths on which these statues were founded.

The real question is whether this kind of history has any value for adults. Howard argues convincingly that under certain circumstances, it does, citing its use within the military itself:

The regimental historian, for instance, has, consciously or unconsciously, to sustain the view that his regiment has usually been flawlessly brave and efficient, especially during its recent past. Without any sense of ill-doing he will emphasize the glorious episodes in its history and pass with a light hand over its murkier passages, knowing full well that his work is to serve a practical purpose in sustaining regimental morale in the future…The young soldier in action for the first time may find it impossible to bridge the gap between war as it has been painted and war as it really is—between the way in which he, his peers, his officers, and his subordinates should behave, and the way in which they actually do. He may be dangerously unprepared for cowardice and muddle and horror when he actually encounters them, unprepared even for the cumulative attrition of dirt and fatigue. But nevertheless the “myth” can and often does sustain him, even when he knows, with half his mind, that it is untrue.

There’s a big difference between serving as a soldier and living in a civil society, but there’s a grain of truth in the notion that nursery history can console and motivate us when we’re faced with the contrast between our lives as we want them to be and how they actually are. White nationalism exists in large part as an outlet for the frustrations of men and women who feel angry, helpless, and ignored, just as more innocuous attempts to sentimentalize history arise from a fear of the future. But what separates a romantic notion of the Confederacy from other nursery myths is that it isn’t predicated on the demonization of a foreign enemy in the past, but on the marginalization of human beings who live in the same towns in which these statues stand.

And if we content ourselves with nothing but nursery history, we limit ourselves. Myths have a way of closing off thought, telling us that we’re just fine the way we are simply because of the circumstances in which we happen to have been born. A more unsparing look at history enables the “hard preliminary thinking” that prepares us for action when the time comes, which is one of the reasons that Howard gives for studying the subject at all. (He notes that many officers spend so much energy on administrative issues in peacetime that they’re left unprepared for war itself, which leads him to the lovely line: “The advantage enjoyed by sailors in this respect is a very marked one; for nobody commanding a vessel at sea, whether battleship or dinghy, is ever wholly at peace.”) Howard argues that one function of the historian is to puncture these myths for the good of the nation: “Inevitably the honest historian discovers, and must expose, things which are not compatible with the national myth; but to allow him to do so is necessary, not simply to conform to the values which the war was fought to defend, but to preserve military efficiency for the future.” Even in civilian life, the point still holds. Nursery history has its place, but as we put away childish things, we have to be prepared to give up our illusions about the past. And sometimes, as Howard acknowledges, it hurts, which doesn’t make it any less necessary:

The process of disillusionment is necessarily a disagreeable one and often extremely painful. For many of us, the “myth” has become so much a part of our world that it is anguish to be deprived of it…Such disillusion is a necessary part of growing up in and belonging to an adult society; and a good definition of the difference between a Western liberal society and a totalitarian one—whether it be Communist, Fascist, or Catholic authoritarian—is that in the former the government treats its citizens as responsible adults and in the latter it cannot.

Written by nevalalee

August 17, 2017 at 9:21 am

The gospel of singlemindedness


It is difficult to get a man to understand something when his salary depends upon his not understanding it.

—Upton Sinclair

A few years ago, I started reading, but didn’t manage to finish, the New Age classic The Teachings of Don Juan. Its author, the late Carlos Castaneda, has been convincingly revealed as a talented writer of fiction who sold millions of copies of his books by presenting his work as anthropology. As Richard de Mille writes in Castaneda’s Journey: “[Castaneda’s] stories are packed with truth, though they are not true stories, which he says they are…[He is] an ambiguous spellbinder dealing simultaneously in contrary commodities—wisdom and deception.” (De Mille would have known—he was one of L. Ron Hubbard’s earliest followers before breaking away from the dianetics movement.) But I’ll take good advice wherever I can find it, and there’s one passage in The Teachings of Don Juan that I think about all the time:

A man of knowledge needed a rigid will in order to endure the obligatory quality that every act possessed when it was performed in the context of his knowledge…A man of knowledge needed frugality because the majority of the obligatory acts dealt with instances or with elements that were either outside the boundaries of ordinary everyday life, or were not customary in ordinary activity, and the man who had to act in accordance with them needed an extraordinary effort every time he took action. It was implicit that one could have been capable of such an extraordinary effort only by being frugal with every other activity that did not deal directly with such predetermined actions.

I think that Castaneda, whatever his flaws, is getting at something important here that I haven’t seen fully explored in other discussions of frugality and simplicity. A simple life is undoubtedly worth pursuing for its own sake, but it’s also a means to an end, allowing for an almost frightening degree of concentration and prolonged attention to problems that would otherwise be impossible to address. When you look closely at our most celebrated exemplars of voluntary poverty, from Spinoza to Thoreau, you realize that their underlying motivation is an ethical or intellectual ambition too vast—and maybe too dangerous—to be contained within a conventional life, and this extends to the most famous role model of all. In Jesus: An Historian’s Review of the Gospels, the Cambridge professor Michael Grant writes:

The disciples had to be equally singleminded. Not only must food, drink, and clothing be totally unimportant in their eyes, but they must abandon everything they possess in order to take part in Jesus’ installation of the Kingdom…Peter declared: “We have left everything to become your followers.” By doing so they had become “pure in heart,” singleminded and free from the tyranny of a divided self…It was because of his insistence on this singlemindedness that [Jesus] took so much notice of children, allowing them to be in his company and praising their simplicity…Feelings of kindly tenderness were not in fact the reason why he paid them so much attention…It is the total receptivity of children that he is praising: and for his disciples, too, the implication was that they must be equally receptive in their wholehearted devotion to the one and only aim that is worth pursuing: admission to the Kingdom.

Not surprisingly, the examples of simplicity that stick in our heads tend to be the ones that seem the most difficult to emulate. Jesus, as Grant notes, “emphasized the point in terms which, even allowing for Middle Eastern hyperbole, displayed formidable starkness,” as when he said to the disciple who asked for permission to bury his father first: “Follow me, and leave the dead to bury their dead.” Thoreau couldn’t even live by his own principles for more than a couple of years, while Castaneda’s version is basically imaginary. Yet it’s only by measuring ourselves against such extreme cases that we can hope to make incremental changes in our own lives. If you’re a writer, you learn to pare away everything else to the extent you can, simply because of “the extraordinary effort” required each time you face a blank page. And doing good work of any kind calls for resources that we can only allocate to it if we’ve taken the time to structure our lives accordingly beforehand. Human beings are fallible and weak, and most of us can only act with moral integrity after we’ve systematically reduced the obstacles that prevent us from doing so. It’s hard enough as it is, so there’s no point in making it any more difficult than necessary. In theory, we could gain the freedom for what Castaneda calls “the exercise of volition” by becoming sufficiently rich and powerful, but in practice, it’s easier and less compromising to go about it the other way. As Thoreau famously writes: “Most of the luxuries, and many of the so-called comforts of life, are not only not indispensable, but positive hindrances to the elevation of mankind…None can be an impartial or wise observer of human life but from the vantage ground of what we should call voluntary poverty.”

The key word here is “impartial.” Our lives are full of entanglements, and our ability to respond in a crisis requiring extraordinary effort has less to do with our inherent worth than with the pragmatic choices that we’ve already made. For another extreme case, you don’t need to look any further than the events of the past weekend. If Donald Trump resisted making the denunciation of white nationalism that even members of his own party were able to provide, it isn’t so much because he’s uniquely horrible as because of certain facts about his rise to power. Trump can seemingly pick fights with anyone except for Vladimir Putin and white supremacists. He cannot do it. And it’s in large part because he’s trapped. I don’t know what Trump actually believes, but he’s the embodiment of Upton Sinclair’s man whose salary depends on his not understanding something. True empathy requires “extraordinary effort” and singlemindedness, and the choices that we’ve made in the past affect our actions in the future. In his discussion of the Parable of the Unjust Steward, which is perhaps the strangest story in the gospels, Grant writes:

How shocking…to find Jesus actually praising this shady functionary. He praised him because, when confronted with a crisis, he had acted. You, declared Jesus to his audience, are faced with a far graver crisis, a far more urgent need for decision and action. As this relentless emergency approaches you cannot just sit with your hands folded. Keep your eyes open and be totally alert and prepared to act if you want to be among the Remnant who will endure the terrible time.

Strip away the eschatological language, and you’re left with the message that this crisis is happening all the time. The only way to act properly is to remove everything that prevents us from doing otherwise. And if we wait until the emergency is here, we’ll find that it’s already too late.

Edit: Never mind—it turns out that we do know what he actually believes.

The ultimate trip


On Saturday, I was lucky enough to see 2001: A Space Odyssey on the big screen at the Music Box Theatre in Chicago. I’ve seen this movie well over a dozen times, but watching it on a pristine new print from the fourth row allowed me to pick up on tiny details that I’d never noticed before, such as the fact that David Bowman, stranded at the end in his celestial hotel room, ends up wearing a blue velvet robe startlingly like Isabella Rossellini’s. I was also struck by the excellence of the acting, which might sound like a joke, but it isn’t. Its human protagonists have often been dismissed—Roger Ebert, who thought it was one of the greatest films of all time, called it “a bloodless movie with faceless characters”—and none of the actors, aside from Douglas Rain as the voice of HAL, are likely to stick in the memory. (As Noël Coward reputedly said: “Keir Dullea, gone tomorrow.”) But on an objective level, these are nothing less than the most naturalistic performances of any studio movie of the sixties. There isn’t a trace of the affectation or overacting that you see in so much science fiction, and Dullea, Gary Lockwood, and particularly William Sylvester, in his nice dry turn as Heywood Floyd, are utterly believable. You could make a strong case that their work here has held up better than most of the more conventionally acclaimed performances from the same decade. This doesn’t make them any better or worse, but it gives you a sense of what Kubrick, who drew his characters as obsessively as his sets and special effects, was trying to achieve. He wanted realism in his acting, along with everything else, and this is how it looks, even if we aren’t used to seeing it in space.

The result is still the most convincing cinematic vision of space exploration that we have, as well as the most technically ambitious movie ever made, and its impact, like that of all great works of art, appears in surprising places. By coincidence, I went to see 2001 the day after Donald Trump signed an executive order to reinstate the National Space Council, at a very peculiar ceremony that was held with a minimum of fanfare. The event was attended by Buzz Aldrin, who has played scenes across from Homer Simpson and Optimus Prime, and I can’t be sure that this didn’t strike him as the strangest stage he had ever shared. Here are a few of Trump’s remarks, pulled straight from the official transcript:

Security is going to be a very big factor with respect to space and space exploration.  At some point in the future, we’re going to look back and say, how did we do it without space? The Vice President will serve as the council’s chair….Some of the most successful people in the world want to be on this board…Our journey into space will not only make us stronger and more prosperous, but will unite us behind grand ambitions and bring us all closer together. Wouldn’t that be nice? Can you believe that space is going to do that? I thought politics would do that. Well, we’ll have to rely on space instead…We will inspire millions of children to carry on this proud tradition of American space leadership—and they’re excited—and to never stop wondering, hoping, and dreaming about what lies beyond the stars.

Taking a seat, Trump opened the executive order, exclaiming: “I know what this is. Space!” Aldrin then piped up with what was widely reported as a reference to Toy Story: “Infinity and beyond!” Trump seemed pleased: “This is infinity here. It could be infinity. We don’t really don’t know. But it could be. It has to be something—but it could be infinity, right?”

As HAL 9000 once said: “Yes, it’s puzzling.” Aldrin may have been quoting Toy Story, but he might well have been thinking of 2001, too, the last section of which is titled “Jupiter and Beyond the Infinite.” (As an aside, I should note that the line “To infinity and beyond” makes its first known appearance, as far as I can tell, in John W. Campbell’s 1934 serial The Mightiest Machine.) It’s an evocative but meaningless phrase, with the same problems that led Arthur C. Clarke to express doubts about Kubrick’s working title, Journey Beyond the Stars—which Trump, you’ll notice, also echoed. Its semantic content is nonexistent, which is only fitting for a ceremony that underlined the intellectual bankruptcy of this administration’s approach to space. I don’t think I’m overstating the matter when I say that Trump and Mike Pence have shown nothing but contempt for other forms of science. The science division of the Office of Science and Technology Policy lies empty. Pence has expressed bewilderment at the fact that climate change has emerged, “for some reason,” as an issue on the left. And Trump has proposed significant cuts to science and technology funding agencies. Yet his excitement for space seems unbounded and apparently genuine. He asked eagerly of astronaut Peggy Whitson: “Tell me, Mars, what do you see a timing for actually sending humans to Mars? Is there a schedule and when would you see that happening?” And the reasons behind his enthusiasm are primarily aesthetic and emotional. One of his favorite words is “beautiful,” in such phrases as “big, beautiful wall” and “beautiful military equipment,” and it was much in evidence here: “It is America’s destiny to be at the forefront of humanity’s eternal quest for knowledge and to be the leader amongst nations on our adventure into the great unknown. And I could say the great and very beautiful unknown. Nothing more beautiful.”

But the truly scary thing is that if Trump believes that the promotion of space travel can be divorced from any concern for science itself, he’s absolutely right. As I’ve said here before, in the years when science fiction was basically a subcategory of adventure fiction, with ray guns instead of revolvers, space was less important in itself than as the equivalent of the unexplored frontier of the western: it stood for the unknown, and it was a perfect backdrop for exciting plots. Later, when the genre began to take itself more seriously as a predictive literature, outer space was grandfathered in as a setting, even if it had little to do with any plausible vision of the future. Space exploration seemed like an essential part of our destiny as a species because it happened to be part of the genre already. As a result, you can be excited by the prospect of going to Mars while actively despising or distrusting everything else about science—which may be the only reason that we managed to get to the moon at all. (These impulses may have less to do with science than with religion. The most haunting image from the Apollo 11 mission, all the more so because it wasn’t televised, may be that of Aldrin taking communion on the lunar surface.) Science fiction made it possible, and part of the credit, or blame, falls on Kubrick. Watching 2001, I had tears in my eyes, and I felt myself filled with all my old emotions of longing and awe. As Kubrick himself stated: “If 2001 has stirred your emotions, your subconscious, your mythological yearnings, then it has succeeded.” And it did, all too well, at the price of separating our feelings for space even further from science, and of providing a place for those subconscious urges to settle while leaving us consciously indifferent to problems closer to home. Kubrick might not have faked the moon landing, but he faked a Jupiter mission, and he did it beautifully. 
And maybe, at least for now, it should save us the expense of doing it for real.

A most pitiful ambition


In Magic and Showmanship, which is one of my favorite books on storytelling of any kind, the magician and polymath Henning Nelms sets forth a principle that ought to be remembered by all artists:

An illusion is, by definition, untrue. In every field, we detect untruth by inconsistency. We recognize statements as false when they contradict themselves. An actor who does something which is not in keeping with his role falls out of character, and the spell of the play is broken. If a conjurer’s words and actions fail to match the powers he claims, he pricks the bubble of illusion; he may still entertain his audience with a trick, but he loses the magic of drama. Consistency is the key to conviction.

Nelms adds that consistency is also the key to entertainment, and that it achieves its greatest impact when all of its resources are directed toward the same goal. He continues:

Consistency implies a standard. We cannot merely be consistent; we must be consistent with something. In creating an illusion, our standard is the theme. Once you realize this, you will find that the theme provides a guide to every detail of your presentation. This is a tremendous asset. It answers many questions almost before you can ask them.

And Nelms concludes with a powerful rule: “Plan a routine as if every element of the theme—personalities, phenomena, purpose, and proof—were literally true.”

To some extent, this is simply a restatement of what John Gardner calls “the vivid and continuous fictional dream.” Any lapse or inconsistency will draw viewers or readers out of the performance, and it can be hard to get them back again. As Nelms puts it:

Although the “as if” rule is an inspiring guide, it is also a strict taskmaster. Consistency is essential to any suspension of disbelief. No conviction is so deep that it cannot be destroyed by a discrepancy in the presentation. On the contrary, the more profoundly the spectators are enthralled by a performance, the more likely they are to be jerked back to reality by anything which is not in harmony with the illusion.

Even more usefully, Nelms frames this rule as a courtesy to the magician himself, since it provides a source of information at times when we might otherwise be lost: “It not only helps us to make decisions, but suggests ideas.” He also helpfully observes that it can be more productive, on a creative level, to focus on eliminating discrepancies, rather than on heightening the elements that are already effective:

My whole procedure as a showman is based on a technique of hunting for faults and ruthlessly eliminating them…The good parts of a play or routine take care of themselves. If I see a way to improve them, I do so. But I never worry about them. Instead, I concentrate on spotting and correcting the flaws. These are the places that offer the greatest opportunities for improvement. Hence, they are also the places where time and effort devoted to improvement will produce the greatest results.

On a practical level, Nelms suggests that you write down an outline of the illusion as if it were literally true, and then see where you have to depart from this ideal for technical reasons—which is where you should concentrate your attention to minimize any obvious discrepancies. This all seems like common sense, and if writers and performers sometimes forget this, it’s because they get attached to inconsistencies that provide some other benefit in the short term. Nelms writes:

Many dramas have been ruined by actors who tried to enliven serious scenes by being funny. The spectators laughed at the comedy, but they were bored by the play. The same law holds true for conjuring: No matter how effective an inconsistent part may be, the damage that it does to the routine as a whole more than offsets whatever advantages it may have in itself.

He continues: “Directors and performers alike are so flattered by hearing an audience laugh or exclaim over some line or action that they blind themselves to the harm it does to the play or the illusion.” This tendency is as old as drama itself, as we see in Hamlet’s advice to the players, and it can have a troubling effect on the audience:

A discrepancy may escape conscious notice and still weaken conviction. The suspension of disbelief is a subconscious process. No one says to himself, “If I am to enjoy this performance to the full, I must accept it as true and close my mind to the fact that I know it to be false.” Spectators can be led to adopt this attitude, but they must do so without thinking—and without realizing that they have done anything of the kind.

Which brings us, unfortunately, to Donald Trump. If you’re a progressive who is convinced that the president is trying to put one over on the public, you also have to confront the fact that he isn’t especially good at it. Not only are the discrepancies glaring, but they occur with a clockwork regularity that would be funny if it weren’t so horrifying. After the Washington Post reported that Trump had disclosed classified information—remember that?—to the Russian foreign minister and ambassador, his national security adviser said: “I was in the room. It did not happen.” The next day, Trump tweeted that he “wanted to share” the facts with Russia, as he had “the absolute right to do.” After James Comey was fired, the White House issued a statement saying that Trump had acted on the advice of the Justice Department, which based its recommendation on Comey’s handling of the investigation into Hillary Clinton’s emails. Two days later, Trump contradicted both points in an interview with Lester Holt: “I was going to fire Comey. My decision…In fact, when I decided to just do it, I said to myself, I said: ‘You know, this Russia thing with Trump and Russia is a made-up story.’” And when his staff repeatedly asserted that the refugee order wasn’t a travel ban, only to have Trump insist that it was, it felt like a cutaway gag on Arrested Development. You’ll sometimes see arguments that Trump is a chess master, creating distractions like a magician utilizing the technique of misdirection, which strikes me as a weird form of liberal consolation. (It reminds me of what Cooder the carny says of being grifted by Homer Simpson: “Well, there’s no shame in bein’ beaten by the best.” When his son tries to point out that Homer didn’t seem very smart, Cooder interrupts angrily: “We were beaten by the best.”) But the real answer is close at hand. Let’s look at Hamlet’s speech again:

And let those that play your clowns speak no more than is set down for them, for there be of them that will themselves laugh, to set on some quantity of barren spectators to laugh too, though in the meantime some necessary question of the play be then to be considered. That’s villainous and shows a most pitiful ambition in the fool that uses it.

This may be the best thing ever written about the Trump administration. Trump has been trained for years to go for the easy laugh or the quick reaction from the crowd, and he’ll continue to do so, even as “necessary questions” need to be considered. He’s done pretty well with it so far. And he has a receptive audience that seems willing to tell itself exactly what Nelms thought was impossible: “If I am to enjoy this performance to the full, I must accept it as true and close my mind to the fact that I know it to be false.”

Written by nevalalee

June 9, 2017 at 9:02 am

The Comey Files


About a month ago, for some reason, I decided to write a blog post about James Comey. I was inspired by an article by Nate Silver titled “The Comey Letter Probably Cost Clinton the Election,” which outlined its case in compelling terms:

Hillary Clinton would probably be president if FBI Director James Comey had not sent a letter to Congress on Oct. 28. The letter, which said the FBI had “learned of the existence of emails that appear to be pertinent to the investigation” into the private email server that Clinton used as secretary of state, upended the news cycle and soon halved Clinton’s lead in the polls, imperiling her position in the Electoral College…At a maximum, it might have shifted the race by three or four percentage points toward Donald Trump, swinging Michigan, Pennsylvania, Wisconsin and Florida to him, perhaps along with North Carolina and Arizona. At a minimum, its impact might have been only a percentage point or so. Still, because Clinton lost Michigan, Pennsylvania and Wisconsin by less than one point, the letter was probably enough to change the outcome of the Electoral College.

Silver added that this fact has largely gone unacknowledged by the media: “From almost the moment that Trump won the White House, many mainstream journalists have been in denial about the impact of Comey’s letter…The motivation for this seems fairly clear: If Comey’s letter altered the outcome of the election, the media may have some responsibility for the result.” He concluded:

One can believe that the Comey letter cost Clinton the election without thinking that the media cost her the election—it was an urgent story that any newsroom had to cover. But if the Comey letter had a decisive effect and the story was mishandled by the press…the media needs to grapple with how it approached the story.

Of course, there’s more than one way to read the evidence. Silver’s doppelgänger, Nate Cohn of the New York Times, looked at the data for “the Comey effect” and argued against it:

These polls are consistent with an alternative election narrative in which the Comey letter had no discernible effect on the outcome. In this telling, Mrs. Clinton had a big lead after the third presidential debate…But her advantage dwindled over the following week, as post-debate coverage faded and Republican-leaning voters belatedly and finally decided to back their traditional party’s nontraditional candidate…In such a close election, anything and everything could have plausibly been decisive.

The italics are mine. No matter what you believe about what Comey did, it’s that “anything and everything” that haunts me, at least in terms of what politicians—and the rest of us—can hope to learn from the whole mess. As Nate Silver said at the end of his piece, again with my emphasis:

In normal presidential campaigns, preparing for the debates, staging the conventions and picking a solid running mate are about as high-stakes as decisions get…If I were advising a future candidate on what to learn from 2016, I’d tell him or her to mostly forget about the Comey letter and focus on the factors that were within the control of Clinton and Trump.

When you think about it, this is an extraordinary statement. Comey’s letter may have been decisive, but it isn’t the kind of development that a candidate can anticipate, so the best policy is still to concentrate on the more controllable factors that can cause a race to tighten in the first place.

As far as takeaways are concerned, this one isn’t too bad. It’s basically a reworking of the familiar advice that we should behave as prudently and consistently as we can, independent of luck, which positions us to deal with unforeseen events as they arise. I was preparing to write a post on that subject. Then a lot of other stuff happened, and I dropped it. Now that Comey is back in the news, I’ve been mulling it over again, and it occurs to me that the real case study in behavior here isn’t Clinton or Trump, but Comey himself—and it speaks as much to the limits of this approach as to its benefits. Regardless of how you feel about the consequences of his choices, there’s no doubt, at least in my mind, that he has behaved consistently, making decisions based on his own best judgment and thinking through the alternatives before committing himself to a course of action, however undesirable it might be. This didn’t exactly endear him to Democrats during the election, and afterward, it left him isolated in an administration that placed a premium on other qualities. Comey’s prepared testimony is remarkable, and as Nick Asbury of McSweeney’s points out, it reads weirdly in places like a Kazuo Ishiguro novel, but this is the paragraph that sticks with me the most:

Near the end of our dinner, the President returned to the subject of my job, saying he was very glad I wanted to stay, adding that he had heard great things about me from Jim Mattis, Jeff Sessions, and many others. He then said, “I need loyalty.” I replied, “You will always get honesty from me.” He paused and then said, “That’s what I want, honest loyalty.” I paused, and then said, “You will get that from me.”

Comey adds: “It is possible we understood the phrase ‘honest loyalty’ differently, but I decided it wouldn’t be productive to push it further. The term—honest loyalty—had helped end a very awkward conversation.” And many of his actions over the next few months seem to have been designed to avoid such awkwardness again, to the point where he reportedly asked Attorney General Jeff Sessions not to leave him alone with Trump.

Comey, in short, was behaving like a man who had arrived at the uncomfortable realization that despite his adherence to a personal code, he was stranded in a world in which it no longer mattered. If he had managed to hang on, he would have endured as the least popular man in Washington, distrusted by Democrats and Republicans alike. Then Trump fired him, which was an unforeseeable event in his life comparable to the havoc that his letter had wreaked on the election. The logic behind the move, to the extent that it had any at all, was expressed by Jared Kushner, a strong advocate of the firing, as the New York Times outlined his alleged reasoning: “It would be a political ‘win’ that would neutralize protesting Democrats because they had called for Mr. Comey’s ouster over his handling of Hillary Clinton’s use of a private email server.” That isn’t quite how it turned out, and there’s something undeniably funny in how quickly progressives like me rallied behind Comey, like Homer Simpson deciding that he’s been an Isotopes fan all along. Yet this was also when Comey’s strategy finally paid off. It wouldn’t have been as easy for liberals to flip the switch if it hadn’t been obvious that Comey was fundamentally a decent man, regardless of how badly he misjudged the political environment in which his actions would be received. (The evidence suggests that Comey made the same mistake as a lot of other rational actors—he simply thought that Clinton would win the election no matter what he did.) It’s a moment of vindication for the unfashionable virtues of punctuality, personal attention, courage, and thoroughness, which have been trampled into the mud over the last six months, in no small part because of Comey’s letter. 
If anything undermines Nate Silver’s argument that political candidates should focus on “factors that were within the control of Clinton and Trump,” it’s Trump himself, who has handled countless matters as badly as one could imagine and still fallen backwards into a position of incomprehensible power. If you were to freeze the picture here, you could only conclude that everything you believed about how to act in life was wrong. Maybe it is. But I don’t think so. The story isn’t over yet. And if there’s any lesson that we can take from the Comey affair, it’s that we should all act with an eye on the long game.

Written by nevalalee

June 8, 2017 at 8:27 am

On a wing and a prayer


“It was the greatest career move in the history of entertainment,” David Thomson writes in an entry in The New Biographical Dictionary of Film. He’s speaking, of course, of Ronald Reagan:

He was a hugely successful and evasive president, as blind to disaster, inquiry, and humiliation as he was to the Constitution. And he was as lucky as he had been a loser in pictures…To paraphrase Gore Vidal, the wisdom and integrity of someone told where to stand and what to say for twenty years were made manifest. The fraudulence of the presidency was revealed so that the office could never quite be honored again.

When I look at these lines now, especially that last sentence, they can start to seem rather quaint. But Reagan has a lot to tell us about Trump, and not simply because he looks so much better by comparison. “An actor is playing the president,” Paul Slansky lamented in The Clothes Have No Emperor, a book—with its painstaking chronology of the unlikely events of the Reagan Administration—that looks increasingly funny, resonant, and frightening these days. Yet the presidency has always been something of a performance. As Malcolm Gladwell recently noted to The Undefeated, most presidents have been white men of a certain age and height:

Viewed statistically it’s absurd. Why would you limit your search for the most important job in the land to this tiny group of people? But it’s an incredibly common thing. We do a category selection before we do individual analysis.

In other words, we cast men who look the part, and then we judge them by how well they fulfill our idea of the role.

Reagan, like Trump, was unusually prone to improvising, or, in Thomson's words, "deftly feeding the lines and situations of Warner Brothers in the 1940s back into world affairs." Occasionally, he would tell a story to put himself in a favorable light, as when he made the peculiar claim—to Yitzhak Shamir and Simon Wiesenthal, no less—that he had personally shot documentary film of the concentration camps after World War II. (In reality, Reagan spent the war in Hollywood, where he assisted in processing footage taken by others in Europe.) But sometimes his reasons were harder to pin down. On December 12, 1983, Reagan told a story in a speech to the annual convention of the Congressional Medal of Honor Society:

A B‑17 was coming back across the channel from a raid over Europe, badly shot up by anti‑aircraft; the ball turret that hung underneath the belly of the plane had taken a hit. The young ball‑turret gunner was wounded, and they couldn't get him out of the turret there while flying. But over the channel, the plane began to lose altitude, and the commander had to order, "Bail out." And as the men started to leave the plane, the last one to leave—the boy, understandably, knowing he was being left behind to go down with the plane, cried out in terror—the last man to leave the plane saw the commander sit down on the floor. He took the boy's hand and said, "Never mind, son, we'll ride it down together." Congressional Medal of Honor posthumously awarded.

Reagan recounted this story on numerous other occasions. But as Lars-Erik Nelson, the Washington bureau chief for the New York Daily News, subsequently determined, after checking hundreds of Medal of Honor citations from World War II: “It didn’t happen. It’s a Reagan story…The president of the United States went before an audience of three hundred real Congressional Medal of Honor winners and told them about a make‑believe Medal of Honor winner.”

There’s no doubt that Reagan, who often grew visibly moved as he recounted this story, believed that it was true, and it has even been used as a case study in the creation of false memories. Nelson traced it back to a scene in the 1944 movie Wing and a Prayer, as well as to a similar apocryphal item that appeared that year in Reader’s Digest. (The same story, incidentally, later became the basis for an episode of Amazing Stories, “The Mission,” starring Kevin Costner and Kiefer Sutherland and directed by Steven Spielberg. Tony Kushner once claimed that Spielberg’s movies “are the flagship aesthetic statements of Reaganism,” and this is the most compelling point I’ve seen in that argument’s favor.) But the most Trumpian aspect of the entire incident was the response of Reagan’s staff. As the Washington Post reported a few days later:

A determined White House is searching the records of American servicemen awarded the Medal of Honor in an effort to authenticate a disputed World War II story President Reagan told last week at a ceremony honoring recipients of the medal…The White House then began checking records to document the episode. Reagan is said by aides to be certain that he saw the citation exactly as he recounted it. The citations are summarized in a book published by Congress, but none of these summaries seem to fit precisely the episode Reagan described, although some are similar…The White House is now attempting to look beyond the summaries to more detailed accounts to see if one of the episodes may be the one Reagan mentioned. “We will find it,” said Misty Church, a researcher for the White House.

They never did. And the image of White House staffers frantically trying to justify something that the president said off the cuff certainly seems familiar today.

But what strikes me the most about this story is that Reagan himself had nothing to gain from it. Most of Trump’s fabrications are designed to make him look better, more successful, or more impressive than he actually is, while Reagan’s fable is rooted in a sentimental ideal of heroism itself. (It’s hard to even imagine a version of this story that Trump might have told, since the most admirable figure in it winds up dead. As Trump might say, he likes pilots who weren’t shot down.) Which isn’t to say that Reagan’s mythologizing isn’t problematic in itself, as Nelson pointed out:

[It’s] the difference between a make-believe pilot, dying nobly and needlessly to comfort a wounded boy, and the real-life pilots, bombardiers and navigators who struggled to save their planes, their crews and themselves and died trying. It’s the difference between war and a war story.

And while this might seem preferable to Trump’s approach, which avoids any talk of sacrifice in favor of scenarios in which everybody wins, or we stick other people with the cost of our actions, it still closes off higher levels of thought in favor of an appeal to emotion. Reagan was an infinitely more capable actor than Trump, and he was much easier to love, which shouldn’t blind us to what they have in common. They were both winging it. And the most characteristic remark to come out of the whole affair is how Larry Speakes, the White House spokesman under Reagan, responded when asked if the account was accurate: “If you tell the same story five times, it’s true.”

Who we are in the moment


Jordan Horowitz and Barry Jenkins

By now, you’re probably sick of hearing about what happened at the Oscars. I’m getting a little tired of it, too, even though it was possibly the strangest and most riveting two minutes I’ve ever seen on live television. It left me feeling sorry for everyone involved, but there are at least three bright spots. The first is that it’s going to make a great case study for somebody like Malcolm Gladwell, who is always looking for a showy anecdote to serve as a grabber opening for a book or article. So many different things had to go wrong for it to happen—on the levels of design, human error, and simple dumb luck—that you can use it to illustrate just about any point you like. A second silver lining is that it highlights the basically arbitrary nature of all such awards. As time passes, the list of Best Picture winners starts to look inevitable, as if Cimarron and Gandhi and Chariots of Fire had all been canonized by a comprehensible historical process. If anything, the cycle of inevitability is accelerating, so that within seconds of any win, the narratives are already locking into place. As soon as La La Land was announced as the winner, a story was emerging about how Hollywood always goes for the safe, predictable choice. The first thing that Dave Itzkoff, a very smart reporter, posted on the New York Times live chat was: “Of course.” Within a couple of minutes, however, that plot line had been yanked away and replaced with one for Moonlight. And the fact that the two versions were all but superimposed onscreen should warn us against reading too much into outcomes that could have gone any number of ways.

But what I want to keep in mind above all else is the example of La La Land producer Jordan Horowitz, who, at a moment of unbelievable pressure, simply said: “I’m going to be really proud to hand this to my friends from Moonlight.” It was the best thing that anybody could have uttered under those circumstances, and it tells us a lot about Horowitz himself. If you were going to design a psychological experiment to test a subject’s reaction under the most extreme conditions imaginable, it’s hard to think of a better one—although it might strike a grant committee as possibly too expensive. It takes what is undoubtedly one of the high points of someone’s life and twists it instantly into what, if perhaps not the worst moment, at least amounts to a savage correction. Everything that the participants onstage did or said, down to the facial expressions of those standing in the background, has been subjected to a level of scrutiny worthy of the Zapruder film. At the end of an event in which very little occurs that hasn’t been scripted or premeditated, a lot of people were called upon to figure out how to act in real time in front of an audience of hundreds of millions. It’s proverbial that nobody tells the truth in Hollywood, an industry that inspires insider accounts with titles like Hello, He Lied and Which Lie Did I Tell? A mixup like the one at the Oscars might have been expressly conceived as a stress test to bring out everyone’s true colors. Yet Horowitz said what he did. And I suspect that it will do more for his career than even an outright win would have accomplished.

Kellyanne Conway

It also reminds me of other instances over the last year in which we’ve learned exactly what someone thinks. When we get in trouble for a remark picked up on a hot mike, we often say that it doesn’t reflect who we really are—which is just another way of stating that it doesn’t live up to the versions of ourselves that we create for public consumption. It’s far crueler, but also more convincing, to argue that it’s exactly in those unguarded, unscripted moments that our true selves emerge. (Freud, whose intuition on such matters was uncanny, was onto something when he focused on verbal mistakes and slips of the tongue.) The justifications that we use are equally revealing. Maybe we dismiss it as “locker room talk,” even if it didn’t take place anywhere near a locker room. Kellyanne Conway excused her reference to the nonexistent Bowling Green Massacre by saying “I misspoke one word,” even though she misspoke it on three separate occasions. It doesn’t even need to be something said on the spur of the moment. At his confirmation hearing for the position of ambassador to Israel, David M. Friedman apologized for an opinion piece he had written before the election: “These were hurtful words, and I deeply regret them. They’re not reflective of my nature or my character.” Friedman also said that “the inflammatory rhetoric that accompanied the presidential campaign is entirely over,” as if it were an impersonal force that briefly took possession of its users and then departed. We ask to be judged on our most composed selves, not the ones that we reveal at our worst.

To some extent, that's a reasonable request. I've said things in public and in private that I've regretted, and I wouldn't want to be judged solely on my worst moments as a writer or parent. At a time when a life can be ruined by a single tweet, it's often best to err on the side of forgiveness, especially when there's any chance of misinterpretation. But there's also a place for common sense. You don't refer to an event as a "massacre" unless you really think of it that way or want to encourage others to do so. And we judge our public figures by what they say when they think that nobody is listening, or when they let their guard down. It might seem like an impossibly high standard, but it's also the one that's effectively applied in practice. You can respond by becoming inhumanly disciplined, like Obama, who in a decade of public life has said maybe five things he has reason to regret. Or you can react like Trump, who says five regrettable things every day and trusts that their sheer volume will reduce them to a kind of background noise—which has awakened us, as Trump has in so many other ways, to a political option that we didn't even know existed. Both strategies are exhausting, and most of us don't have the energy to pursue either path. Instead, we're left with the practical solution of cultivating the inner voice that, as I wrote last week, allows us to act instinctively. Kant writes: "Live your life as though your every act were to become a universal law." Which is another way of saying that we should strive to be the best version of ourselves at all times. It's probably impossible. But it's easier than wearing a mask.

Written by nevalalee

February 28, 2017 at 9:00 am
