Posts Tagged ‘Facebook’
The chosen ones
In his recent New Yorker profile of Mark Zuckerberg, Evan Osnos quotes one of the Facebook founder’s close friends: “I think Mark has always seen himself as a man of history, someone who is destined to be great, and I mean that in the broadest sense of the term.” Zuckerberg feels “a teleological frame of feeling almost chosen,” and in his case, it happened to be correct. Yet this tells us almost nothing about Zuckerberg himself, because I can safely say that most other undergraduates at Harvard feel the same way. A writer for The Simpsons once claimed that the show had so many presidential jokes—like the one about Grover Cleveland spanking Grandpa “on two non-consecutive occasions”—because most of the writers secretly once thought that they would be president themselves, and he had a point. It’s very hard to do anything interesting in life without the certainty that you’re somehow one of the chosen ones, even if your estimation of yourself turns out to be wildly off the mark. (When I was in my twenties, my favorite point of comparison was Napoleon, while Zuckerberg seems to be more fond of Augustus: “You have all these good and bad and complex figures. I think Augustus is one of the most fascinating. Basically, through a really harsh approach, he established two hundred years of world peace.”) This kind of conviction is necessary for success, although hardly sufficient. The first human beings to walk on Mars may have already been born. Deep down, they know it, and this knowledge will determine their decisions for the rest of their lives. Of course, thousands of others “know” it, too. And just a few of them will turn out to be right.
One of my persistent themes on this blog is how we tend to confuse talent with luck, or, more generally, to underestimate the role that chance plays in success or failure. I never tire of quoting the economist Daniel Kahneman, who in Thinking, Fast and Slow shares what he calls his favorite equation:
Success = Talent + Luck
Great Success = A little more talent + A lot of luck
The truth of this statement seems incontestable. Yet we’re all reluctant to acknowledge its power in our own lives, and this tendency only increases as the roles played by luck and privilege assume a greater importance. This week has been bracketed by news stories about two men who embody this attitude at its most extreme. On the one hand, you have Brett Kavanaugh, a Yale legacy student who seems unable to recognize that his drinking and his professional success weren’t mutually exclusive, but closer to the opposite. He occupied a cultural and social stratum that gave him the chance to screw up repeatedly without lasting consequences, and we’re about to learn how far that privilege truly extends. On the other hand, you have yesterday’s New York Times exposé of Donald Trump, who took hundreds of millions of dollars from his father’s real estate empire—often in the form of bailouts for his own failed investments—while constantly describing himself as a self-made billionaire. This is hardly surprising, but it’s still striking to see the extent to which Fred Trump played along with his son’s story. He understood the value of that myth.
This gets at an important point about privilege, no matter which form it takes. We have a way of visualizing these matters in spatial terms—”upper class,” “lower class,” “class pyramid,” “rising,” “falling,” or “stratum” in the sense that I used it above. But true privilege isn’t spatial, but temporal. It unfolds over time, by giving its beneficiaries more opportunities to fail and recover, while those living at the edge might not be able to come back from the slightest misstep. We like to say that a privileged person is someone who was born on third base and thinks he hit a triple, but it’s more like being granted unlimited turns at bat. Kavanaugh provides a vivid reminder, in case we needed one, that a man who fits a certain profile has the freedom to make all kinds of mistakes, the smallest of which would be fatal for someone who didn’t look like he did. And this doesn’t just apply to drunken misbehavior, criminal or otherwise, but even to the legitimate failures that are necessary for the vast majority of us to achieve real success. When you come from the right background, it’s easier to survive for long enough to benefit from the effects of luck, which influences the way that we talk about failure itself. Silicon Valley speaks of “failing faster,” which only makes sense when the price of failure is humiliation or the loss of investment capital, not falling permanently out of the middle class. And as I’ve noted before, Pixar’s creative philosophy, which Andrew Stanton described as a process in which “the films still suck for three out of the four years it takes to make them,” is only practicable for filmmakers who look and sound like their counterparts at the top, which grants them the necessary creative freedom to fail repeatedly—a luxury that women are rarely granted.
This may all come across as unbelievably depressing, but there’s a silver lining, and it took me years to figure it out. The odds of succeeding in any creative field—which includes nearly everything in which the standard career path isn’t clearly marked—are minuscule. Few who try will ever make it, even if they have “a teleological frame of feeling almost chosen.” This isn’t due to a lack of drive or talent, but of time and second chances. When you combine the absence of any straightforward instructions with the crucial role played by luck, you get a process in which repeated failure over a long period is almost inevitable. Those who drop out don’t suffer from weak nerves, but from the fact that they’ve used up all of their extra lives. Privilege allows you to stay in the game for long enough for the odds to turn in your favor, and if you’ve got it, you may as well use it. (An Ivy League education doesn’t guarantee success, but it drastically increases your ability to stick around in the middle class in the meantime.) In its absence, you can find strategies for minimizing risk in small ways while increasing it on the highest levels, which is just another word for becoming a bohemian. And the big takeaway here is that since the probability of success is already so low, you may as well do exactly what you want. It can be tempting to tailor your work to the market, reasoning that it will increase your chances ever so slightly, but in reality, the difference is infinitesimal. An objective observer would conclude that you’re not going to make it either way, and even if you do, it will take about the same amount of time to succeed by selling out as it would by staying true to yourself. You should still do everything that you can to make the odds more favorable, but if you’re probably going to fail anyway, you might as well do it on your own terms. And that’s the only choice that matters.
The sin of sitzfleisch
Yesterday, I was reading the new profile of Mark Zuckerberg by Evan Osnos in The New Yorker when I came across one of my favorite words. It appears in a section about Zuckerberg’s wife, Priscilla Chan, who describes her husband’s reaction to the recent controversies that have swirled around Facebook:
When I asked Chan about how Zuckerberg had responded at home to the criticism of the past two years, she talked to me about Sitzfleisch, the German term for sitting and working for long periods of time. “He’d actually sit so long that he froze up his muscles and injured his hip,” she said.
Until now, the term sitzfleisch, literally “sitting flesh,” was perhaps most widely known in chess, in which it evokes the kind of stoic, patient endurance capable of winning games by making one plodding move after another, but you sometimes see it in other contexts as well. Just two weeks ago, Paul Joyce, a lecturer in German at Portsmouth University, was quoted in an article by the BBC: “It’s got a positive sense, [it] positively connotes a sense of endurance, reliability, not just flitting from one place to another, but it is also starting to be questioned as to whether it matches the experience of the modern world.” Which makes it all the more striking to hear it applied to Zuckerberg, whose life’s work has been the systematic construction of an online culture that makes such virtues seem obsolete.
The concept of sitzfleisch is popular among writers—Elizabeth Gilbert has a nice blog post on the subject—but it also has its detractors. A few months ago, I posted a quote from Twilight of the Idols in which Friedrich Nietzsche comes out strongly against the idea. Here’s the full passage, which appears in a section of short maxims and aphorisms:
On ne peut penser et écrire qu’assis (G. Flaubert). Now I’ve got you, you nihilist! Sitting still [sitzfleisch] is precisely the sin against the holy ghost. Only thoughts which come from walking have any value.
The line attributed to Flaubert, which can be translated as “One can think and write only when sitting down,” appears to come from a biographical sketch by Guy de Maupassant. When you read it in context, you can see why it irritated Nietzsche:
From his early infancy, the two distinctive traits of [Flaubert’s] nature were great ingenuousness and a dislike of physical action. All his life he remained ingenuous and sedentary. He could not see any one walking or moving about near him without becoming exasperated; and he would declare in his sharp voice, sonorous and always a little theatrical, that motion was not philosophical. “One can think and write only when seated,” he would say.
On some level, Nietzsche’s attack on sitzfleisch feels like a reaction against his own inescapable habits—he can hardly have written any of his books without the ability to sit in solitude for long periods of time. I’ve noted elsewhere that the creative life has to be conducted both while seated and while engaging in other activities, and that your course of action at any given moment can be guided by whether or not you happen to be sitting down. And it can be hard to strike the right balance. We have to spend time at a desk in order to write, but we often think better by walking, going outside, and pointedly not checking Facebook. In the recent book Nietzsche and Montaigne, the scholar Robert Miner writes:
Both Montaigne and Nietzsche strongly favor mobility over sedentariness. Montaigne is a “sworn enemy” of “assiduity (assiduité)” who goes “mostly on horseback, where my thoughts range most widely.” Nietzsche too finds that “assiduity (Sitzfleisch) is the sin against the Holy Spirit” but favors walking rather than riding. As Dahlkvist observes, Nietzsche may have been inspired by Beethoven’s habit of walking while composing, which he knew about from his reading of Henri Joly’s Psychologie des grand hommes.
That’s possible, but it also reflects the personal experience of any writer, who is often painfully aware of the contradiction of trying to say something about life while spending most of one’s time alone.
And Nietzsche’s choice of words is also revealing. In describing sitzfleisch as a sin against the Holy Ghost, he might have just been looking for a colorful phrase, or making a pun on a “sin of the flesh,” but I suspect that it went deeper. In Catholic dogma, a sin against the Holy Ghost is specifically one of “certain malice,” in which the sinner acts on purpose, repeatedly, and in full knowledge of his or her crime. Nietzsche, who was familiar with Thomas Aquinas, might have been thinking of what the Summa Theologica has to say on the subject:
Augustine, however…says that blasphemy or the sin against the Holy Ghost, is final impenitence when, namely, a man perseveres in mortal sin until death, and that it is not confined to utterance by word of mouth, but extends to words in thought and deed, not to one word only, but to many…Hence they say that when a man sins through weakness, it is a sin “against the Father”; that when he sins through ignorance, it is a sin “against the Son”; and that when he sins through certain malice, i.e. through the very choosing of evil…it is a sin “against the Holy Ghost.”
Sitzfleisch, in short, is the sin of those who should know better. It’s the special province of philosophers, who know exactly how badly they fall short of ordinary human standards, but who have no choice if they intend to publish “not one word only, but many.” Solitary work is unhealthy, even inhuman, but it can hardly be avoided if you want to write Twilight of the Idols. As Nietzsche notes elsewhere in the same book: “To live alone you must be an animal or a god—says Aristotle. He left out the third case: you must be both—a philosopher.”
The war of ideas
Over the last few days, I’ve been thinking a lot about a pair of tweets. One is from Susan Hennessy, an editor for the national security blog Lawfare, who wrote: “Much of my education has been about grasping nuance, shades of gray. Resisting the urge to oversimplify the complexity of human motivation. This year has taught me that, actually, a lot of what really matters comes down to good people and bad people. And these are bad people.” This is a remarkable statement, and in some ways a heartbreaking one, but I can’t disagree with it, and it reflects a growing trend among journalists and other commentators to simply call what we’re seeing by its name. In response to the lies about the students of Marjory Stoneman Douglas High School—including the accusation that some of them are actors—Margaret Sullivan of the Washington Post wrote:
When people act like cretins, should they be ignored? Does talking about their misdeeds merely give them oxygen? Maybe so. But the sliming—there is no other word for it—of the survivors of last week’s Florida high school massacre is beyond the pale…Legitimate disagreement over policy issues is one thing. Lies, conspiracy theories and insults are quite another.
And Paul Krugman went even further: “America in 2018 is not a place where we can disagree without being disagreeable, where there are good people and good ideas on both sides, or whatever other bipartisan homily you want to recite. We are, instead, living in a kakistocracy, a nation ruled by the worst, and we need to face up to that unpleasant reality.”
The other tweet that has been weighing on my mind was from Rob Goldman, a vice president of advertising for Facebook. It was just one of a series of thoughts—which is an important detail in itself—that he tweeted out on the day that Robert Mueller indicted thirteen Russian nationals for their roles in interfering in the presidential election. After proclaiming that he was “very excited” to see the indictments, Goldman said that he wanted to clear up a few points. He had seen “all of the Russian ads” that appeared on Facebook, and he stated: “I can say very definitively that swaying the election was not the main goal.” But his most memorable words, at least for me, were: “The majority of the Russian ad spend happened after the election. We shared that fact, but very few outlets have covered it because it doesn’t align with the main media narrative of Tump [sic] and the election.” This is an astounding statement, in part because it seems to defend Facebook by saying that it kept running these ads for longer than most people assume. But it’s also inexplicable. It may well be, as some observers have contended, that Goldman had a “nuanced” point to make, but he chose to express it on a forum that is uniquely vulnerable to being taken out of context, and to unthinkingly use language that was liable to be misinterpreted. As Josh Marshall wrote:
[Goldman] even apes what amounts to quasi-Trumpian rhetoric in saying the media distorts the story because the facts “don’t align with the main media narrative of Trump and the election.” This is silly. Elections are a big deal. It’s hardly surprising that people would focus on the election, even though it’s continued since. What is this about exactly? Is Goldman some kind of hardcore Trumper?
I don’t think he is. But it also doesn’t matter, at least not when his thoughts were retweeted approvingly by the president himself.
This all leads me to a point that the events of the last week have only clarified. We’re living in a world in which the lines between right and wrong seem more starkly drawn than ever, with anger and distrust rising to an unbearable degree on both sides. From where I stand, it’s very hard for me to see how we recover from this. When you can accurately say that the United States has become a kakistocracy, you can’t just go back to the way things used to be. Whatever the outcome of the next election, the political landscape has been altered in ways that would have been unthinkable even two years ago, and I can’t see it changing during my lifetime. But even though the stakes seem clear, the answer isn’t less nuance, but more. If there’s one big takeaway from the last eighteen months, it’s that the line between seemingly moderate Republicans and Donald Trump was so evanescent that it took only the gentlest of breaths to blow it away. It suggests that we were closer to the precipice than we ever suspected, and unpacking that situation—and its implications for the future—requires more nuance than most forms of social media can provide. Rob Goldman, who should have known better, didn’t grasp this. And while I hope that the students at Marjory Stoneman Douglas do better, I also worry about how effective they can really be. Charlie Warzel of Buzzfeed recently argued that the pro-Trump media has met its match in the Parkland students: “It chose a political enemy effectively born onto the internet and innately capable of waging an information war.” I want to believe this. But it may also be that these aren’t the weapons that we need. The information war is real, but the only way to win it may be to move it into another battlefield entirely.
Which brings us, in a curious way, back to Robert Mueller, who seems to have assumed the same role for many progressives that Nate Silver once occupied—the one man who was somehow going to tell us that everything was going to be fine. But their differences are also telling. Silver generated reams of commentary, but his reputation ultimately came down to his ability to provide a single number, updated in real time, that would indicate how worried we had to be. That trust is clearly gone, and his fall from grace is less about his own mistakes than an overdue reckoning for the promises of data journalism in general. Mueller, by contrast, does everything in private, avoids the spotlight, and emerges every few months with a mountain of new material that we didn’t even know existed. It’s nuanced, qualitative, and not easy to summarize. As the coverage endlessly reminds us, we don’t know what else the investigation will find, but that’s part of the point. At a time in which controversies seem to erupt overnight, dominate the conversation for a day, and then yield to the next morning’s outrage, Mueller embodies the almost anachronistic notion that the way to make something stick is to work on it diligently, far from the public eye, and release each piece only when you’re ready. (In the words of a proverbial saying attributed to everyone from Buckminster Fuller to Michael Schrage: “Never show fools unfinished work.” And we’re all fools these days.) I picture him fondly as the head of a monastery in the Dark Ages, laboriously preserving information for the future, or even as the shadowy overseer of Asimov’s Foundation. Mueller’s low profile allows him to mean whatever we want, of course, and for all I know, he may not be the embodiment of all the virtues that Ralph Waldo Emerson identified as punctuality, personal attention, courage, and thoroughness. I just know that he’s the only one left who might be. Mueller can’t save us by himself.
But his example might just show us the way.