Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Facebook’

The chosen ones

with one comment

In his recent New Yorker profile of Mark Zuckerberg, Evan Osnos quotes one of the Facebook founder’s close friends: “I think Mark has always seen himself as a man of history, someone who is destined to be great, and I mean that in the broadest sense of the term.” Zuckerberg feels “a teleological frame of feeling almost chosen,” and in his case, it happened to be correct. Yet this tells us almost nothing about Zuckerberg himself, because I can safely say that most other undergraduates at Harvard feel the same way. A writer for The Simpsons once claimed that the show had so many presidential jokes—like the one about Grover Cleveland spanking Grandpa “on two non-consecutive occasions”—because most of the writers secretly once thought that they would be president themselves, and he had a point. It’s very hard to do anything interesting in life without the certainty that you’re somehow one of the chosen ones, even if your estimation of yourself turns out to be wildly off the mark. (When I was in my twenties, my favorite point of comparison was Napoleon, while Zuckerberg seems to be more fond of Augustus: “You have all these good and bad and complex figures. I think Augustus is one of the most fascinating. Basically, through a really harsh approach, he established two hundred years of world peace.”) This kind of conviction is necessary for success, although hardly sufficient. The first human beings to walk on Mars may have already been born. Deep down, they know it, and this knowledge will determine their decisions for the rest of their lives. Of course, thousands of others “know” it, too. And just a few of them will turn out to be right.

One of my persistent themes on this blog is how we tend to confuse talent with luck, or, more generally, to underestimate the role that chance plays in success or failure. I never tire of quoting the psychologist Daniel Kahneman, who in Thinking, Fast and Slow shares what he calls his favorite equation:

Success = Talent + Luck
Great Success = A little more talent + A lot of luck

The truth of this statement seems incontestable. Yet we’re all reluctant to acknowledge its power in our own lives, and this tendency only increases as the roles played by luck and privilege assume a greater importance. This week has been bracketed by news stories about two men who embody this attitude at its most extreme. On the one hand, you have Brett Kavanaugh, a Yale legacy student who seems unable to recognize that his drinking and his professional success weren’t mutually exclusive, but closer to the opposite. He occupied a cultural and social stratum that gave him the chance to screw up repeatedly without lasting consequences, and we’re about to learn how far that privilege truly extends. On the other hand, you have yesterday’s New York Times exposé of Donald Trump, who took hundreds of millions of dollars from his father’s real estate empire—often in the form of bailouts for his own failed investments—while constantly describing himself as a self-made billionaire. This is hardly surprising, but it’s still striking to see the extent to which Fred Trump played along with his son’s story. He understood the value of that myth.

This gets at an important point about privilege, no matter which form it takes. We have a way of visualizing these matters in spatial terms—“upper class,” “lower class,” “class pyramid,” “rising,” “falling,” or “stratum” in the sense that I used it above. But true privilege isn’t spatial, but temporal. It unfolds over time, by giving its beneficiaries more opportunities to fail and recover, while those living at the edge might not be able to come back from the slightest misstep. We like to say that a privileged person is someone who was born on third base and thinks he hit a triple, but it’s more like being granted unlimited turns at bat. Kavanaugh provides a vivid reminder, in case we needed one, that a man who fits a certain profile has the freedom to make all kinds of mistakes, the smallest of which would be fatal for someone who didn’t look like he did. And this doesn’t just apply to drunken misbehavior, criminal or otherwise, but even to the legitimate failures that are necessary for the vast majority of us to achieve real success. When you come from the right background, it’s easier to survive for long enough to benefit from the effects of luck, which influences the way that we talk about failure itself. Silicon Valley speaks of “failing faster,” which only makes sense when the price of failure is humiliation or the loss of investment capital, not falling permanently out of the middle class. And as I’ve noted before, Pixar’s creative philosophy, which Andrew Stanton described as a process in which “the films still suck for three out of the four years it takes to make them,” is only practicable for filmmakers who look and sound like their counterparts at the top, which grants them the necessary creative freedom to fail repeatedly—a luxury that women are rarely afforded.

This may all come across as unbelievably depressing, but there’s a silver lining, and it took me years to figure it out. The odds of succeeding in any creative field—which includes nearly everything in which the standard career path isn’t clearly marked—are minuscule. Few who try will ever make it, even if they have “a teleological frame of feeling almost chosen.” This isn’t due to a lack of drive or talent, but of time and second chances. When you combine the absence of any straightforward instructions with the crucial role played by luck, you get a process in which repeated failure over a long period is almost inevitable. Those who drop out don’t suffer from weak nerves, but from the fact that they’ve used up all of their extra lives. Privilege allows you to stay in the game for long enough for the odds to turn in your favor, and if you’ve got it, you may as well use it. (An Ivy League education doesn’t guarantee success, but it drastically increases your ability to stick around in the middle class in the meantime.) In its absence, you can find strategies for minimizing risk in small ways while increasing it on the highest levels, which is just another word for becoming a bohemian. And the big takeaway here is that since the probability of success is already so low, you may as well do exactly what you want. It can be tempting to tailor your work to the market, reasoning that it will increase your chances ever so slightly, but in reality, the difference is infinitesimal. An objective observer would conclude that you’re not going to make it either way, and even if you do, it will take about the same amount of time to succeed by selling out as it would by staying true to yourself. You should still do everything that you can to make the odds more favorable, but if you’re probably going to fail anyway, you might as well do it on your own terms. And that’s the only choice that matters.
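
Since the argument is basically probabilistic, it may help to make the arithmetic of those extra lives explicit. Here’s a minimal sketch in Python, with an invented two percent chance that any single attempt breaks through; the exact numbers don’t matter, only the shape of the curve:

    # Chance of at least one success in n independent attempts,
    # each succeeding with probability p: 1 - (1 - p) ** n.
    p = 0.02  # invented odds that any one book, film, or startup takes off
    for n in (5, 20, 50, 100):
        print(f"{n:>3} attempts: {1 - (1 - p) ** n:.0%}")
    # 5 attempts: 10%; 20: 33%; 50: 64%; 100: 87%.

Privilege, in this toy model, is nothing more than a larger n.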

Written by nevalalee

October 3, 2018 at 8:59 am

The sin of sitzfleisch

leave a comment »

Yesterday, I was reading the new profile of Mark Zuckerberg by Evan Osnos in The New Yorker when I came across one of my favorite words. It appears in a section about Zuckerberg’s wife, Priscilla Chan, who describes her husband’s reaction to the recent controversies that have swirled around Facebook:

When I asked Chan about how Zuckerberg had responded at home to the criticism of the past two years, she talked to me about Sitzfleisch, the German term for sitting and working for long periods of time. “He’d actually sit so long that he froze up his muscles and injured his hip,” she said.

Until now, the term sitzfleisch—literally “sitting flesh,” or the buttocks—was perhaps most widely known in chess, in which it evokes the kind of stoic, patient endurance capable of winning games by making one plodding move after another, but you sometimes see it in other contexts as well. Just two weeks ago, Paul Joyce, a lecturer in German at the University of Portsmouth, was quoted in an article by the BBC: “It’s got a positive sense, [it] positively connotes a sense of endurance, reliability, not just flitting from one place to another, but it is also starting to be questioned as to whether it matches the experience of the modern world.” Which makes it all the more striking to hear it applied to Zuckerberg, whose life’s work has been the systematic construction of an online culture that makes such virtues seem obsolete.

The concept of sitzfleisch is popular among writers—Elizabeth Gilbert has a nice blog post on the subject—but it also has its detractors. A few months ago, I posted a quote from Twilight of the Idols in which Friedrich Nietzsche comes out strongly against the idea. Here’s the full passage, which appears in a section of short maxims and aphorisms:

On ne peut penser et écrire qu’assis (G. Flaubert). Now I’ve got you, you nihilist! Sitting still [sitzfleisch] is precisely the sin against the holy ghost. Only thoughts which come from walking have any value.

The line attributed to Flaubert, which can be translated as “One can think and write only when sitting down,” appears to come from a biographical sketch by Guy de Maupassant. When you read it in context, you can see why it irritated Nietzsche:

From his early infancy, the two distinctive traits of [Flaubert’s] nature were great ingenuousness and a dislike of physical action. All his life he remained ingenuous and sedentary. He could not see any one walking or moving about near him without becoming exasperated; and he would declare in his sharp voice, sonorous and always a little theatrical, that motion was not philosophical. “One can think and write only when seated,” he would say.

On some level, Nietzsche’s attack on sitzfleisch feels like a reaction against his own inescapable habits—he can hardly have written any of his books without the ability to sit in solitude for long periods of time. I’ve noted elsewhere that the creative life has to be conducted both while seated and while engaging in other activities, and that your course of action at any given moment can be guided by whether or not you happen to be sitting down. And it can be hard to strike the right balance. We have to spend time at a desk in order to write, but we often think better by walking, going outside, and pointedly not checking Facebook. In the recent book Nietzsche and Montaigne, the scholar Robert Miner writes:

Both Montaigne and Nietzsche strongly favor mobility over sedentariness. Montaigne is a “sworn enemy” of “assiduity (assiduité)” who goes “mostly on horseback, where my thoughts range most widely.” Nietzsche too finds that “assiduity (Sitzfleisch) is the sin against the Holy Spirit” but favors walking rather than riding. As Dahlkvist observes, Nietzsche may have been inspired by Beethoven’s habit of walking while composing, which he knew about from his reading of Henri Joly’s Psychologie des grands hommes.

That’s possible, but it also reflects the personal experience of any writer, who is often painfully aware of the contradiction of trying to say something about life while spending most of one’s time alone.

And Nietzsche’s choice of words is also revealing. In describing sitzfleisch as a sin against the Holy Ghost, he might have just been looking for a colorful phrase, or making a pun on a “sin of the flesh,” but I suspect that it went deeper. In Catholic dogma, a sin against the Holy Ghost is specifically one of “certain malice,” in which the sinner acts on purpose, repeatedly, and in full knowledge of his or her crime. Nietzsche, who was familiar with Thomas Aquinas, might have been thinking of what the Summa Theologica has to say on the subject:

Augustine, however…says that blasphemy or the sin against the Holy Ghost, is final impenitence when, namely, a man perseveres in mortal sin until death, and that it is not confined to utterance by word of mouth, but extends to words in thought and deed, not to one word only, but to many…Hence they say that when a man sins through weakness, it is a sin “against the Father”; that when he sins through ignorance, it is a sin “against the Son”; and that when he sins through certain malice, i.e. through the very choosing of evil…it is a sin “against the Holy Ghost.”

Sitzfleisch, in short, is the sin of those who should know better. It’s the special province of philosophers, who know exactly how badly they fall short of ordinary human standards, but who have no choice if they intend to publish “not one word only, but many.” Solitary work is unhealthy, even inhuman, but it can hardly be avoided if you want to write Twilight of the Idols. As Nietzsche notes elsewhere in the same book: “To live alone you must be an animal or a god—says Aristotle. He left out the third case: you must be both—a philosopher.”

The war of ideas

with 2 comments

Over the last few days, I’ve been thinking a lot about a pair of tweets. One is from Susan Hennessey, an editor for the national security blog Lawfare, who wrote: “Much of my education has been about grasping nuance, shades of gray. Resisting the urge to oversimplify the complexity of human motivation. This year has taught me that, actually, a lot of what really matters comes down to good people and bad people. And these are bad people.” This is a remarkable statement, and in some ways a heartbreaking one, but I can’t disagree with it, and it reflects a growing trend among journalists and other commentators to simply call what we’re seeing by its name. In response to the lies about the students of Marjory Stoneman Douglas High School—including the accusation that some of them are actors—Margaret Sullivan of the Washington Post wrote:

When people act like cretins, should they be ignored? Does talking about their misdeeds merely give them oxygen? Maybe so. But the sliming—there is no other word for it—of the survivors of last week’s Florida high school massacre is beyond the pale…Legitimate disagreement over policy issues is one thing. Lies, conspiracy theories and insults are quite another.

And Paul Krugman went even further: “America in 2018 is not a place where we can disagree without being disagreeable, where there are good people and good ideas on both sides, or whatever other bipartisan homily you want to recite. We are, instead, living in a kakistocracy, a nation ruled by the worst, and we need to face up to that unpleasant reality.”

The other tweet that has been weighing on my mind was from Rob Goldman, a vice president of advertising for Facebook. It was just one of a series of thoughts—which is an important detail in itself—that he tweeted out on the day that Robert Mueller indicted thirteen Russian nationals for their roles in interfering in the presidential election. After proclaiming that he was “very excited” to see the indictments, Goldman said that he wanted to clear up a few points. He had seen “all of the Russian ads” that appeared on Facebook, and he stated: “I can say very definitively that swaying the election was not the main goal.” But his most memorable words, at least for me, were: “The majority of the Russian ad spend happened after the election. We shared that fact, but very few outlets have covered it because it doesn’t align with the main media narrative of Tump [sic] and the election.” This is an astounding statement, in part because it seems to defend Facebook by saying that it kept running these ads for longer than most people assume. But it’s also inexplicable. It may well be, as some observers have contended, that Goldman had a “nuanced” point to make, but he chose to express it on a forum that is uniquely vulnerable to being taken out of context, and to unthinkingly use language that was liable to be misinterpreted. As Josh Marshall wrote:

[Goldman] even apes what amounts to quasi-Trumpian rhetoric in saying the media distorts the story because the facts “don’t align with the main media narrative of Trump and the election.” This is silly. Elections are a big deal. It’s hardly surprising that people would focus on the election, even though it’s continued since. What is this about exactly? Is Goldman some kind of hardcore Trumper?

I don’t think he is. But it also doesn’t matter, at least not when his thoughts were retweeted approvingly by the president himself.

This all leads me to a point that the events of the last week have only clarified. We’re living in a world in which the lines between right and wrong seem more starkly drawn than ever, with anger and distrust rising to an unbearable degree on both sides. From where I stand, it’s very hard for me to see how we recover from this. When you can accurately say that the United States has become a kakistocracy, you can’t just go back to the way things used to be. Whatever the outcome of the next election, the political landscape has been altered in ways that would have been unthinkable even two years ago, and I can’t see it changing during my lifetime. But even though the stakes seem clear, the answer isn’t less nuance, but more. If there’s one big takeaway from the last eighteen months, it’s that the line between seemingly moderate Republicans and Donald Trump was so evanescent that it took only the gentlest of breaths to blow it away. It suggests that we were closer to the precipice than we ever suspected, and unpacking that situation—and its implications for the future—requires more nuance than most forms of social media can provide. Rob Goldman, who should have known better, didn’t grasp this. And while I hope that the students at Marjory Stoneman Douglas do better, I also worry about how effective they can really be. Charlie Warzel of BuzzFeed recently argued that the pro-Trump media has met its match in the Parkland students: “It chose a political enemy effectively born onto the internet and innately capable of waging an information war.” I want to believe this. But it may also be that these aren’t the weapons that we need. The information war is real, but the only way to win it may be to move it into another battlefield entirely.

Which brings us, in a curious way, back to Robert Mueller, who seems to have assumed the same role for many progressives that Nate Silver once occupied—the one man who was somehow going to tell us that everything was going to be fine. But their differences are also telling. Silver generated reams of commentary, but his reputation ultimately came down to his ability to provide a single number, updated in real time, that would indicate how worried we had to be. That trust is clearly gone, and his fall from grace is less about his own mistakes than an overdue reckoning for the promises of data journalism in general. Mueller, by contrast, does everything in private, avoids the spotlight, and emerges every few months with a mountain of new material that we didn’t even know existed. It’s nuanced, qualitative, and not easy to summarize. As the coverage endlessly reminds us, we don’t know what else the investigation will find, but that’s part of the point. At a time in which controversies seem to erupt overnight, dominate the conversation for a day, and then yield to the next morning’s outrage, Mueller embodies the almost anachronistic notion that the way to make something stick is to work on it diligently, far from the public eye, and release each piece only when you’re ready. (In the words of a proverbial saying attributed to everyone from Buckminster Fuller to Michael Schrage: “Never show fools unfinished work.” And we’re all fools these days.) I picture him fondly as the head of a monastery in the Dark Ages, laboriously preserving information for the future, or even as the shadowy overseer of Asimov’s Foundation. Mueller’s low profile allows him to mean whatever we want him to mean, of course, and for all I know, he may not be the embodiment of all the virtues that Ralph Waldo Emerson identified as punctuality, personal attention, courage, and thoroughness. I just know that he’s the only one left who might be. Mueller can’t save us by himself. But his example might just show us the way.

The A/B Test

with 2 comments

In this week’s issue of The New York Times Magazine, there’s a profile of Mark Zuckerberg by Farhad Manjoo, who describes how the founder of Facebook is coming to terms with his role in the world in the aftermath of last year’s election. I find myself thinking about Zuckerberg a lot these days, arguably even more than I use Facebook itself. We just missed overlapping in college, and with one possible exception, which I’ll mention later, he’s the most influential figure to emerge from those ranks in the last two decades. Manjoo depicts him as an intensely private man obliged to walk a fine line in public, leading him to be absurdly cautious about what he says: “When I asked if he had chatted with Obama about the former president’s critique of Facebook, Zuckerberg paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Zuckerberg is trying to figure out what he believes—and how to act—under conditions of enormous scrutiny, but he also has more resources at his disposal than just about anyone else in history. Here’s the passage in the article that stuck with me the most:

The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition, or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth…This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?”

Reading this, I began to reflect on how rarely we actually test our intuitions. I’ve spoken a lot on this blog about the role of intuitive thinking in the arts and sciences, mostly because it doesn’t get the emphasis it deserves, but there’s also no guarantee that intuition will steer us in the right direction. The psychologist Daniel Kahneman has devoted his career to showing how we tend to overvalue our gut reactions, particularly if we’ve been fortunate enough to be right in the past, and the study of human irrationality has become a rich avenue of research in the social sciences, which are often undermined by poor hunches of their own. It may not even be a matter of right or wrong. An intuitive choice may be better or worse than the alternative, but for the most part, we’ll never know. One of the quirks of Silicon Valley culture is that it claims to base everything on raw data, but it’s often in the service of notions that are outlandish, untested, and easy to misrepresent. Facebook comes closer than any company in existence to the ideal of an endless A/B test, in which the user base is randomly divided into two or more groups to see which approaches are the most effective. It’s the best lab ever developed for testing our hunches about human behavior. (Most controversially, Facebook modified the news feeds of hundreds of thousands of users to adjust the number of positive or negative posts, in order to gauge the emotional impact, and it has conducted similar tests on voter turnout.) And it shouldn’t surprise us if many of our intuitions turn out to be mistaken. If anything, we should expect them to be right about half the time—and if we can nudge that percentage just a little bit upward, in theory, it should give us a significant competitive advantage.
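
For what it’s worth, the mechanism itself takes only a few lines of Python to sketch. Everything below is invented for illustration—the user IDs, the engagement rates, the bare comparison of means—and a real system would add logging, significance testing, and ethical guardrails:

    import hashlib
    import random
    from statistics import mean

    def assign_variant(user_id, variants=("A", "B")):
        # Hash the user ID so the same user always lands in the
        # same bucket, visit after visit.
        digest = hashlib.md5(user_id.encode()).hexdigest()
        return variants[int(digest, 16) % len(variants)]

    # Simulated outcomes: 1 if a user engages, 0 if not. The gap between
    # the two rates (10% vs. 11%) is exactly the kind of nudge "just a
    # little bit upward" described above.
    random.seed(1)
    results = {"A": [], "B": []}
    for i in range(100_000):
        variant = assign_variant(f"user-{i}")
        rate = 0.10 if variant == "A" else 0.11
        results[variant].append(1 if random.random() < rate else 0)

    for variant, outcomes in sorted(results.items()):
        print(f"Variant {variant}: {mean(outcomes):.3f}")

Run over enough users, the better variant reliably surfaces—which is the entire epistemology of the News Feed in miniature.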

So what good is intuition, anyway? I like to start with William Goldman’s story about the Broadway producer George Abbott, who once passed a choreographer holding his head in his hands while the dancers stood around doing nothing. When Abbott asked what was wrong, the choreographer said that he couldn’t figure out what to do next. Abbott shot back: “Well, have them do something! That way we’ll have something to change.” Intuition, as I’ve argued before, is mostly about taking you from zero ideas to one idea, which you can then start to refine. John W. Campbell makes much the same argument in what might be his single best editorial, “The Value of Panic,” which begins with a maxim from the Harvard professor Wayne Batteau: “In total ignorance, try anything. Then you won’t be so ignorant.” Campbell argues that this provides an evolutionary rationale for panic, in which an animal acts “in a manner entirely different from the normal behavior patterns of the organism.” He continues:

Given: An organism with N characteristic behavior modes available. Given: An environmental situation which cannot be solved by any of the N available behavior modes, but which must be solved immediately if the organism is to survive. Logical conclusion: The organism will inevitably die. But…if we introduce Panic, allowing the organism to generate a purely random behavior mode not a member of the N modes characteristically available?

Campbell concludes: “When the probability of survival is zero on the basis of all known factors—it’s time to throw in an unknown.” In extreme situations, the result is panic; under less intense circumstances, it’s a blind hunch. You can even see them as points on a spectrum, the purpose of which is to provide us with a random action or idea that can then be revised into something better, assuming that we survive for long enough. But sometimes the animal just gets eaten.
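
Campbell’s logic is concrete enough to run as a toy program. In this sketch—every number invented—survival requires a response in a narrow band that none of the organism’s habitual modes happens to cover, so the only remaining move is a random one:

    import random

    def survive(problem, known_modes, panic_tries=1000):
        # First, exhaust the N characteristic behavior modes.
        for mode in known_modes:
            if problem(mode):
                return mode, "habit"
        # "Time to throw in an unknown": purely random behavior.
        for _ in range(panic_tries):
            mode = random.random()
            if problem(mode):
                return mode, "panic"
        return None, "eaten"  # sometimes the animal just gets eaten

    def problem(response):
        # The narrow escape route: no habitual response covers it.
        return 0.90 < response < 0.91

    known_modes = [0.1, 0.3, 0.5, 0.7]  # the organism's usual repertoire
    print(survive(problem, known_modes))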

The idea of refinement, revision, or testing is inseparable from intuition, and Zuckerberg has been granted the most powerful tool imaginable for asking hard questions and getting quantifiable answers. What he does with it is another matter entirely. But it’s also worth looking at his only peer from college who could conceivably challenge him in terms of global influence. On paper, Mark Zuckerberg and Jared Kushner have remarkable similarities. Both are young Jewish men—although Kushner is more observant—who were born less than four years and sixty miles apart. Kushner, whose acceptance to Harvard was so manifestly the result of his family’s wealth that it became a case study in a book on the subject, was a member of the final clubs that Zuckerberg badly wanted to join, or so Aaron Sorkin would have us believe. Both ended up as unlikely media magnates of a very different kind: Kushner, like Charles Foster Kane, took over a New York newspaper from a man named Carter. Yet their approaches to their newfound positions couldn’t be more different. Kushner has been called “a shadow secretary of state” whose portfolio includes Mexico, China, the Middle East, and the reorganization of the federal government, but it feels like one long improvisation, on the apparent assumption that he can wing it and succeed where so many others have failed. As Bruce Bartlett writes in the New York Times, without a staff, Kushner “is just a dilettante meddling in matters he lacks the depth or the resources to grasp,” and we may not have a chance to recover if his intuitions are wrong. In other words, he resembles his father-in-law, as Frank Bruni notes:

I’m told by insiders that when Trump’s long-shot campaign led to victory, he and Kushner became convinced not only that they’d tapped into something that everybody was missing about America, but that they’d tapped into something that everybody was missing about the two of them.

Zuckerberg and Kushner’s lives ran roughly in parallel for a long time, but now they’re diverging at a point at which they almost seem to be offering us two alternate versions of the future, like an A/B test with only one possible outcome. Neither is wholly positive, but that doesn’t make the choice any less stark. And if you think this sounds farfetched, bookmark this post, and read it again in about six years.

The Ian Malcolm rule

with one comment

Jeff Goldblum in Jurassic Park

A man is rich in proportion to the number of things he can afford to leave alone.

—Henry David Thoreau, Walden

Last week, at the inaugural town hall meeting at Facebook headquarters, one brave questioner managed to cut through the noise and press Mark Zuckerberg on the one issue that really matters: what’s the deal with that gray shirt he always wears? Zuckerberg replied:

I really want to clear my life to make it so I have to make as few decisions as possible about anything except how to best serve this community…I’m in this really lucky position where I get to wake up every day and help serve more than a billion people. And I feel like I’m not doing my job if I spend any of my energy on things that are silly or frivolous about my life…So even though it kind of sounds silly—that that’s my reason for wearing a gray t-shirt every day—it also is true.

There’s a surprising amount to unpack here, starting with the fact, as Allison P. Davis of New York Magazine points out, that it’s considerably easier for a young white male to always wear the same clothes than for a woman in the same situation. It’s also worth noting that wearing the exact same shirt each day turns simplicity into a kind of ostentation: there are ways of minimizing the amount of time you spend thinking about your wardrobe without calling attention to it so insistently.

Of course, Zuckerberg is only the latest in a long line of high-achieving nerds who insist, rightly or wrongly, that they have more important things to think about than what they’re going to wear. There’s more than an echo here of the dozens of black Issey Miyake turtlenecks that were stacked in Steve Jobs’s closet, and in the article linked above, Vanessa Friedman of The New York Times also notes that Zuckerberg sounds a little like Obama, who told Michael Lewis in Vanity Fair: “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.” Even Christopher Nolan gets into the act, as we learn in the recent New York Times Magazine profile by Gideon Lewis-Kraus:

Nolan’s own look accords with his strict regimen of optimal resource allocation and flexibility: He long ago decided it was a waste of energy to choose anew what to wear each day, and the clubbable but muted uniform on which he settled splits the difference between the demands of an executive suite and a tundra. The ensemble is smart with a hint of frowzy, a dark, narrow-lapeled jacket over a blue dress shirt with a lightly fraying collar, plus durable black trousers over scuffed, sensible shoes.

Mark Zuckerberg

If you were to draw a family tree connecting all these monochromatic Vulcans, you’d find that, consciously or not, they’re all echoing their common patron saint, Ian Malcolm in Jurassic Park, who says:

In any case, I wear only two colors, black and gray…These colors are appropriate for any occasion…and they go well together, should I mistakenly put on a pair of gray socks with my black trousers…I find it liberating. I believe my life has value, and I don’t want to waste it thinking about clothing.

As Malcolm speaks, Crichton writes, “Ellie was staring at him, her mouth open”—apparently stunned into silence, as all women would be, at this display of superhuman rationality. And while it’s easy to make fun of it, I’m basically one of those guys. I eat the same breakfast and lunch every day; my daily uniform of polo shirt, jeans, and New Balance sneakers rarely, if ever, changes; and I’ve had the same haircut for the last eighteen years. If pressed, I’d probably offer a rationale more or less identical to the ones given above. As a writer, I’m called upon to solve a series of agonizingly specific problems each time I sit down at my desk, so the less headspace I devote to everything else, the better.

Which is all well and good. But it’s also easy to confuse the externals with their underlying intention. The world, or at least the Bay Area, is full of young guys with the Zuckerberg look, but it doesn’t matter how little time you spend getting dressed if you aren’t mindfully reallocating the time you save, or extending the principle beyond the closet. The most eloquent defense of minimizing extraneous thinking was mounted by the philosopher Alfred North Whitehead, who writes:

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

Whitehead isn’t talking about his shirts here; he’s talking about the Arabic number system, a form of “good notation” that frees the mind to think about more complicated problems. Which only reminds us that the shirts you wear won’t make you more effective if you aren’t being equally thoughtful about the decisions that really count. Otherwise, they’re only an excuse for laziness or indifference, which is just as contagious as efficiency. And it often comes to us as a wolf in nerd’s clothing.
