Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Archive for March 2017

Twenty-four hour potty people

leave a comment »

Toilet Training in Less Than a Day

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on January 5, 2016.

Toilet Training in Less Than a Day is a slim little book originally published in 1974 by the child psychologists Nathan H. Azrin and Richard M. Foxx. I picked up a copy a year ago, after realizing that I was long overdue in potty training my daughter, who had just turned three. Until then, my wife and I had somehow hoped that the problem would solve itself: we bought a Baby Bjorn toilet trainer seat, and after Beatrix took an interest in it, we allowed ourselves to think that she might even teach herself on her own. That didn’t happen, of course—after an initial burst of activity, the seat just gathered dust for a year, and as the days continued to pass, I figured that I might as well do it once and for all. Her winter break from playschool, which happened to coincide with a slow period in my writing, presented an obvious opportunity. And I was drawn to the book by Azrin and Foxx primarily because of the promise conveyed in its title: given enough advance preparation on the parent’s part, toilet training could be over in a day or two of focused work, rather than conducted at a lower level of intensity over a period of weeks. (I don’t intend to go into the method in detail, and you really do need to read the text in its entirety, but you can find a summary of the approach here.) But as I studied the book and gathered the necessary equipment, my wife cautioned me against having unrealistic expectations: “If it were really possible to toilet train in a day,” she said, “potty training wouldn’t be such a thing.”

At this point in the story, you’d expect some kind of a twist: I’d tell you that the training didn’t work at all, or that there were shortcomings in the method that weren’t evident from a cursory reading. But it worked just great, with a few caveats that I’ll discuss in a moment. The title is a bit misleading: the training itself took about a day and a half, but it was preceded by three nights in which I read the book twice from cover to cover, took notes, and did my best to memorize the instructions. What it proposes, in effect, is a kind of four-hour seminar on the subject of potty training: you sit your child down in the kitchen or some other room with an easily washable floor and minimal distractions, and you essentially conduct a motivational talk, a la Tony Robbins, about the benefits of the toilet, complete with rehearsals and demonstrations. It requires a doll that wets, which was surprisingly hard to find, and a potty that provides feedback when the child successfully does her business. (The one that I ended up purchasing plays a merry tune and says: “Yay!”) You use the doll to demonstrate the process a few times, including a dramatization of what happens when she wets her pants, and then proceed to a series of practice runs, with dryness checks conducted every five minutes and potty trials every quarter of an hour. A supply of snacks, drinks, and other rewards is kept on hand, along with a “friends who care” list to remind her that her grandparents, friends, and Santa Claus are all proud of her. And you’re supposed to talk about nothing else but the potty for however long it takes.

Toilet Training in Less Than a Day

The result, within twenty-four hours, was that my daughter was able to run to the potty by herself whenever she needed to go, lower her pants, relieve herself, wipe, put the tissue in the adjacent toilet, raise her pants, empty out the pot, flush, and even use the closed potty seat as a stool to wash her hands in the sink. Azrin and Foxx aim to have the entire process be totally autonomous, so that you aren’t even aware that your child is using the potty until you hear the flushing sound from the next room. We’ve had a few accidents, and keeping dry overnight is a challenge we’re still saving for another day, but a year later, I have no complaints. And reports from other parents mostly seem to verify this. When you look at reviews of the book online, which are overwhelmingly positive, you find that they fall into two categories. If it worked for your child, the book is great; if it didn’t, it needs to be taken out of print. And it’s easy to overlook the fact that a lot of luck seems to be involved in either case. (There’s also a touch of negative reinforcement that might make some parents, including me, a little uncomfortable: when your child has an accident, you’re supposed to say “No!” in a firm voice, and then conduct ten rapid practice runs before having her change her pants and clean up her mess. Beatrix sure didn’t like it, but it seemed to work.) I also have a feeling that it was the extended amount of time that we spent together, rather than any specific drills, that made the difference: the pants inspections and practices are useful in themselves, but they’re even more valuable as a way to structure the hours upon hours of attention that the approach requires.

And I ended up reflecting a lot about the tradeoffs involved. Yesterday, while talking about baking bread using the no-knead method, I pointed out the inverse relationship between time and intensity in everything we do: you can make up for one by ramping up the other, and the results are much the same whether you engage in intense effort for a short stretch of time or less focused work for a longer period. In my case, I decided that I preferred one day of concentrated training, rather than drawing the process out over weeks, but the amount of energy that I expended ended up being pretty much equivalent. There were also small, unpredictable externalities that arose with the accelerated approach: Beatrix became restless at night, apparently because she was worried about wetting her pull-ups, and I ended up spending a lot of extra time trying to figure out what she needed. There’s no free lunch, in other words: the universe of potty training is basically an efficient market, and what you gain in saved time with the Azrin and Foxx method you pay for in other ways, as a kind of karmic compensation. That’s true of anything in life, but it’s particularly true of parenting: anything that looks like a shortcut probably isn’t, when you take the larger picture into account. I’m proud of my daughter and myself for having gotten through the process so quickly, but it isn’t for everyone, and it probably isn’t an accident that most parents choose a less intensive approach. You think that you’re training your child to use the potty, but when you’re done, you find that you’ve also been quietly training yourself.

Written by nevalalee

March 31, 2017 at 9:00 am

Quote of the Day

leave a comment »

All those whose success in life depends neither upon a job which satisfies some specific and unchanging social need, like a farmer’s, nor, like a surgeon’s, upon some craft which he can be taught by others and improve by practice, but upon “inspiration,” the lucky hazard of ideas, live by their wits, a phrase which carries a slightly pejorative meaning. Every “original” genius, be he an artist or a scientist, has something a bit shady about him, like a gambler or madman.

W.H. Auden, The Dyer’s Hand

Written by nevalalee

March 31, 2017 at 7:30 am

The rendering time

leave a comment »

No-knead bread

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 30, 2015.

Last year, I went through a period in which I was baking a lot of bread at home, initially as an activity to share with my daughter. Not surprisingly, I relied entirely on the no-knead recipe first developed by Jim Lahey and popularized by Mark Bittman over a decade ago in the New York Times. As many amateur bakers know, it’s simplicity itself: instead of kneading, you mix a very wet dough with a tiny amount of yeast, and then let it rise for about eighteen hours. Bittman quotes Harold McGee, author of the legendary tome On Food and Cooking, who says:

It makes sense. The long, slow rise does over hours what intensive kneading does in minutes: it brings the gluten molecules into side-by-side alignment to maximize their opportunity to bind to each other and produce a strong, elastic network. The wetness of the dough is an important piece of this because the gluten molecules are more mobile in a high proportion of water, and so can move into alignment easier and faster than if the dough were stiff.

Bittman continues: “Mr. McGee said he had been kneading less and less as the years have gone by, relying on time to do the work for him.” And the results, I can confirm, are close to foolproof: even if you’re less than precise or make a few mistakes along the way, as I tend to do, you almost always get a delicious, light, crusty loaf.

And the idea that you can use the power of time to achieve results that would otherwise require intensive work is central to much of modernist cuisine, as the freelance food scientist Nathan Myhrvold notes in his massive book of the same name. Government food safety guidelines, he points out, are based on raising the core temperature of meat to a certain minimum, which is often set unreasonably high to account for different cooking styles and impatient chefs. In reality, most pathogens are killed by temperatures as low as 120 degrees Fahrenheit—but only if the food has been allowed to cook for a sufficient length of time. The idea that a lower temperature can be counterbalanced by a longer time is the basic premise behind sous vide, in which food is cooked in a warm water bath for hours rather than more rapidly over high heat. This works because you’re trading one kind of precision for another: the temperature is carefully controlled over the course of the cooking process, but once you’re past a certain point, you can be less precise about the time. If you’ve ever prepared a meal in a crock pot, you already know this, and the marvel of sous vide lies in how it applies the same basic insight to a wider variety of recipes. (In fact, there’s a little gadget that you can buy for less than a hundred dollars that can convert any crock pot into a sous vide machine, and although I haven’t bought one for myself yet, I intend to try it one of these days.)

Sous vide

But the relationship between intensity and time has applications far beyond the kitchen. Elsewhere, I’ve talked about the rendering time that all creative acts seem to require: it seems that you just have to live with a work of art for a certain period, and if your process has become more efficient, you still fill that time by rendering or revising the work. As Blinn’s Law states: “As technology advances, rendering time remains constant.” And rendering, of course, is also a term from the food industry, in which the inedible waste from the butcher shop is converted, using time and heat, into something useful or delicious. But one lesson that artists quickly learn is that time can be used in place of intensity, as well as the other way around. Many of the writing rules that I try to follow—trim ten percent from each draft, cut the beginning and ending of every scene, overlap the action, remove transitional moments—are tricks to circumvent a protracted revision process, with intense work and scrutiny over a focused window taking the place of a longer, less structured engagement. If I just sat and fiddled with the story for months or years, I’d probably end up making most of the same changes, but I use these rules of thumb to hurry up the revisions that I would have made anyway. They aren’t always right, and they can’t entirely take the place of an extended period of living with a story, but I can rely on them to get maybe ninety percent of the way there, and the time I save more than compensates for that initial expenditure of energy.

And art, like cooking, often consists of finding the right balance between time and intensity. I’ve found that I write best in bursts of focused activity, which is why I try to keep my total working time for a short story to a couple of weeks or so. But I’ve also learned to set the resulting draft aside for a while before the final revision and submission, which allows me to subconsciously work through the remaining problems and find any plot holes. (On a few occasions that I haven’t done this, I’ve submitted a story only to realize within a day or two that I’d overlooked something important.) The amount of real work I do remains the same, but like dough rising quietly on the countertop, the story has time to align itself in my brain while I’m occupied with other matters. And while time can do wonders for any work of art, the few good tricks I use to speed up the process are still necessary: you aren’t likely to give up on your dough just because it takes an extra day to rise, but the difference between a novel that takes twelve months to write and one that takes three years often amounts to one you finish and one you abandon. The proper balance depends on many outside factors, and you may find that greater intensity and less time, or vice versa, is the approach you need to make it fit with everything else in your life. But baking no-knead bread reminded me that we have a surprising amount of control over the relationship between the two. And even though I’m no longer baking much these days, I’m always thinking about what I can set to rise, or render, right now.

Written by nevalalee

March 30, 2017 at 9:00 am

Quote of the Day

leave a comment »

[The] spending of the best part of one’s life earning money in order to enjoy a questionable liberty during the least valuable part of it reminds me of the Englishman who went to India to make a fortune first, in order that he might return to England and live the life of a poet. He should have gone up garret at once.

Henry David Thoreau, Walden

Written by nevalalee

March 30, 2017 at 7:30 am

Cutty Sark and the semicolon

leave a comment »

Vladimir Nabokov

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 22, 2015.

In an interview that was first published in The Paris Review, the novelist Herbert Gold asked Vladimir Nabokov if an editor had ever offered him any useful advice. This is what Nabokov said in response:

By “editor” I suppose you mean proofreader. Among these I have known limpid creatures of limitless tact and tenderness who would discuss with me a semicolon as if it were a point of honor—which, indeed, a point of art often is. But I have also come across a few pompous avuncular brutes who would attempt to “make suggestions” which I countered with a thunderous “stet!”

I’ve always adored that thunderous stet, which tells us so much about Nabokov and his imperious resistance to being edited by anybody. Today, however, I’m more interested in the previous sentence. A semicolon, as Nabokov puts it, can indeed be a point of honor. Nabokov was perhaps the most painstaking of all modern writers, and it’s no surprise that the same perfectionism that produced such conceptual and structural marvels as Lolita and Pale Fire would filter down to the smallest details. But I imagine that even ordinary authors can relate to how a single punctuation mark in a manuscript can start to loom as large as the finger of God on the Sistine Chapel ceiling.

And there’s something about the semicolon that seems to inspire tussles between writers and their editors—or at least allows it to stand as a useful symbol of the battles that can occur during the editorial process. Here’s an excerpt from a piece by Charles McGrath in The New York Times Magazine about the relationship between Robert Caro, author of The Years of Lyndon Johnson, and his longtime editor Robert Gottlieb:

“You know that insane old expression, ‘The quality of his defect is the defect of his quality,’ or something like that?” Gottlieb asked me. “That’s really true of Bob. What makes him such a genius of research and reliability is that everything is of exactly the same importance to him. The smallest thing is as consequential as the biggest. A semicolon matters as much as, I don’t know, whether Johnson was gay. But unfortunately, when it comes to English, I have those tendencies, too, and we could go to war over a semicolon. That’s as important to me as who voted for what law.”

It’s possible that the semicolon keeps cropping up in such stories because its inherent ambiguity lends itself to disagreement. As Kurt Vonnegut once wrote: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.” And I’ve more or less eliminated semicolons from my own work for much the same reason.

Robert De Niro and Martin Scorsese on the set of Raging Bull

But the larger question here is why artists fixate on things that even the most attentive reader would pass over without noticing. On one level, you could take a fight over a semicolon as an illustration of the way that the creative act—in which the artist is immersed in the work for months on end—tends to turn molehills into mountains. Here’s one of my favorite stories about the making of Raging Bull:

One night, when the filmmakers were right up against the deadline to make their release date, they were working on a nothing little shot that takes place in a nightclub, where a minor character turns to the bartender and orders a Cutty Sark. “I can’t hear what he’s saying,” [Martin Scorsese] said. Fiddling ensued—extensive fiddling—without satisfying him. [Producer Irwin] Winkler, who was present, finally deemed one result good enough and pointed out that messengers were standing by to hand-carry release prints to the few theaters where the picture was about to premiere. At which point, Scorsese snapped. “I want my name taken off the picture,” he cried—which bespeaks his devotion to detail. It also bespeaks his exhaustion at the end of Raging Bull, not to mention the craziness that so often overtakes movies as they wind down. Needless to say, he was eventually placated. And you can more or less hear the line in the finished print.

And you could argue that this kind of microscopic attention is the only thing that can lead to a work that succeeds on the largest possible scale.

But there’s yet another story that gets closer to the truth. In Existential Errands, Norman Mailer describes a bad period in his life—shortly after he was jailed for stabbing his second wife, Adele—in which he found himself descending into alcoholism and unable to work. His only source of consolation was the scraps of paper, “little crossed communications from some wistful outpost of my mind,” that he would find in his jacket pocket after a drunken night. Mailer writes of these poems:

I would go to work, however, on my scraps of paper. They were all I had for work. I would rewrite them carefully, printing in longhand and ink, and I would spend hours whenever there was time going over these little poems…And since I wasn’t doing anything else very well in those days, I worked the poems over every chance I had. Sometimes a working day would go by, and I might put a space between two lines and remove a word. Maybe I was mending.

Which just reminds us that a seemingly minuscule change can be the result of a prolonged confrontation with the work as a whole. You can’t obsess over a semicolon without immersing yourself in the words around it, and there are times when you need such a focal point to structure your engagement with the rest. It’s a little like what is called a lakshya in yoga: the tiny spot on the body or in the mind on which you concentrate while meditating. In practice, the lakshya can be anything or nothing, but without it, your attention tends to drift. In art, it can be a semicolon, a word, or a line about Cutty Sark. It may not be much in itself. But when you need to tether yourself to something, even a semicolon can be a lifeline.

Quote of the Day

leave a comment »

Why should the final test of plot, character, story, and the other ingredients of a novel lie in their power to imitate life? Why should a real chair be better than an imaginary elephant?

Virginia Woolf, The Essays of Virginia Woolf: 1925-1928

Written by nevalalee

March 29, 2017 at 7:30 am

The weight of paper

leave a comment »

Geological map by Henry Darwin Rogers

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 12, 2015.

Take a look at the map above, which was the work of the American geologist Henry Darwin Rogers. As the legend on the right indicates, its various colors represent different rock formations. It’s obvious that some areas are larger than others, but how would you measure the difference? When Charles Darwin—no relation—was writing The Origin of Species, he was faced with exactly this problem, and his answer was an elegant one: “I have estimated the areas by cutting out and weighing the paper.” And while his solution reminds us, in the words of Stanley Edgar Hyman, that “there is something formidable and relentless about [Darwin’s] active involvement” in personally investigating everything that affected his argument, it also testifies to the weight of paper. We often treat paper as a two-dimensional surface with zero thickness, but it isn’t, of course. In the old days, anyone who sent a letter by airmail became acutely aware of its physical properties, and publishers still have to think about it today. Above a certain size, a book becomes harder and more expensive to produce, which has subtly influenced the length of the books we’re used to reading. (A few titles, like Robert Caro’s The Power Broker or Vincent Bugliosi’s Reclaiming History, seem determined to push the limits of how many words can be packed between two covers.) But while I’ve spoken frequently here about the importance of using pen and paper to work out ideas, I’ve generally thought of it in terms of the act of writing with ink, and I haven’t given nearly enough emphasis to the properties of the paper itself.

I got to thinking about this while reading a blog post a while back by the tabletop game designer Max Temkin—most famous for Cards Against Humanity—on the testing process behind a game called Secret Hitler. It’s full of useful advice, like this: “Jon Sharp taught me a great rule for iterating based on observed player feedback: ‘double or half.’ If something isn’t working, double it or cut it in half to quickly diagnose the problem. I like to think of this as the ‘Dr. House’ approach to game design.” But what I liked about it the most, aside from its fantastic pictures of game prototypes, is how the physical feedback provided by the paper itself informed the design process. Temkin started testing the game with blank playing cards and generic card sleeves, and if you want to get even cheaper, he recommends pasting slips of paper over cards from the free sample packs you get at Magic: The Gathering events. (Temkin writes: “Nobody wants them except for game designers, who usually jump at the opportunity to fill their backpacks with cheap cardboard rectangles that are great for prototyping.” Which reminds me of how I like to hoard business cards, which are the perfect size for notes or putting together an outline.) And the physical cards led to immediate insights about what had to be fixed. For instance: “Secret Hitler uses several different kinds of cards, and we found that players were sometimes confused about what was what…Once the policy cards were a different size and shape, players could easily differentiate them from other cards in the game.”

Prototype for Secret Hitler

And while this sort of prototype seems like an obvious step in testing a tabletop game, it can also be useful for games that are meant to be played in a digital form. In his excellent book The Art of Game Design—which Kevin Kelly of Cool Tools has called “one of the best guides for designing anything that demands complex interaction”—Jesse Schell writes:

If you are clever, you can prototype your fancy video game idea as a simple board game, or what we sometimes call a paper prototype. Why do this? Because you can make board games fast, and often capture the same gameplay. This lets you spot problems sooner—much of the process of prototyping is about looking for problems, and figuring out how to fix them, so paper prototyping can be a real time saver.

Schell goes on to note that this approach is more intuitive for a turn-based game, but it can even be useful for games that unfold in real time. To prototype Tetris, for example, you could cut out pieces of cardboard with a razor blade and move them around the table: “This would not be a perfect Tetris experience, but it might be close enough for you to see if you had the right kinds of shapes, and also enough to give you some sense of how fast the pieces should drop.” And even for a game like Doom, you could put together something with graph paper, paper tokens, and a metronome to tick off the seconds: “This will give the feeling of playing the whole thing in slow motion, but that can be a good thing, because it gives you time to think about what is working and not working while you are playing the game.”

And what all these approaches have in common is the fact that paper, which is inherently rather slow and clumsy to manipulate, forces you to think more urgently about what is interfering with the user experience. Anything that the player shouldn’t have to think about consciously while playing, like physically keeping track of the cards, ought to be ruthlessly edited out, and the paper prototype magnifies such problems so that they can’t be ignored. (They can also be revealing in other ways. Temkin notes, delightfully, that the game piece being handled by the players who were assigned the role of Hitler became more worn than the rest, since it was the role that generated the most anxiety.) And this seems to be as true of outlining a novel as it is of testing a game. When I use cards to map out the action of a story, I stack them in piles—sorting each card by character, scene, or theme—and I can tell at a glance which piles are larger than the others. A stack that seems too small should either be beefed up or combined with something else, while one that is too large to handle comfortably should be culled or split into two or more pieces. You can even draw conclusions from which cards have become tattered from being handled the most, and I imagine that for projects of a certain size, you could even weigh the cards, as Darwin did, to get a quick sense of each section’s relative bulk. You don’t get this kind of information when you’re laying out the whole thing in text files, as I’ve recently found myself doing, which is just a reminder that I really should get back to my cards. In writing, as in any creative endeavor, you can’t afford to ignore any potential source of insight, and if you put it down on paper, you’ll do a better job of playing the hand you’ve been dealt.

Quote of the Day

leave a comment »

Written by nevalalee

March 28, 2017 at 7:30 am


The Chinese Room

with 4 comments

In 1980, the philosopher John Searle presented a thought experiment that has become known as the Chinese Room. I first encountered it in William Poundstone’s book Labyrinths of Reason, which describes it as follows:

Imagine that you are confined to a locked room. The room is virtually bare. There is a thick book in the room with the unpromising title What to Do If They Shove Chinese Writing Under the Door. One day a sheet of paper bearing Chinese script is shoved underneath the locked door. To you, who know nothing of Chinese, it contains meaningless symbols, nothing more…You are supposed to scan the text for certain Chinese characters and keep track of their occurrences according to complicated rules outlined in the book…The next day, you receive another sheet of paper with more Chinese writing on it…The book has further instructions for correlating and manipulating the Chinese symbols on the second sheet, and combining this information with your work from the first sheet. The book ends with instructions to copy certain Chinese symbols…onto a fresh sheet of paper. Which symbols you copy depends, in a very complicated way, on your previous work. Then the book says to shove the new sheet under the door of your locked room. This you do.

Unknown to you, the first sheet of Chinese characters was a Chinese short story, and the second sheet was questions about the story, such as might be asked in a reading test…You have been manipulating the characters via a very complicated algorithm written in English…The algorithm is so good that the “answers” you gave are indistinguishable from those that a native speaker of Chinese would give, having read the same story and been asked the same questions.

Searle concludes that this scenario is essentially identical to that of a computer program operating on a set of symbols, and that it refutes the position of strong artificial intelligence, which he characterizes as the belief that “the appropriately programmed computer with the right inputs and outputs would thereby have a mind in exactly the same sense human beings have minds.” According to Searle, it’s clear that there isn’t any “mind” or “understanding” involved here:

As regards the first claim, it seems to me quite obvious in the example that I do not understand a word of the Chinese stories. I have inputs and outputs that are indistinguishable from those of the native Chinese speaker, and I can have any formal program you like, but I still understand nothing.

I’ve never been convinced by this argument, in part because I approached it through the work of Douglas R. Hofstadter, who calls it “a quintessential ‘bad meme’—a fallacious but contagious virus of an idea, similar to an annoying childhood disease such as measles or chicken pox.” (If it’s a bad meme, it’s one of the all-time greats: the computer scientist Pat Hayes once jokingly defined cognitive science as “the ongoing research program of showing Searle’s Chinese Room Argument to be false.”) The most compelling counterargument, at least to me, is that Searle is deliberately glossing over how this room really would look. As Hofstadter notes, any program capable of performing in the manner described would consist of billions or trillions of lines of code, which would require a library the size of an aircraft carrier. Similarly, even the simplest response would require millions of individual decisions, and the laborious approach that Searle presents here would take years for a single exchange. If you try to envision a version of the Chinese Room that could provide answers in real time, you end up with something considerably more impressive, of which the human being in the room—with whom we intuitively identify—is just a single component. In this case, the real “understanding” resides in the fantastically complicated and intricate system as a whole, a stance of which Searle dismissively writes in his original paper: “It is not easy for me to imagine how someone who was not in the grip of an ideology would find the idea at all plausible.”

In other news, a lawsuit was filed last week against John Searle and the Regents of the University of California, where he has taught for decades, accusing him of sexual harassment. The plaintiff is a twenty-four-year-old woman, Joanna Ong, who was employed as Searle’s research assistant for three months. The complaint states:

On or about July 22, 2016, after only a week of working together, Searle sexually assaulted Ong. On that date, he asked his previous research assistant to leave his office. He then locked the door behind the assistant and then went directly to Ong to grope her. Professor Searle slid his hands down the back of her spine to her buttocks and told Ong that “they were going to be lovers,” that he had an “emotional commitment to making her a public intellectual,” and that he was “going to love her for a long time.”

When Ong took her story to the director of the John Searle Center for Social Ontology, she was allegedly told that Searle “has had sexual relationships with his students and others in the past in exchange for academic, monetary, or other benefits.” No further attempt was made to investigate or respond to her claim, and the incidents continued. According to Ong, Searle asked her to log onto a “sugar daddy” website on his behalf and watched online pornography in her presence. The complaint adds: “On one occasion, when Ong”—who is Asian-American—“brought up the topic of American Imperialism as a discussion topic, Searle responded: ‘American Imperialism? Oh boy, that sounds great honey! Let’s go to bed and do that right now.’” When Ong complained again, the lawsuit states, she was informed that none of these issues would be addressed, and she ultimately lost her job. Earlier this month, Searle ceased to teach his undergraduate course on “Philosophy of Mind,” with university officials alluding to undisclosed “personal reasons.” As far as I know, neither Searle’s attorney nor anyone at the university has commented on the allegations.

Now let’s get back to the Chinese Room. At its heart, the argument comes down to a contest between dueling intuitions. Proponents of strong artificial intelligence have the intuition, or the “ideology,” that consciousness can emerge from a substrate other than the biological material of the brain, and Searle doesn’t. To support his position, he offers up a thought experiment, which Daniel C. Dennett once called “an intuition pump,” that is skewed to encourage the reader to arrive at a misleading conclusion. As Hofstadter puts it: “Either Searle…[has] a profound disrespect for the depth of the human mind, or—far more likely—he knows it perfectly well but is being coy about it.” It reduces an incomprehensibly complicated system to a user’s manual and a pencil, and it encourages us to identify with a human figure who is really just a cog in a much vaster machine. Even the use of Chinese itself, which Searle says he isn’t sure he could distinguish from “meaningless squiggles,” is a rhetorical trick: it would come off as subtly different to many readers if it involved, say, Hungarian. (In a response to one of his critics, Searle conceives of a system of water pipes in which “each water connection corresponds to a synapse in the Chinese brain,” while a related scenario asks what would happen if every Chinese citizen were asked to play the role of a single neuron. I understand that these thought experiments are taking their cues from Searle’s original paper, but maybe we should just leave the Chinese alone.) And while I don’t know if Searle’s actions amounted to sexual harassment, Ong’s sense of humiliation seems real enough, which implies that he was guilty, if nothing else, of a failure of empathy—which is really just a word for our intuition about the inner life of another person. In many cases, sexual harassment can be generously viewed as a misreading of what another person needs, wants, or feels, and it’s often a willful one: the harasser skews the evidence to justify a pattern of behavior that he has already decided to follow. If the complaint can be believed, Searle evidently has trouble empathizing with or understanding minds that are different from his own. Maybe he even convinced himself that he was in the right. But it wouldn’t have been the first time.

Written by nevalalee

March 27, 2017 at 9:07 am

Quote of the Day

leave a comment »

Written by nevalalee

March 27, 2017 at 7:30 am

The whole and the good

leave a comment »

The task of maintaining oneself as a locus for the free resolution of conflicting responses will make a far greater demand upon one’s “moral” energy than any that has been made before. For the good person to realize that it is better to be whole than to be good is to enter on a straight and narrow path compared to which one’s previous rectitude was flowery license. To have no more responsibility for oneself is to become incessantly responsible; and from the place where that paradox has meaning it is easy to discern that what is called “moral responsibility” is only a somewhat crooked expedient for avoiding all real responsibility whatever.

John Middleton Murry, God: An Introduction to the Science of Metabiology

Written by nevalalee

March 26, 2017 at 7:30 am

The utility composer

with 2 comments

In literature, the man who has neither the vision, the imagination, the sense of beauty, or the wit that are popularly supposed to go to the production of a poem, novel, or play, can turn his literary skill, such as it is, to the production of advertisements, book reviews, and crime reports. He is a utility or workaday writer. In painting, the same type of man, able to use a pencil and brush with some skill without attempting to be a Cézanne or a Picasso, can profitably and pleasantly spend his time in such varied ways as the designing of book jackets, the faking of old masters, and the painting of presentation portraits. In the three-dimensional arts one can distinguish even more clearly between art and craft, and the carpenter who makes a chair can claim to be satisfying a universal demand which is not met by the sculptor. A chair is undoubtedly more comfortable to sit on than all save a few examples of the sculptor’s art. But in music there can be no such thing as a chair opposed to a painting, or the craftsman opposed to the pure artist.

The whole theory of utility music is based on the misconception that one can distinguish between the aesthetic and the useful in this particular medium. Apart from music for organized and non-aesthetic action such as military marches and foxtrots…music is only useful if it is good music, whether light or serious. Unless it provides one with some vital experience which no other art can convey it is not only useless but a nuisance. The objective craftsman that Hindemith sets up as an ideal is far more of a sentimental luxury than the despised aesthetic “tone poet.” His daily covering of music paper is a task as essentially fruitless as those strange tasks assigned to the innocent dupes in the stories of Sherlock Holmes, the man in “The Red-Headed League” who copied out the Encyclopedia Britannica or the stockbroker’s clerk who was set to making a list of the pottery firms in Paris…

With an altogether praiseworthy modesty Hindemith appears to imagine that by ceasing to write for his own satisfaction he is necessarily writing for the satisfaction of others. There is an old and trite saying “If you don’t believe in yourself, nobody else will,” and in music it may with equal truth be said that if a composer is not interested in his own music he can hardly expect others to be. Even the most nauseating of popular tunes, that would appear to be written solely with the desire to satisfy the public taste at its least critical and most mawkish, must mean something to the composer, and be primarily written for his satisfaction, if it is to “get the public.” Purely “occasional” music whether deliberately vulgar or deliberately refined always brings boredom and distrust in its wake. Unless the composer has some definite reason for putting pen to paper, he had far better play patience or do a little gardening.

Constant Lambert, Music Ho!

Written by nevalalee

March 25, 2017 at 7:01 am

The dark side of the moon

with 8 comments

In March 1969, Robert A. Heinlein flew with his wife Ginny to Brazil, where he had been invited to serve as a guest of honor at a film festival in Rio de Janeiro. Another passenger on their plane was the director Roman Polanski, who introduced Heinlein to his wife, the actress Sharon Tate, at a party at the French embassy a few days after their arrival. (Tate had been in Italy filming The Thirteen Chairs, her final movie role before her death, which she had taken largely out of a desire to work with Orson Welles.) On the night of August 8, Tate and four others were murdered in Los Angeles by members of the Manson Family. Two months later, Heinlein received a letter from a woman named “Annette or Nanette or something,” who claimed that police helicopters were chasing her and her friends. Ginny was alarmed by its incoherent tone, and she told her husband to stay out of it: “Honey, this is worse than the crazy fan mail. This is absolutely insane. Don’t have anything to do with it.” Heinlein contented himself with calling the Inyo County Sheriff’s Office, which confirmed that a police action was underway. In fact, it was a joint federal, state, and county raid of the Myers and Barker Ranches, where Charles Manson and his followers had been living, as part of an investigation into an auto theft ring—their connection to the murders had not yet been established. Manson was arrested, along with two dozen others. And the woman who wrote to Heinlein was probably Lynette “Squeaky” Fromme, another member of the Manson Family, who would be sentenced to life in prison six years later for a botched assassination attempt on President Gerald Ford.

On January 8, 1970, the San Francisco Herald-Examiner ran a story on the front page with the headline “Manson’s Blueprint? Claim Tate Suspect Used Science Fiction Plot.” Later that month, Time published an article, “A Martian Model,” that began:

In the psychotic mind, fact and fantasy mingle freely. The line between the real and the imagined easily blurs or disappears. Most madmen invent their own worlds. If the charges against Charles Manson, accused along with five members of his self-styled “family” of killing Sharon Tate and six other people, are true, Manson showed no powers of invention at all. In the weeks since his indictment, those connected with the case have discovered that he may have murdered by the book. The book is Robert A. Heinlein’s Stranger in a Strange Land, an imaginative science-fiction novel long popular among hippies…

Not surprisingly, the Heinleins were outraged by the implication, although Robert himself was in no condition to respond—he was hospitalized with a bad case of peritonitis. In any event, the parallels between the career of Charles Manson and Heinlein’s fictional character Valentine Michael Smith were tenuous at best, and the angle was investigated by the prosecutor Vincent Bugliosi, who dismissed it. A decade later, in a letter to the science fiction writer and Heinlein fan J. Neil Schulman, Manson stated, through another prisoner, that he had never read the book. Yet the novel was undeniably familiar to members of his circle, as it was throughout the countercultural community of the late sixties. The fact that Fromme wrote to Heinlein is revealing in itself, and Manson’s son, who was born on April 15, 1968, was named Valentine Michael by his mother.

Years earlier, Manson had been exposed—to a far more significant extent—to the work of another science fiction author. In Helter Skelter, his account of the case, Bugliosi writes of Manson’s arrival at McNeil Island Federal Penitentiary in 1961:

Manson gave as his claimed religion “Scientologist,” stating that he “has never settled upon a religious formula for his beliefs and is presently seeking an answer to his question in the new mental health cult known as Scientology”…Manson’s teacher, i.e. “auditor” was another convict, Lanier Rayner. Manson would later claim that while in prison he achieved Scientology’s highest level, “theta clear.”

In his own memoir, Manson writes: “A cell partner turned me on to Scientology. With him and another guy I got pretty heavy into dianetics and Scientology…There were times when I would try to sell [fellow inmate Alvin Karpis] on the things I was learning through Scientology.” In total, Manson appears to have received about one hundred and fifty hours of auditing, and his yearly progress report noted: “He appears to have developed a certain amount of insight into his problems through his study of this discipline.” The following year, another report stated: “In his effort to ‘find’ himself, Manson peruses different religious philosophies, e.g. Scientology and Buddhism; however, he never remains long enough with any given teachings to reap material benefits.” In 1968, Manson visited a branch of the Church of Scientology in Los Angeles, where he asked the receptionist: “What do you do after ‘clear?’” But Bugliosi’s summary of the matter seems accurate enough:

Although Manson remained interested in Scientology much longer than he did in any other subject except music, it appears that…he stuck with it only as long as his enthusiasm lasted, then dropped it, extracting and retaining a number of terms and phrases (“auditing,” “cease to exist,” “coming to Now”) and some concepts (karma, reincarnation, etc.) which, perhaps fittingly, Scientology had borrowed in the first place.

So what should we make of all this? I think that there are a few relevant points here. The first is that Heinlein and Hubbard’s influence on Manson—or any of his followers, including Fromme, who had been audited as well—appears to have been marginal, and only in the sense that you could say that he was “influenced” by the Beatles. Manson was a scavenger who assembled his notions out of scraps gleaned from whatever materials were currently in vogue, and science fiction had saturated the culture to an extent that it would have been hard to avoid it entirely, particularly for someone who was actively searching for such ideas. On some level, it’s a testament to the cultural position that both Hubbard and Heinlein had attained, although it also cuts deeper than this. Manson represented the psychopathic fringe of an impulse for which science fiction and its offshoots provided a convenient vocabulary. It was an urge for personal transformation in the face of what felt like apocalyptic social change, rooted in the ideals that Campbell and his authors had defined, and which underwent several mutations in the decades since its earliest incarnation. (And it would mutate yet again. The Aum Shinrikyo cult, which was responsible for the sarin gas attacks in the Japanese subway system in 1995, borrowed elements of Asimov’s Foundation trilogy for its vision of a society of the elect that would survive the coming collapse of civilization.) It’s an aspect of the genre that takes light and dark forms, and it sometimes displays both faces simultaneously, which can lead to resistance from both sides. The Manson Family murders began with the killing of a man named Gary Hinman, who was taken hostage on July 25, 1969, a day in which the newspapers were filled with accounts of the successful splashdown of Apollo 11. The week before, at the ranch where Manson’s followers were living, a woman had remarked: “There’s somebody on the moon today.” And another replied: “They’re faking it.”

Written by nevalalee

March 24, 2017 at 10:09 am

Quote of the Day

leave a comment »

Written by nevalalee

March 24, 2017 at 7:30 am


The Theater of Apollo

with 2 comments

In 1972, the physiologist Albert Szent-Györgyi, who won a Nobel Prize for his work on Vitamin C and the citric acid cycle, wrote a famous letter to the journal Science. He noted that scientists, like most creative types, can be roughly divided into two categories, variously known as the classical and the romantic, the systematic and the intuitive, or, as the physicist John R. Platt proposed, the Apollonian and the Dionysian. “In science,” Szent-Györgyi wrote, “the Apollonian tends to develop established lines to perfection, while the Dionysian rather relies on intuition and is more likely to open new, unexpected alleys for research.” After defining intuition as “a sort of subconscious reasoning, only the end result of which becomes conscious,” he continued:

These are not merely academic problems. They have most important corollaries and consequences. The future of mankind depends on the progress of science, and the progress of science depends on the support it can find. Support mostly takes the form of grants, and the present methods of distributing grants unduly favor the Apollonian. Applying for a grant begins with writing a project. The Apollonian clearly sees the future line of his research and has no difficulty writing a clear project. Not so the Dionysian, who knows only the direction in which he wants to go out into the unknown; he has no idea what he is going to find there and how he is going to find it. Defining the unknown or writing down the subconscious is a contradiction in absurdum. In his work, the Dionysian relies, to a great extent, on accidental observation…The Dionysian is often not only unable to tell what he is going to find, he may even be at a loss to tell how he made his discovery.

Szent-Györgyi, who clearly identified as a Dionysian, went on to state that writing grant proposals was always an “agony” for him, and that while he always tried to live up to Leo Szilard’s commandment “Do not lie without need,” he often had no alternative: “I filled up pages with words and plans I knew I would not follow. When I go home from my laboratory in the late afternoon, I often do not know what I am going to do the next day. I expect to think that up during the night. How could I tell, then, what I would do a year hence?” He added that while his “fake projects” were always accepted, his attempts to write down honestly what he thought he would do were invariably rejected:

This seems quite logical to me; sitting in an easy chair I can cook up any time a project which must seem quite attractive, clear, and logical. But if I go out into nature, into the unknown, to the fringes of knowledge, everything seems mixed up and contradictory, illogical, and incoherent. This is what research does; it smooths out contradiction and makes things simple, logical, and coherent. So when I bring reality into my projects, they become hazy and are rejected. The reviewer, feeling responsible for “the taxpayer’s money,” justly hesitates to give money for research, the lines of which are not clear to the applicant himself.

Szent-Györgyi concluded by saying that in his lifetime, he made two important discoveries, both of which “were rejected offhand by the popes of the field,” and that he had no doubt that they both would have been bounced with equal dispatch if he had tried to describe them in a grant application. And he left the problem without any real solution, except the suggestion that proposals for future research should either take into account the scientist’s earlier work or consider “the vouching of an elder researcher” who can attest to the applicant’s ability.

I’ve never had to apply for a grant, and I’d be curious to hear the perspectives of readers of this blog who have. But I’ve written book proposals, which presented me with a milder version of the dilemma that Szent-Györgyi described. (It’s milder, in part, because writers often work on spec, which means that the submission process in commercial publishing isn’t subject to the same pressures that you see in academia.) A proposal is a kind of map or miniature version of the finished work, whether it’s six pages long or seventy, and the author usually prepares it in a relatively short period of time, before the research or writing process has even begun. As a result, it can’t capture the information that the writer has to discover en route, as Ted Kooser puts it. It can only hint at what the author hopes to do or find, which, depending on your point of view, amounts to either a strong inference or a lie. It’s a system set up to reward or accommodate writers whose style lends itself to that kind of presentation, or who have the skills to fake it, and there are undoubtedly gifted people whom it excludes or discourages. Like grant writing, it exists primarily for the convenience of institutions, not individuals, and it creates a parallel world of obstacles that have to be navigated to get to the real challenge of doing interesting work. You could call it a necessary evil, or, if you’re feeling generous, you could argue that it’s a proxy for kinds of talent that can’t be measured directly. If you can handle the artificial, even ritualized strictures of the grant or proposal process, it’s a sign that you can tackle more important problems. Like an audition or a job interview, it takes on aspects of a game, and we’d like to believe that the test it provides will be predictive of good results later on.

It isn’t hard to find the flaws in this argument. (Among other things, until recently, I would have argued that the organizational demands of a successful political campaign serve as a similar audition for holding high office, and we’ve all seen how that turned out.) The greatest danger is the trap presented by all rituals of admission, which is that they ultimately measure nothing but the ability to pass the test. Just as college entrance exams and whiteboard interviews have inspired a cottage industry of books, tutors, and classes designed to coach applicants who can afford to pay for it, grant writing has mutated into grantsmanship, with its own rules, experts, and infrastructure. And the risks, as Szent-Györgyi said more than forty years ago, are very real. It’s a system that rewards researchers who are content, as Peter Medawar once put it, to figure out why thirty-six percent of sea urchin eggs have a tiny little black spot, simply because it’s the kind of project that can get funding. The grant application process may also play a role in the replication crisis in the social sciences, since it encourages applicants to project an unwarranted certainty that can be hard to relinquish when the data isn’t there. Perhaps worst of all, it penalizes whole groups of people, not just our hypothetical Dionysian geniuses, but also women and minorities who can’t always afford to play the game—and Szent-Györgyi’s otherwise reasonable suggestion that weight be granted to “the vouching of an elder researcher” only compounds the problem. If an Apollonian system resulted in a society of Apollos, we might be inclined to forgive it, but that isn’t the case. To the extent that it works, it’s because the division between Apollonian and Dionysian isn’t an absolute one, and most people learn to draw on each side at different times. Those who succeed have to be less like Apollo or Dionysus than, perhaps, like Hermes, the trickster who can change in response to the demands that the situation presents. And as flawed as the current system may be, we’ll have reason to miss it if it disappears.

Written by nevalalee

March 23, 2017 at 9:30 am

Quote of the Day

with one comment

Written by nevalalee

March 23, 2017 at 7:30 am

The Mule and the Beaver

leave a comment »

If you wanted to construct the most prolific writer who ever lived, working from first principles, what features would you include? (We’ll assume, for the purposes of this discussion, that he’s a man.) Obviously, he would need to be capable of turning out clean, publishable prose at a fast pace and with a minimum of revision. He would be contented—even happy—within the physical conditions of writing itself, which requires working indoors at a desk alone for hours on end. Ideally, he would operate within a genre, either fiction or nonfiction, that lent itself to producing pages fairly quickly, but with enough variety to prevent burnout, since he’d need to maintain a steady clip every day for years. His most productive period would coincide with an era that gave him steady demand for his work, and he would have a boundless energy that was diverted early on toward the goal of producing more books. If you were particularly clever, you’d even introduce a psychological wrinkle: the act of writing would become his greatest source of satisfaction, as well as an emotional refuge, so that he would end up taking more pleasure in it than almost anything else in life. Finally, you’d provide him with cooperative publishers and an enthusiastic, although not overwhelming, readership, granting him a livelihood that was comfortable but not quite lavish enough to be distracting. Wind him up, let him run unimpeded for three or four decades, and how many books would you get? In the case of Isaac Asimov, the total comes to something like five hundred. Even if it isn’t quite enough to make him the most productive writer of all time, it certainly places him somewhere in the top ten. And it’s a career that followed all but axiomatically from the characteristics that I’ve listed above.

Let’s take these points one at a time. Asimov, like all successful pulp writers, learned how to crank out decent work on deadline, usually limiting himself to a first draft and a clean copy, with very little revision that wasn’t to editorial order. (And he wasn’t alone here. The pulps were an unforgiving school, and they quickly culled authors who weren’t able to write a sentence well enough the first time.) From a young age, Asimov was also drawn to enclosed, windowless spaces, like the kitchen at the back of his father’s candy store, and he had a persistent daydream about running a newsstand in the subway, where he could put up the shutter and read magazines in peace. After he began to write for a living, he was equally content to work in his attic office for up to ten hours a day. Yet it wasn’t fiction that accounted for the bulk of his output—which is a common misconception about his career—but a specific kind of nonfiction. Asimov was a prolific fiction writer, but no more so than many of his contemporaries. It was in nonfiction for general readers that he really shone, initially with such scientific popularizations as The Chemicals of Life and Inside the Atom. At first, his work drew on his academic and professional background in chemistry and biochemistry, but before long, he found that he was equally adept at explaining concepts from the other sciences, as well as such unrelated fields as history and literature. His usual method was to work straight from reference books, dictionaries, and encyclopedias, translating and organizing their concepts for a lay audience. As he once joked to Martin Gardner: “You mean you’re in the same racket I am? You just read books by the professors and rewrite them?”

This kind of writing is harder than it sounds. Asimov noted, correctly, that he added considerable value in arranging and presenting the material, and he was better at it than just about anyone else. (A faculty member at Boston University once observed him at work and exclaimed: “Why, you’re just copying the dictionary!” Asimov, annoyed, handed the dictionary to him and said: “Here. The dictionary is yours. Now go write the book.”) But it also lent itself admirably to turning out a lot of pages in a short period of time. Unlike fiction, it didn’t require him to come up with original ideas from scratch. As soon as he had enough projects in the hopper, he could switch between them freely to avoid becoming bored by any one subject. He could write treatments of the same topic for different audiences and cannibalize unsold material for other venues. In the years after Sputnik, there was plenty of demand for what he had to offer, and he had a ready market for short articles that could be collected into books. And since these were popular treatments of existing information, he could do all of the work from the comfort of his own office. Asimov hated to fly, and he actively avoided assignments that would require him to travel or do research away from home. Before long, his productivity became a selling point in itself, and when his wife told him that life was passing him by, Asimov responded: “If I do manage to publish a hundred books, and if I then die, my last words are likely to be, ‘Only a hundred!’” Writing became a place of security, a refuge both from life’s small crises and from an unhappy marriage, and it was also his greatest source of pleasure. When his daughter asked him what he would do if he had to choose between her and writing, Asimov said: “Why, I would choose you, dear.” He added: “But I hesitated—and she noticed that, too.”

Asimov was a complicated man—certainly more so than in the version of himself that he presented to the public—and he can’t be reduced to a neat set of factors. He wasn’t a robot. But those five hundred books represent an achievement so overwhelming that it cries out for explanation, and it wouldn’t exist if certain variables, both external and internal, hadn’t happened to align. In terms of his ability and ambition, Asimov was the equal of Campbell, Heinlein, or Hubbard, but in place of their public entanglements, he channeled his talents in a safer direction, where they grew to gargantuan proportions that only hint at how monstrous that energy and passion really were. (He was also considerably younger than the others, as well as more naturally cautious, and I’d like to believe that he drew a negative lesson from their example.) The result, remarkably, made him the most beloved writer of them all. It was a cultural position, outside the world of science fiction, that was due almost entirely to the body of his nonfiction work. He never had a bestseller until late in his career, but the volume and quality of his overall output were enough to make him famous. Asimov was the Mule, the unassuming superman of the Foundation series, but he conquered a world from his typewriter. He won the game. And when I think of how his talent, productivity, and love of enclosed spaces combined to produce a fortress made of books, I think of what David Mamet once said to The Paris Review. When asked to explain why he wrote, Mamet replied: “I’ve got to do it anyway. Like beavers, you know. They chop, they eat wood, because if they don’t, their teeth grow too long and they die. And they hate the sound of running water. Drives them crazy. So, if you put those two ideas together, they are going to build dams.”

Written by nevalalee

March 22, 2017 at 9:54 am

Quote of the Day

leave a comment »

If you had asked [Tennyson], at the end of the day, to describe the prosody of the poem to you, he would no doubt have had to think for a moment before he could answer you, not because he was ignorant of the terms, but because he had been writing a poem, not a metrical exercise. At every point, he was exerting his free will. And the outcome of that exertion was the form.

James Fenton, An Introduction to English Poetry

Written by nevalalee

March 22, 2017 at 7:30 am

Posted in Quote of the Day, Writing

Our fearful symmetry

with one comment

Yesterday, I spent an hour fixing my garage door, which got stuck halfway up and refused to budge. I went about it in the way I usually approach such household tasks: I took a flashlight and a pair of vise-grips and stared at it for a while. In this case, for once, it worked, even if I’m only postponing the inevitable service call. But it wouldn’t have occurred to me to tackle it myself in the first place—when I probably wouldn’t have tried to fix, say, my own car by trial and error—if it hadn’t been for two factors. The first is that the workings were all pretty visible. On each side, there’s basically just a torsion spring, a steel cable, and two pulleys, all of it in plain sight. The second point, which was even more crucial, is that a garage door is symmetrical, and only one side was giving me trouble. Whenever I wasn’t sure how the result should look, I just had to look at the other half and mentally reflect it into its mirror image. It reminded me of how useful symmetry can be in addressing many problems, as George Pólya notes in How to Solve It:

Symmetry, in a general sense, is important for our subject. If a problem is symmetric in some ways we may derive some profit from noticing its interchangeable parts and it often pays to treat those parts which play the same role in the same fashion…Symmetry may also be useful in checking results.

And if I hadn’t been able to check my work along the way using the other side of the door, I doubt I would have attempted to fix it at all.

But it also points at a subtle bias in the way we pick our problems. There’s no question that symmetry plays an important role in the world around us, and it provides a solid foundation for the notion that we can use beauty or elegance as an investigative tool. “It seems that if one is working from the point of view of getting beauty in one’s equations, and if one has a really sound insight, one is on a sure line of progress,” Paul Dirac famously said, and Murray Gell-Mann gave the best explanation I’ve ever found of why this might be true:

There’s a quotation from Newton, I don’t remember the exact words but lots of other physicists have made the same remark since—that nature seems to have a remarkable property of self-similarity. The laws—the fundamental laws—at different levels seem to resemble one another. And that’s probably what accounts for the possibility of using elegance as a criterion [in science]. We develop a mathematical formula, say, for describing something at a particular level, and then we go to a deeper level and find that in terms of mathematics, the equations at the deeper level are beautifully equivalent. Which means that we’ve found an appropriate formula.

Gell-Mann concludes: “And that takes the human being, human judgment, out of it a little. You might object that after all we are the ones who say what elegance is. But I don’t think that’s the point.”

He’s right, of course, and there are plenty of fields in which symmetry and self-similarity are valuable criteria. Yet there’s also a sense in which we’re drawn to problems in which such structures appear, while neglecting those that aren’t as amenable to symmetrical thinking. Just as I was willing to take apart my garage door when I wouldn’t have done the same with my car—which, after all, has a perfectly logical design—it’s natural for us to prefer problems that are obviously symmetrical or that hold out the promise of elegance, much as we’re attracted to the same qualities in the human face. But there are plenty of important questions that aren’t elegant at all. I’m reminded of what Max Perutz, who described the structure of hemoglobin, said about the work of his more famous colleague James Watson:

I sometimes envied Jim. My own problem took thousands of hours of hard work, measurements, calculations. I often thought that there must be some way to cut through it—that there must be, if only I could see it, an elegant solution. There wasn’t any. For Jim’s there was an elegant solution, which is what I admired. He found it partly because he never made the mistake of confusing hard work with hard thinking; he always refused to substitute one for the other.

In The Eighth Day of Creation, Horace Freeland Judson calls this “the most exact yet generous compliment I have ever heard from one scientist to another.” But there’s also a wistful acknowledgement of the luck of the draw. Both Perutz and Watson were working on problems of enormous importance, but only one of them had an elegant solution, and there was no way of knowing in advance which one it would be.

Given the choice, I suspect that most of us would prefer to work on problems that exhibit some degree of symmetry: they’re elegant, intuitive, and satisfying. In the absence of that kind of order, we’re left with what Perutz calls “thousands of hours of hard work, measurements, calculations,” and it isn’t pretty. (As Donald Knuth says in a somewhat different context: “Without any underlying symmetry properties, the job of proving interesting results becomes extremely unpleasant.”) When we extrapolate this preference to the culture as a whole, it leads to two troubling tendencies. One is to prioritize problems that lend themselves to this sort of attack, while overlooking whole fields of messier, asymmetrical phenomena that resist elegant analysis—to the point where we might even deny that they’re worth studying at all. The other is to invent a symmetry that isn’t there. You can see both impulses at work in the social sciences, which tend to deal with problems that can’t be reduced to a series of equations, and they’re particularly insidious in economics, which is uniquely vulnerable to elegant models that confirm what existing interest groups want to hear. From there, it’s only a small step to more frightening forms of fake symmetry, as Borges writes in “Tlön, Uqbar, Orbis Tertius”: “Any symmetry with a semblance of order—dialectical materialism, anti-Semitism, Nazism—was sufficient to entrance the minds of men.” And the first habit has a way of leading to the second. The more we seek out problems with symmetry while passing over those that lack it, the more likely we become to attribute false symmetries to the world around us. Symmetry, by definition, is a beautiful thing. But it can also turn us into suckers for a pretty face.

Quote of the Day

leave a comment »

Any attempt to define literary theory in terms of a distinctive method is doomed to failure…Just think of how many methods are involved in literary criticism. You can discuss the poet’s asthmatic childhood, or examine her peculiar use of syntax; you can detect the rustling of silk in the hissing of the s‘s, explore the phenomenology of reading, relate the literary work to the state of the class struggle, or find out how many copies it sold. These methods have nothing whatsoever of significance in common…However generously liberal-minded we aim to be, trying to combine structuralism, phenomenology, and psychoanalysis is more likely to lead to a nervous breakdown than to a brilliant literary career.

Terry Eagleton, Literary Theory: An Introduction

Written by nevalalee

March 21, 2017 at 7:30 am
