Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Nautilus’

The memory of persistence


In Origins of Genius, which is one of my favorite books on creativity, the psychologist Dean Simonton makes an argument that I’ve tried to bear in mind ever since I first read it. While discussing the problem of creative productivity, Simonton states emphatically: “If the number of influential works is directly proportional to the total number of works produced, then the creators with the most masterpieces will be those with the most ignored and neglected products! Even the most supreme creative genius must have their careers punctuated with wasted efforts.” After quoting W.H. Auden, who observes that a major poet will tend to write more bad poems than a minor one, he continues:

If the creative genius is generating failures as well as successes, this seems to support the assumption that the creative process is to a certain extent blind. Even the greatest creators possess no direct and secure path to truth or beauty. They cannot guarantee that every published idea will survive further evaluation and testing at the hands of audiences or colleagues. The best the creative genius can do is to be as prolific as possible in generating products in the hope that at least some subset will survive the test of time.

This still ranks as one of the most significant insights into the creative process that I’ve ever seen, and Simonton sums it up elsewhere, like a true poet, in a form that can be easily remembered: “Quality is a probabilistic function of quantity.”
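To see what that rule implies in practice, here’s a toy simulation of my own devising—not Simonton’s, and the numbers are invented—in which every work has the same small, independent chance of becoming a hit. The most prolific creators end up with the most masterpieces, and also the most failures:

```python
import random

random.seed(42)
HIT_PROBABILITY = 0.05  # invented: every work has the same small chance of being a hit

def career(num_works):
    """Return (hits, misses) for a creator who produces num_works pieces."""
    hits = sum(1 for _ in range(num_works) if random.random() < HIT_PROBABILITY)
    return hits, num_works - hits

for output in (20, 100, 500):
    hits, misses = career(output)
    print(f"{output:>3} works -> {hits:>2} hits, {misses:>3} misses")
```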

Simonton has a new book out this week, The Genius Checklist, with a long excerpt available on Nautilus. In the article, he focuses on the problem of intelligence tests, and in particular on two cases that point to the limitations of defining genius simply as the possession of a high IQ. One revolves around Lewis M. Terman, the creator of the modern intelligence scale, who had the notion of testing thousands of students and tracking the top performers over time. The result was an ongoing study of about 1,500 men and women, known as the “Termites,” some of whom are still alive today. As Simonton notes, the results didn’t exactly support Terman’s implicit assumptions:

None of [the Termites] grew up to become what many people would consider unambiguous exemplars of genius. Their extraordinary intelligence was channeled into somewhat more ordinary endeavors as professors, doctors, lawyers, scientists, engineers, and other professionals…Furthermore, many Termites failed to become highly successful in any intellectual capacity. These comparative failures were far less likely to graduate from college or to attain professional or graduate degrees, and far more likely to enter occupations that required no higher education whatsoever…Whatever their differences, intelligence was not a determining factor in those who made it and those who didn’t.

Terman also tested two future Nobel laureates—Luis Alvarez and William Shockley—who were rejected because they didn’t score highly enough. And Simonton notes that neither James Watson nor Richard Feynman, whose biography is actually called Genius, did well enough on such tests to qualify for Mensa.

Even if you’re a fan of Marilyn vos Savant, this isn’t particularly surprising. But I was even more interested in Simonton’s account of the work of Catharine Cox, Terman’s colleague, who decided to tackle the problem from the opposite direction—by starting with a list of known luminaries in all fields and trying to figure out what their tested IQs would have been, based solely on biographical information. This approach has obvious problems as well, of course, but her conclusion, which appears in her book The Early Mental Traits of Three Hundred Geniuses, seems reasonable enough: “High but not the highest intelligence, combined with the greatest degree of persistence, will achieve greater eminence than the highest degree of intelligence with somewhat less persistence.” And in her discussion of qualities that seem predictive of success, persistence is prominently mentioned:

We may conclude that the following traits and trait elements appearing in childhood and youth are diagnostic of future achievement: an unusual degree of persistence—tendency not to be changeable, tenacity of purpose, and perseverance in the face of obstacles—combined with intellective energy—mental work bestowed on special interests, profoundness of apprehension, and originality of ideas—and the vigorous ambition expressed by the possession to the highest degree of desire to excel.

Cox concludes: “Achievements…are not the accidents of a day. They are the natural outgrowth in individuals of superior general powers of persistent interest and great zeal combined with rare special talents.”

If we really want to identify the geniuses of the future, it seems, we should look for persistence as well as intelligence, and we might even be tempted to develop a test that would gauge a student’s “tenacity of purpose.” The ability to remain focused in the face of failures and setbacks is clearly related to Simonton’s rule about quality and quantity, which implies that a genius, to borrow John Gardner’s definition of the true writer, is someone who doesn’t quit. But there’s an even more important point to be made here. As I noted just the other day, it’s easier to fail repeatedly when you occupy a social position that protects you to some extent from the consequences. It can be hard to be “as prolific as possible in generating products” when even one mistake might end your creative journey forever. And our culture has been far more forgiving of some categories of people than of others. (In discussing Terman’s results, Simonton makes the hard decision to omit women from the group entirely: “We’re talking only of the males here, too. It would be unfair to consider the females who were born at a time in which all women were expected to become homemakers, no matter how bright.” And he might also have cited the cultural pressures that discourage a woman from taking risks that are granted to a man.) When you look at lists of canonical geniuses, like the authors of the great books, they can start to seem maddeningly alike—and if we define privilege in part as the freedom to make repeated mistakes, it’s no wonder. Over time, this also reduces the diversity of the ideas that are available for cultural selection, which can lead to a crisis in itself. The only solution is to increase the range of voices, and it isn’t easy. In the absence of such advantages, even the individuals who beat the odds must have been confronted at every turn by excellent reasons to give up. But nevertheless, they persisted.

Reading the rocks


“[Our] ignorance of planetary history undermines any claims we may make to modernity,” the geologist Marcia Bjornerud writes in her new book Timefulness: How Thinking Like a Geologist Can Help Save the World. In an excerpt that appeared last week on Nautilus, Bjornerud makes a case for geology as a way of seeing that I find poetic and compelling:

Early in an introductory geology course, one begins to understand that rocks are not nouns but verbs—visible evidence of processes: a volcanic eruption, the accretion of a coral reef, the growth of a mountain belt. Everywhere one looks, rocks bear witness to events that unfolded over long stretches of time. Little by little, over more than two centuries, the local stories told by rocks in all parts of the world have been stitched together into a great global tapestry—the geologic timescale. This “map” of Deep Time represents one of the great intellectual achievements of humanity, arduously constructed by stratigraphers, paleontologists, geochemists, and geochronologists from many cultures and faiths. It is still a work in progress to which details are constantly being added and finer and finer calibrations being made.

This is a lovely passage in itself, but I was equally struck by how it resembles the arguments that are often advanced in defense of the great books. One of that movement’s favorite talking points is the notion of “The Great Conversation,” or the idea that canonical books and authors aren’t dead or antiquated, but engaged in a vital dialogue between themselves and the present. And its defenders frequently make their case in terms much like those that Bjornerud employs. In the book The Great Conversation, which serves as the opening volume of Great Books of the Western World, the educator Robert Maynard Hutchins writes: “This set of books is offered in no antiquarian spirit. We have not seen our task as that of taking tourists on a visit to ancient ruins or to the quaint productions of primitive peoples.” And the justifications presented for the two fields are similar as well. As Bjornerud’s subtitle indicates, she suggests that a greater awareness of geologic timescales can serve as a way for us to address the problems of our own era, while Hutchins uses language that has a contemporary ring:

We are as concerned as anybody else at the headlong plunge into the abyss that Western civilization seems to be taking. We believe that the voices that may recall the West to sanity are those which have taken part in the Great Conversation. We want them to be heard again not because we want to go back to antiquity, or the Middle Ages, or the Renaissance, or the Eighteenth Century. We are quite aware that we do not live in any time but the present, and, distressing as the present is, we would not care to live in any other time if we could.

“We want the voices of the Great Conversation to be heard again because we think they may help us to learn to live better now,” Hutchins concludes. Bjornerud strikes much the same note when she speaks on behalf of geology, sounding a dire warning against “temporal illiteracy,” which leads us to ignore our own impact on environmental processes in the present. In both cases, a seemingly static body of knowledge is reimagined as timely and urgent. I’ve spent much of my life in service to this notion, in one way or another, and I badly want to believe it. Yet I sometimes have my doubts. The great books have been central to my thinking for decades, and their proponents tend to praise their role in building cultural and civic awareness, but the truth isn’t quite that simple. As Harold Bloom memorably points out in The Western Canon: “Reading the very best writers—let us say Homer, Dante, Shakespeare, Tolstoy—is not going to make us better citizens.” And a few pages later, he makes a case that strikes me as more convincing than anything that Hutchins says:

The silliest way to defend the Western Canon is to insist that it incarnates all of the seven deadly moral virtues that make up our supposed range of normative values and democratic principles. This is palpably untrue…The West’s greatest writers are subversive of all values, both ours and their own…If we read the Western Canon in order to form our social, political, or personal moral values, I firmly believe we will become monsters of selfishness and exploitation. To read in the service of any ideology is not, in my judgment, to read at all.

And while I’m certainly sympathetic to Bjornerud’s argument, I suspect that the same might hold true if we turn to geology for lessons about time. Good science, like great literature, is morally neutral, and we run into trouble when we ask it to stand for anything but itself. (Bjornerud notes in passing that many geologists are employed by petroleum companies, which doesn’t help her case that access to knowledge about the “deep, rich, grand geologic story” of our planet will lead to a better sense of environmental stewardship.) And this line of argument has a way of highlighting a field’s supposed relevance at the moments when it seems most endangered. The humanities have long fought against the possibility, as Bloom dryly puts it, that “our English and other literature departments [will] shrink to the dimensions of our current Classics departments,” and Bjornerud is equally concerned for geology:

Lowly geology has never achieved the glossy prestige of the other sciences. It has no Nobel Prize, no high school Advanced Placement courses, and a public persona that is musty and dull. This of course rankles geologists, but it also has serious consequences for society…The perceived value of a science profoundly influences the funding it receives.

When a field seems threatened, it’s tempting to make it seem urgently necessary. I’ve done plenty of this sort of thing myself, and I hope that it works. In the end, though, I have a feeling that Bjornerud’s “timefulness” has exactly the same practical value as the virtue that Bloom attributes to books, which is priceless enough: “All that the Western Canon can bring one is the proper use of one’s own solitude, that solitude whose final form is one’s confrontation with one’s own mortality.”

The science fiction sieve


Note: To celebrate the World Science Fiction Convention this week in San Jose, I’m republishing a few of my favorite pieces on various aspects of the genre. This post originally appeared, in a slightly different form, on June 28, 2017.

In a remarkably lucid essay published last year in Nautilus, the mathematician Noson S. Yanofsky elegantly defines the self-imposed limitations of science. Yanofsky points out that scientists deliberately take a subset of phenomena—characterized mostly by how amenable it is to their chosen methods—for their field of study, while leaving the rest to the social sciences or humanities. (As Paul Valéry put it: “Science means simply the aggregate of all the recipes that are always successful. All the rest is literature.”) He visualizes science as a kind of sieve, which lets in some subjects while excluding others:

The reason why we see the structure we do is that scientists act like a sieve and focus only on those phenomena that have structure and are predictable. They do not take into account all phenomena; rather, they select those phenomena they can deal with…Scientists have classified the general textures and heights of different types of clouds, but, in general, are not at all interested in the exact shape of a cloud. Although the shape is a physical phenomenon, scientists don’t even attempt to study it. Science does not study all physical phenomena. Rather, science studies predictable physical phenomena. It is almost a tautology: science predicts predictable phenomena.

Yanofsky groups these criteria under the general heading “symmetry,” and he concludes: “The physicist must be a sieve and study those phenomena that possess symmetry and allow those that do not possess symmetry to slip through her fingers.” I won’t get into the rest of his argument, which draws an ingenious analogy from mathematics, except to say that it’s worth reading in its entirety. But I think his thesis is sound, and it ties into many issues that I’ve discussed here before, particularly about the uncomfortable status of the social sciences.

If you’re trying to catch this process in action, though, the trouble is that the boundaries of science aren’t determined by a general vote, or even by the work of isolated geniuses, but emerge gradually and invisibly from the contributions of countless individuals. But if I were a historian of science, I’d take a close look at the development of science fiction, in which an analogous evolution occurred in plain sight over a relatively short period of time. You can see it clearly in the career of the editor John W. Campbell, who remained skeptical of the social sciences, but whose signal contribution to the genre may have been to put them at its center. And the “sieve” that he ended up using is revealing in itself. A significant turning point was the arrival on his desk of Robert A. Heinlein’s landmark novella “If This Goes On—,” of which Campbell wrote in 1939:

Robert Heinlein, in his “If This Goes On—,” presents a civilization in which mob psychology and propaganda have become sciences. They aren’t, yet…Psychology isn’t a science, so long as a trained psychologist does—and must—say “there’s no telling how an individual man will react to a given stimulus.” Properly developed, psychology could determine that.

As an editor, Campbell began to impose psychological and sociological elements onto stories where they didn’t always fit, much as he would gratuitously insert references to uranium-235 during World War II. He irritated Isaac Asimov, for instance, by asking him to add a section to the story “Homo Sol” about “certain distinctions between the emotional reactions of Africans and Asians as compared with those of Americans and Europeans.” Asimov saw this as an early sign of Campbell’s racial views, and perhaps it was, but it pointed just as convincingly to his interest in mass psychology.

And readers took notice at a surprisingly early stage. In the November 1940 issue of Astounding, a fan named Lynn Bridges presciently wrote:

The Astounding Science Fiction of the past year has brought forth a new type of story, best described, perhaps, as “sociological” science fiction. The spaceships…are still present, but more emphasis has been placed on the one item which will have more to do with shaping the future than anything else, that strange race of bipeds known as man…Both Asimov [in “Homo Sol”] and Heinlein [in “If This Goes On—”] treat psychology as an exact science, usable in formulas, certain in results. I feel called upon to protest. Its very nature prevents psychology from achieving the exactness of mathematics…The moment men stop varying and the psychologist can say definitely that all men are alike psychologically, progress stops and the world becomes a very boring Utopia.

Campbell responded: “Psychology could improve a lot, though, without becoming dangerously oppressive!” Just two months later, in a letter in the January 1941 issue, Asimov referred to the prospect of “mathematical psychology”: “If we can understand Einstein and Hitler down to the mathematical whys and wherefores, we might try to boost along a few Einsteins and cut down on a few Hitlers, and progress might really get going.” Campbell replied much as before: “Psychology isn’t an exact science—but it can be.” Implicit in the whole discussion was the question of whether psychology could be tackled using the same hard-headed engineering approach that had worked for the genre before. And as I’ve written elsewhere, the evolution of Campbellian science fiction is largely one of writers who were so good at lecturing us about engineering that we barely even noticed when they moved on to sociology.

But what interests me now is the form it took in Astounding, which looks a lot like the sieve that Yanofsky describes. Campbell may have hoped that psychology would learn how to predict “how an individual man will react to a given stimulus,” but he seems to have sensed that this wouldn’t be credible or interesting in fiction. Instead, he turned to two subsets of psychology that were more suited to the narrative tools at his disposal. One was the treatment of simplified forms of human personality—say, for instance, in a robot. The other was the treatment of large masses of individuals. Crucially, neither was necessarily more possible than predicting the behavior of individuals, but they had the advantage that they could be more plausibly treated in fiction. Campbell’s preferred instrument at the time was Asimov, who was reliable, willing to take instruction, and geographically close enough to talk over ideas in person. As a result, Asimov’s most famous stories can be read as a series of experiments to see how the social sciences could be legitimately explored by the genre. The Three Laws of Robotics, which Campbell was the first to explicitly formulate, are really a simplified model of human behavior: Campbell later wrote that they were essentially “the basic desires of a small child, with the exception that the motivation of desire for love has been properly omitted.” At the other end of the spectrum, psychohistory looks for laws that can be applied on a mass scale, and it’s central not only to the Foundation series but even to “Nightfall,” with its theme of the cyclical rise and fall of civilizations. In science, you could draw a parallel to artificial intelligence and macroeconomics, which represent two extremes at which qualities of symmetry and predictability seem to enter the realm of psychology. In between, there’s a vast terrain of human experience that Campbell was never quite able to tackle, and that impulse ended up being channeled into dianetics. But much as science can be defined as everything that makes it through the sieve of symmetry, Campbell had a sieve of his own, and the result was the science fiction of the golden age.

Written by nevalalee

August 15, 2018 at 9:00 am

The magic window


Last week, the magazine Nautilus published a conversation on “the science and art of time” between the composer Philip Glass and the painter Fredericka Foster. The entire article is worth a look, but my favorite detail is one that Glass shares at the very beginning:

There are many strange things about music and time. When I’m on a tour with the dance company we work in a different-sized theater every night. The first thing the dance company does when we arrive is to measure the stage. They have to reset the dance to fit that stage. So you also have to reset the time of the music: in a larger theater, you must play slower. In a smaller theater, you have to play faster. The relation of time and space in music is dynamic. I have a range of speed in mind. If the players don’t pay attention to that, it will look really funny. You can see the stage fill up with dancers because they are playing at the wrong speed.

And a few lines afterward, in a more contemplative mood, Glass continues: “I was reflecting on the universe expanding. We know that it is and can measure it, by the way time is operating, or by the way we see a star exploding far away. For various reasons, when a physicist tells me that the universe is expanding, I say ‘Okay, let’s go back to the dance floor.’ The dance floor is getting bigger, what does that mean? It means that time has to slow down.”

The relationship between the pacing of a work of art and the physical space in which it occurs is an intriguing one, and it reminds me of a trick employed by one of my heroes, the film editor Walter Murch. In his excellent book Behind the Seen, Charles Koppelman describes the “little people,” a pair of tiny paper silhouettes—one male, one female—that Murch attaches to the screening monitor in his editing room. Koppelman explains:

They are his way of dealing with the problem of scale…As an editor, Murch must remember that images in the edit room are only 1/240 the square footage of what the audience will eventually see on a thirty-foot-wide screen…It’s still easy to forget the size of a projected film, which can trick an editor into pacing a film too quickly, or using too many close-ups—styles more akin to television. The eye rapidly apprehends the relatively small, low-detail images on a TV. Large-scale faces help hold the attention of the audience sitting in a living room with lots of distractions or ambient light. But in movies, images are larger than life and more detailed, so the opposite is true. The eye needs time to peruse the movie screen and take it all in…The solution for Murch is to have these two human cutouts stand sentry on his monitor, reminding him of the film’s eventual huge proportions.

And Murch writes in his book In the Blink of an Eye: “Why don’t we just edit in large rooms with big screens? Well, with digital editing and video projection, we could, very easily, be editing with a thirty-foot screen. The real estate for the room would be expensive, however.”
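As a quick sanity check on that 1/240 figure—my own back-of-the-envelope arithmetic, not Murch’s or Koppelman’s—a thirty-foot-wide screen and an editing monitor with the same aspect ratio imply a monitor a little under two feet across:

```python
import math

SCREEN_WIDTH_FT = 30.0  # thirty-foot-wide theatrical screen, per Koppelman
AREA_RATIO = 240.0      # edit-room image is said to be 1/240 the square footage

# With the same aspect ratio, area scales as the square of the width,
# so the implied monitor width is the screen width divided by sqrt(240).
monitor_width_ft = SCREEN_WIDTH_FT / math.sqrt(AREA_RATIO)
print(f"{monitor_width_ft:.2f} ft, or about {monitor_width_ft * 12:.0f} inches wide")
# prints roughly 1.94 ft, or about 23 inches
```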

And while the problems presented by a live performance and a projected image on film might seem rather different, the underlying issue, in both cases, is the audience’s ability to receive and process information. On a purely practical level, a big stage may require the tempo of the choreography to subtly change, because the dancers are moving in a larger physical space, and the music has to be adjusted accordingly. But the viewer’s relationship to the work is also affected—the eye is more likely to take in the action in pieces, rather than as a whole, and the pacing may need to be modified. A similar phenomenon occurs in the movies, as Murch writes:

I have heard directors say that they were disappointed when they finally saw their digitally edited films projected on a big screen. They felt that the editing now seemed “choppy,” though it had seemed fine on the television monitor…With a small screen, your eye can easily take in everything at once, whereas on a big screen it can only take in sections at a time. You tend to look at a small screen, but into a big screen. If you are looking at an image, taking it all in at once, your tendency will be to cut away to the next shot sooner. With a theatrical film, particularly one in which the audience is fully engaged, the screen is not a surface, it is a magic window, sort of a looking glass through which your whole body passes and becomes engaged in the action with the characters on the screen.

Murch notes that the lack of detail on a small screen—or a compressed video file—can mislead the editor as well: “There may be so little detail that the eye can absorb all of it very quickly, leading the careless editor to cut sooner than if he had been looking at the fully detailed film image…Image detail and pace are intimately related.”

And the risk of editing on a smaller screen isn’t anything new. Over thirty years ago, the director and editor Edward Dmytryk wrote in On Film Editing:

Many editors shape their editing concepts on the Moviola, a technique I consider decidedly inferior. One does not see the same things on a small Moviola screen, or even on the somewhat larger, though fuzzier, flatbed screen, that one sees in a theater. The audience sees its films only on the “big screen,” and since every cut should be made with the audience in mind, the cutter must try to see each bit of film as the viewer in the theater will eventually see it. (Even a moderate-sized television screen offers far more scope than a Moviola; therefore, it too presents a somewhat different “picture” for the viewer’s inspection.)

Today, of course, viewers can experience stories on a range of screen sizes that Dmytryk might never have anticipated, and which no editor can possibly control. And it’s unclear how editors—who, unlike Philip Glass, don’t have the luxury of measuring the space in which the film will unfold—are supposed to deal with this problem. Taken as a whole, it seems likely that the trend of editorial pacing reflects the smallest screen on which the results can be viewed, which is part of the reason why the average number of cuts per minute has steadily increased for years. And it’s not unreasonable for editors to prioritize the format in which movies will be seen for most of their lifetimes. Yet we also give up something when we no longer consider the largest possible stage. After the editor Anne V. Coates passed away last month, many obituaries paid tribute to the moment in Lawrence of Arabia that has justifiably been called the greatest cut in movie history. But it wouldn’t have nearly the same impact if it weren’t for the fact that the next shot is held for an astonishing thirty-five seconds, which might never have occurred to someone who was cutting it for a smaller screen. Even viewed on YouTube, it’s unforgettable. But in a theater, it’s a magic window.

The last questions


For two decades, the writer and literary agent John Brockman has posed a single question on an annual basis to a group of scientists and other intellectuals. The notion of such a question—which changes every year—was inspired by the work of the late artist and philosopher James Lee Byars, whose declaration of intent serves as a motto for the entire project: “To arrive at the edge of the world’s knowledge, seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves.” Brockman publishes the responses on his website, and the result resonates so strongly with just about everything that I love that I’m embarrassed to say that I hadn’t heard of it until this week. (I owe my discovery of it to an article by Brian Gallagher in the excellent magazine Nautilus.) It’s an attempt to take the pulse of what Brockman calls “the third culture, [which] consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are.” Questions from recent years include “What is your favorite deep, elegant or beautiful explanation?” and “What scientific concept would improve everyone’s cognitive toolkit?” And the result is manifestly so useful, interesting, and rich that I’m almost afraid to read too much of it at once.

This year, to commemorate the twentieth anniversary of the project, Brockman issued a somewhat different challenge, asking his usual group of correspondents: “What is the last question?” By way of explanation, he quotes an essay that he originally wrote in the late sixties, when he first became preoccupied with the idea of asking questions at all:

The final elegance: assuming, asking the question. No answers. No explanations. Why do you demand explanations? If they are given, you will once more be facing a terminus. They cannot get you any further than you are at present…Our kind of innovation consists not in the answers, but in the true novelty of the questions themselves; in the statement of problems, not in their solutions. What is important is not to illustrate a truth—or even an interrogation—known in advance, but to bring to the world certain interrogations…A total synthesis of all human knowledge will not result in huge libraries filled with books, in fantastic amounts of data stored on servers. There’s no value any more in amount, in quantity, in explanation. For a total synthesis of human knowledge, use the interrogative.

Brockman strongly implies that this year’s question will be the last. (To which I can only respond with a lyric from The Simpsons: “To close this place now would be twisted / We just learned this place existed.”) And he closes by presenting the final question: “Ask ‘The Last Question,’ your last question, the question for which you will be remembered.”

I’ve just spent half an hour going through the responses, which are about as fascinating as you’d expect. As I read the questions, I felt that some of them could change lives, if they were encountered at just the right time. (If you know a bright teenager, you could do worse than to send the list his or her way. After all, you just never know.) And they’re a mine of potential ideas for science fiction writers. Here are a few of my favorites:

Jimena Canales: “When will we accept that the most accurate clocks will have to advance regularly sometimes, irregularly most of the time, and at times run counterclockwise?”
Bart Kosko: “What is the bumpiest and highest-dimensional cost surface that our best computers will be able to search and still find the deepest cost well?”
Julia Clarke: “What would comprise the most precise and complete sonic representation of the history of life?”
Stuart Firestein: “How many incommensurable ideas can we hold in our mind simultaneously?”
George Dyson: “Why are there no trees in the ocean?”
Andrew Barron: “What would a diagram that gave a complete understanding of imagination need to be?”

Not all are equally interesting, and some of the respondents were evidently daunted by the challenge. A few of the submissions feel like an answer—or an opinion—with a question mark stuck awkwardly on the end. As Gallagher notes in Nautilus: “The question ended up prompting many of the academics among the responders to just restate one of their research targets, albeit succinctly.” The computer scientist Scott Aaronson wrote on his blog:

I tried to devise a single question that gestured toward the P vs. NP problem, and the ultimate physical limits of computation, and the prospects for superintelligent AI, and the enormity of what could be Platonically lying in wait for us within finite but exponentially large search spaces, and the eternal nerd’s conundrum, of the ability to get the right answers to clearly-stated questions being so ineffectual in the actual world. I’m not thrilled with the result, but reading through the other questions makes it clear just how challenging it is to ask something that doesn’t boil down to: “When will the rest of the world recognize the importance of my research topic?”

But it’s impossible to read it without wondering what your own question would be. (None of the participants went with what many science fiction fans know is the real last question: “How can the net amount of entropy of the universe be massively decreased?” But maybe they knew that there’s insufficient data for a meaningful answer.) I don’t know what mine is yet, but this one from Jonathan Gottschall comes fairly close, and it can serve as a placeholder for now: “Are stories bad for us?”

Written by nevalalee

February 8, 2018 at 8:43 am


Quote of the Day



One [common feature of beauty] is what I call exuberance or productivity, where you get out more than you put in. You find some equation or law by putting together clues and making a guess, and then you can explain seven other things and you know you’re on the right track. You get out more than you put in.

Frank Wilczek, to Nautilus

Written by nevalalee

January 21, 2016 at 7:30 am

Posted in Quote of the Day


The book of numbers



The recent Nautilus article by Siobhan Roberts about the mathematician Neil Sloane, titled “How to Build a Search Engine for Mathematics,” is the most interesting thing I’ve read online in months. I stumbled across it around six this morning, at a point when I was thinking about little more than my first cup of coffee, and when I was done, I felt energized, awake, and excited about the future. At first glance, its subject might not seem especially promising: Sloane’s baby, The On-Line Encyclopedia of Integer Sequences, sounds about as engaging as the classic bestseller A Million Random Digits with 100,000 Normal Deviates. But the more you think about Sloane and his life’s work, the more it starts to seem like what the Internet was meant to do all along. It’s a machine for generating connections between disciplines, a shortcut that turns good hunches into something more, and a means of quickly surveying an otherwise unnavigable universe of information. In short, it does for numbers, or anything that can be expressed as a sequence of integers, what Google Books theoretically should do for words. The result is a research tool that led Rutgers University professor Doron Zeilberger to call Sloane “the world’s most influential mathematician,” although, if anything, this understates the possible scope of his accomplishments. And even if you’re already familiar with OEIS, the article is well worth reading anyway, if only for how beautifully Roberts lays out its implications.

The appeal of Sloane’s encyclopedia can best be understood by going back to its origins, when its creator was a graduate student at Cornell. While writing his doctoral dissertation on a problem in artificial intelligence, he calculated an integer sequence—0, 1, 8, 78, 944, and so on—that described the firing of neurons in a neural network. As Roberts writes:

The sequence looked promising, though Sloane couldn’t figure out the pattern or formula that would give him the next and all further terms, and by extension the sequence’s rate of growth. He searched out the sequence at the library to see if it was published in a math book on combinatorics or the like, and found nothing. Along the way, however, he came upon other sequences of interest, and stashed them away for further investigation. He eventually computed the formula using a tool from 1937, Pólya’s enumeration theorem.

But the roundabout process had been frustrating. The task should not have been so difficult. He should have been able to simply look up his sequence in a comprehensive reference guide for all extant integer sequences. Since no such thing existed, he decided to build it himself. “I started collecting sequences,” he said. “I went through all the books in the Cornell library…And articles and journals and any other source I could find.”


Reading this, I was inevitably reminded of the experience of writing my own senior thesis, in the days before universal book search was available, and the kind of random scavenging through the stacks that was required back then to track down references and make connections. Sloane’s impulse to collect such sequences initially took the form of a set of punchcards, followed years later by A Handbook of Integer Sequences, published in 1973 while he was working at Bell Labs. Finally, about twenty years ago, he put it online. Before long, the database began to prove its value, as when it revealed that a sequence related to the problem of placing cell towers matched one from an unrelated subject in number theory. It’s the closest thing we have to a search engine for math, as long as you can express whatever you’re doing in terms of a sequence of numbers:

Ultimately, it all comes back to counting things, and counting is a universally handy tool. Which in turn makes the encyclopedia handy, too. “Suppose you are working on a problem in one domain, say, electronics, and while solving a problem you encounter a sequence of integers,” said Manish Gupta, a coding theorist by training who runs a lab at the Dhirubhai Ambani Institute of Information and Communication Technology. “Now you can use the encyclopedia and search if this is well known. Many times it happens that this sequence may have appeared in a totally unrelated area with another problem. Since numbers are the computational output of nature, to me, these connections are quite natural.”
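To make that concrete, here’s a minimal sketch of what such a lookup might look like in code, using the sequence from Sloane’s own dissertation; I’m assuming the OEIS’s public JSON search endpoint and its response fields here, so treat the details as illustrative rather than official:

```python
import json
import urllib.request

# The sequence from Sloane's dissertation (it became A000435 in the encyclopedia).
terms = [0, 1, 8, 78, 944]

# Assumed endpoint: the OEIS search page with JSON output.
url = "https://oeis.org/search?q=" + ",".join(map(str, terms)) + "&fmt=json"

with urllib.request.urlopen(url) as response:
    payload = json.load(response)

# Older responses wrap matches in a "results" field; newer ones may return a bare list.
results = payload if isinstance(payload, list) else payload.get("results") or []
for entry in results[:3]:
    print(f"A{entry['number']:06d}: {entry['name']}")
```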

As Roberts concludes: “The encyclopedia’s impact on scientific research broadly speaking can be measured by its citations in journals, which currently Sloane has tallied to more than 4,500, ranging through biology, botany, zoology, chemistry, thermodynamics, optics, quantum physics, astrophysics, geology, cybernetics, engineering, epidemiology, and anthropology. It is a numerical database of the human canon.” And although the humanities go mostly unrepresented in that list, that’s probably because the translation of such concepts into numbers isn’t always intuitive. But researchers in other areas can at least appreciate its usefulness by analogy. When I think of how I use Google as a creative tool, it’s less to find specific information than to unearth connections—as when I spent a month looking up pairs of concepts like “Dadaism” and “Vehmgericht” to populate the conspiracy theory in The Icon Thief—or to verify a hunch I’ve already had. (As E.L. Doctorow once put it: “[Research] involved finding a responsible source for the lie I was about to create, and discovering that it was not a lie, which is to say someone else had thought of it first.”) Sloane’s encyclopedia essentially allows mathematicians and scientists to do the same, once they’ve converted their ideas into a searchable sequence, which can be a useful exercise in itself. And even if you aren’t in one of those fields, a few minutes browsing in OEIS is enough to remind you of how large the world is, how patterns can emerge in unexpected places, and how the first step to insight is making sure that those connections are accessible.
