Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Origins of Genius’

The memory of persistence


In Origins of Genius, which is one of my favorite books on creativity, the psychologist Dean Simonton makes an argument that I’ve tried to bear in mind ever since I first read it. While discussing the problem of creative productivity, Simonton states emphatically: “If the number of influential works is directly proportional to the total number of works produced, then the creators with the most masterpieces will be those with the most ignored and neglected products! Even the most supreme creative genius must have their careers punctuated with wasted efforts.” After quoting W.H. Auden, who observes that a major poet will tend to write more bad poems than a minor one, he continues:

If the creative genius is generating failures as well as successes, this seems to support the assumption that the creative process is to a certain extent blind. Even the greatest creators possess no direct and secure path to truth or beauty. They cannot guarantee that every published idea will survive further evaluation and testing at the hands of audiences or colleagues. The best the creative genius can do is to be as prolific as possible in generating products in the hope that at least some subset will survive the test of time.

This still ranks as one of the most significant insights into the creative process that I’ve ever seen, and Simonton sums it up elsewhere, like a true poet, in a form that can be easily remembered: “Quality is a probabilistic function of quantity.”

Simonton has a new book out this week, The Genius Checklist, with a long excerpt available on Nautilus. In the article, he focuses on the problem of intelligence tests, and in particular on two cases that point to the limitations of defining genius simply as the possession of a high IQ. One revolves around Lewis M. Terman, the creator of the modern intelligence scale, who had the notion of testing thousands of students and tracking the top performers over time. The result was an ongoing study of about 1,500 men and women, known as the “Termites,” some of whom are still alive today. As Simonton notes, the results didn’t exactly support Terman’s implicit assumptions:

None of [the Termites] grew up to become what many people would consider unambiguous exemplars of genius. Their extraordinary intelligence was channeled into somewhat more ordinary endeavors as professors, doctors, lawyers, scientists, engineers, and other professionals…Furthermore, many Termites failed to become highly successful in any intellectual capacity. These comparative failures were far less likely to graduate from college or to attain professional or graduate degrees, and far more likely to enter occupations that required no higher education whatsoever…Whatever their differences, intelligence was not a determining factor in those who made it and those who didn’t.

Terman also tested two future Nobel laureates—Luis Alvarez and William Shockley—who were rejected because they didn’t score highly enough. And Simonton notes that neither James Watson nor Richard Feynman, whose biography is actually called Genius, did well enough on such tests to qualify for Mensa.

Even if you’re a fan of Marilyn vos Savant, this isn’t particularly surprising. But I was even more interested in Simonton’s account of the work of Catharine Cox, Terman’s colleague, who decided to tackle the problem from the opposite direction—by starting with a list of known luminaries in all fields and trying to figure out what their tested IQs would have been, based solely on biographical information. This approach has obvious problems as well, of course, but her conclusion, which appears in her book The Early Mental Traits of Three Hundred Geniuses, seems reasonable enough: “High but not the highest intelligence, combined with the greatest degree of persistence, will achieve greater eminence than the highest degree of intelligence with somewhat less persistence.” And in her discussion of qualities that seem predictive of success, persistence is prominently mentioned:

We may conclude that the following traits and trait elements appearing in childhood and youth are diagnostic of future achievement: an unusual degree of persistence—tendency not to be changeable, tenacity of purpose, and perseverance in the face of obstacles—combined with intellective energy—mental work bestowed on special interests, profoundness of apprehension, and originality of ideas—and the vigorous ambition expressed by the possession to the highest degree of desire to excel.

Cox concludes: “Achievements…are not the accidents of a day. They are the natural outgrowth in individuals of superior general powers of persistent interest and great zeal combined with rare special talents.”

If we really want to identify the geniuses of the future, it seems, we should look for persistence as well as intelligence, and we might even be tempted to develop a test that would gauge a student’s “tenacity of purpose.” The ability to remain focused in the face of failures and setbacks is clearly related to Simonton’s rule about quality and quantity, which implies that a genius, to borrow John Gardner’s definition of the true writer, is someone who doesn’t quit. But there’s an even more important point to be made here. As I noted just the other day, it’s easier to fail repeatedly when you occupy a social position that protects you to some extent from the consequences. It can be hard to be “as prolific as possible in generating products” when even one mistake might end your creative journey forever. And our culture has been far more forgiving of some categories of people than of others. (In discussing Terman’s results, Simonton makes the hard decision to omit women from the group entirely: “We’re talking only of the males here, too. It would be unfair to consider the females who were born at a time in which all women were expected to become homemakers, no matter how bright.” And he might also have cited the cultural pressures that discourage a woman from taking risks that are granted to a man.) When you look at lists of canonical geniuses, like the authors of the great books, they can start to seem maddeningly alike—and if we define privilege in part as the freedom to make repeated mistakes, it’s no wonder. Over time, this also reduces the diversity of the ideas that are available for cultural selection, which can lead to a crisis in itself. The only solution is to increase the range of voices, and it isn’t easy. In the absence of such advantages, even the individuals who beat the odds must have been confronted at every turn by excellent reasons to give up. But nevertheless, they persisted.

Thinking in pictures


Last weekend, at the Printers Row Lit Fest in Chicago, I picked up a copy of Life: A User’s Manual by Georges Perec, a novel I’d been meaning to read for a long time. I’d been interested in Perec ever since reading about his work in Douglas Hofstadter’s Le Ton Beau de Marot, and while I’ve only begun dipping into Life, I’m already intrigued by the riches on display. As described in greater detail here, Life is an ambitious experimental novel, centered on a fictional apartment block in Paris, that Perec constructed using a system designed to generate a random list of items (an activity, a position of the body, a writer, even the number of pages) for each chapter, which he then had to incorporate into the narrative. The result, as Perec put it, is a “machine for inspiring stories.” Even apart from the merits of the novel itself, I find this premise tremendously exciting.

Regular readers of this blog know that one of my ongoing obsessions is finding new ways to insert randomness and constraints into the writing process. Writing a novel, at least as I tend to approach it, is such a left-brained activity that it’s necessary to create opportunities for the right brain to participate. Sometimes this happens by accident—while shaving, for example. But there are also ways of approaching randomness more deliberately. I’ve published stories based on juxtapositions of two unrelated articles from science magazines, used random selections from Shakespeare and the I Ching to guide chapters (although I’ve mostly dropped the latter, despite the fun of throwing the coins), and used mind maps to bind all these elements together. And I’m looking forward to applying some of Perec’s techniques to my own work, although probably in a much more limited sense.

Recently, I’ve also discovered another approach that might prove useful. In Origins of Genius (which, in case you haven’t noticed already, is one of the most stimulating books on creativity I’ve read in a long time), Dean Simonton describes a fascinating experiment by psychiatrist Albert Rothenberg:

[Rothenberg] and a colleague began by making up a set of visual stimuli that involved the superimposition of visual images. For example, one contained a photograph of an empty French four-poster bed placed in a period room superimposed over a group of soldiers in combat who were taking cover behind a tank. These highly incongruous homospatial images were then shown to writers and to artists, the latter including individuals selected in a national competition by faculty at the Yale School of Art. The writers were instructed to create new metaphors inspired by the stimuli, while the artists were instructed to make pastel drawings. In comparison with the control group (e.g., subjects who saw the images only separately), individuals exposed to these visual juxtapositions of unrelated images generated more creative products, as judged by independent raters.

In other words, juxtapositions of two unrelated concepts often result in ideas that would not have arisen from considering the two concepts separately, which only confirms one of my most basic convictions about the creative process.

What I find particularly interesting about Rothenberg’s experiment, though, is that the stimuli consisted of images, rather than words, which seems like an especially promising way of encouraging nonverbal, creative thought. With that in mind, I’ve started to incorporate a similar method into my own work, using images randomly chosen from three books that seem ideally suited for such an approach: Phaidon’s charming little volumes The Art Book, The Photo Book, and The 20th Century Art Book. Each book consists of representative works by five hundred artists, one work to a page, arranged in alphabetical order—an arbitrary system that already lends itself to startling juxtapositions. For instance, in The Photo Book, by an accident of the alphabet, “A Sea of Steps” by Frederick H. Evans appears across from “Washroom in the Dog Run” by Walker Evans, exposing their haunting visual similarities. Two images, taken together, yielding a meaning that neither would have apart—that’s what art is all about, and why I’m looking forward to thinking more with pictures.
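For the curious, this kind of aleatory pairing is trivial to mechanize. Here’s a minimal sketch in Python—the pool of titles below is a stand-in for the alphabetized pages of an art book, not an actual index of the Phaidon volumes:

```python
import random

# Placeholder pool standing in for the pages of an art book;
# any list of works would do.
works = [
    "A Sea of Steps",
    "Washroom in the Dog Run",
    "The Steerage",
    "Moonrise, Hernandez",
    "Pepper No. 30",
    "Migrant Mother",
]

def juxtapose(pool, rng=random):
    """Pick two distinct works at random to serve as a creative prompt."""
    return tuple(rng.sample(pool, 2))

pair = juxtapose(works)
print(f"Prompt: consider '{pair[0]}' alongside '{pair[1]}'")
```

The machine supplies only the collision; the meaning, as with Perec’s system, still has to come from the writer.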

Should a writer go to college?


A few years ago, I woke up with the startling realization that of all my friends from college, I was by far the least educated. I don’t mean that in any kind of absolute sense, but simply as a matter of numbers: most of my college friends went on to get master’s or professional degrees, and many of them have gone much further. By contrast, I, who loved college and would happily have spent the rest of my life in Widener Library, took my bachelor’s degree and went looking for a job, with the idea that I’d go back to school at some point after seeing something of the larger world. The reality, of course, was very different. And while I don’t regret any of the choices I’ve made, I do sometimes wonder if I might have benefited from, or at least enjoyed, some sort of postgraduate education.

Of course, it’s also possible that even my bachelor’s degree was a bad investment, a sentiment that seems increasingly common these days. College seniors, we’re frequently reminded, are graduating into a lousy job market. As Louis Menand points out in this week’s New Yorker, it’s unclear whether the American college system is doing the job it’s intended to do, whether you think of it primarily as a winnowing system or as a means of student enrichment. And then we have the controversial Thiel Fellowship, which is designed to encourage gifted entrepreneurs to drop out of college altogether. One of the fellowship’s first recipients recently argued that “higher education is broken,” a position that might be easier to credit if he weren’t nineteen years old and hadn’t just received a $100,000 check to drop out of school. Which doesn’t necessarily make him wrong.

More interesting, perhaps, is the position of David Mamet, whose new book The Secret Knowledge includes a remarkable jeremiad against the whole idea of a liberal education. “Though much has been made of the necessity of a college education,” Mamet writes, “the extended study of the Liberal Arts actually trains one for nothing.” Mamet has said this before, most notably two years ago in a speech at Stanford University, where he compared the process of higher education to that of a laboratory rat pulling a lever to get a pellet. Of course, he’s been saying the same thing for a long time with respect to the uselessness of education for playwrights (not to mention ping-pong players). And as far as playwrights are concerned, I suspect he may be right, although he gets into trouble when he tries to expand the argument to everyone else.

So is college useful? In particular, is it useful for aspiring members of the creative class? Anecdotal information cuts both ways: for every Tom Stoppard, who didn’t go to college at all, there’s an Umberto Eco, who became a famous novelist after—and because of—a lifetime of academic achievement. Considered objectively, though, the answer seems to lie somewhere in the middle. In Origins of Genius, Dean Simonton writes:

Indeed, empirical research has often found that achieved eminence as a creator is a curvilinear, inverted-U function of the level of formal education. That is, formal education first increases the probability of attaining creative success, but after an optimum point, additional formal education may actually lower the odds. The location of this peak varies according to the specific type of creativity. In particular, for creators in the arts and humanities, the optimum is reached in the last two years of undergraduate instruction, whereas for scientific creators the optimum may be delayed until the first couple of years of graduate school. [Italics mine.]

Which implies that a few years of higher education are useful for artists, since they expose them to interesting people and give them a basic level of necessary knowledge, but that too much is unhelpful, or even damaging, if it encourages greater conformity. The bottom line, not surprisingly, is that if you want to be a writer, yes, you should probably go to college. But that doesn’t mean you need to stay there.
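Simonton’s inverted-U is easy to picture as a toy curve. The sketch below is purely illustrative—the functional form and its scale are invented, and only the rough peak locations (late undergrad for the arts, early graduate school for the sciences) follow his summary:

```python
# Toy inverted-U: predicted "eminence" rises with years of formal
# education up to a field-specific optimum, then declines.
# The quadratic shape and its scale are invented for illustration;
# only the approximate peak locations follow Simonton's summary.

def eminence(years, peak):
    """A simple inverted-U centered on the optimal years of schooling."""
    return max(0.0, 1.0 - ((years - peak) / peak) ** 2)

arts_peak, science_peak = 15, 18  # roughly late undergrad vs. early grad school

for y in (12, 15, 18, 22):
    print(y, round(eminence(y, arts_peak), 2), round(eminence(y, science_peak), 2))
```

Past the optimum, each additional year of schooling pushes the predicted value back down—which is the whole point of the curve.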

Let us now forget famous men


“More books have been written about [Lincoln] than any figure in human history, with the possible exception of Jesus Christ.”

The photo above was taken three years ago by my then girlfriend, now wife, at the Abraham Lincoln Presidential Library and Museum in Springfield, Illinois. I didn’t get to go, alas—I was living in New York at the time—but the museum, as I was endlessly informed over the next few days, is tons of fun, with elaborate dioramas of the White House, Ford’s Theater, and other family-friendly attractions, including life-size figures of the entire Lincoln clan. When I saw the text of the plaque above, though, I was outraged, for reasons that might seem hard to understand at first. Here’s my verbatim response, at least as well as I can remember: “What about Napoleon?” I demanded. “What about Napoleon?”

You see, I like Napoleon. I like him a lot. Twenty or so books about Napoleon line my shelves, and I’m always on the lookout for more, the older and more adulatory, the better. Why? Emerson’s essay from Representative Men provides a decent starting point, but the short answer is that Napoleon is the most fascinating person I know in world history—”among the most perceptive, penetrating, retentive, and logical minds ever seen in one who was predominantly a man of action,” as Will Durant nicely puts it. He’s the foremost figure of Western history, a man who, for all his flaws, embodies more than any other individual the limits of human energy, intelligence, and ambition. And I was pretty sure that more books had been written about him than anyone else, including Lincoln.

And yet here’s the thing. Napoleon came from almost nothing, and became emperor of Europe. At his coronation, he took the crown out of the Pope’s hands and placed it on his own head. He was, by almost any measure, the most purely productive human being who ever lived. But these days, all that most people can say about Napoleon, if they recognize the name at all, is that he was a short little guy with a funny hat. (Not that short, by the way: he was 5 feet, 7 inches, or roughly the height of Tom Cruise.) That’s what time does: it reduces even the most monumental figures to caricatures of themselves. Two centuries is all it took to turn the leading light of Western civilization into Ian Holm in Time Bandits. It will happen to Lincoln, too, if it hasn’t already happened.

Napoleon, of course, isn’t alone. I was recently reminded of this whole kerfuffle while reading Dean Simonton’s Origins of Genius, inspired by the Malcolm Gladwell article I mentioned last week. Simonton mentions the work of the psychologist James McKeen Cattell, who, back in 1903, made one of the first systematic attempts to rank the thousand most eminent men in history—there were hardly any women on his list—by toting up mentions in major biographical dictionaries and tabulating the results. Here’s his top hundred:

Napoleon, Shakespeare, Mohammed, Voltaire, Bacon, Aristotle, Goethe, Julius Caesar, Luther, Plato, Napoleon III, Burke, Homer, Newton, Cicero, Milton, Alexander the Great, Pitt, Washington, Augustus, Wellington, Raphael, Descartes, Columbus, Confucius, Penn, Scott, Michelangelo, Socrates, Byron, Cromwell, Gautama, Kant, Leibnitz, Locke, Demosthenes, Mary Stuart [the only woman on the list], Calvin, Moliere, Lincoln, Louis Philippe, Dante, Rousseau, Nero, Franklin, Galileo, Johnson, Robespierre, Frederick the Great, Aurelius, Hegel, Petrarch, Horace, Charles V (Germany), Mirabeau, Erasmus, Virgil, Hume, Guizot, Gibbon, Pascal, Bossuet, Hobbes, Swift, Thiers, Louis XIV, Wordsworth, Louis XVI, Nelson, Henry VIII, Addison, Thucydides, Fox, Racine, Schiller, Henry IV (France), W. Herschel, Tasso, Jefferson, Ptolemy, Claudius, Augustine, Pope, Machiavelli, Swedenborg, Philip II, Leonardo da Vinci, George III, Julian, Pythagoras, Macaulay, Rubens, Burns, Mozart, Humboldt, Comte, Cousin, Cuvier, Justinian, Euripides, Camoens.

Now, much of this list remains unimpeachable. The top ten, in particular, would presumably be very similar today, though Bacon would probably give place to Newton, and we’d need to find room for Einstein and, yes, Lincoln. (Also, hopefully, for some women. The only other women, besides Mary Queen of Scots, to make Cattell’s top two hundred were Elizabeth and Joan of Arc, although, at this rate, it’s only a matter of time before we see Sarah Palin.) But with all due respect to my French readers, when I see names like Guizot, Bossuet, Thiers, Comte, and Cousin, among others, my only response is a blank stare. And this is coming from someone who loves Napoleon.

All in all, though, Cattell’s list reminds us how quickly even major reputations can fade. (For an even more sobering reminder, look no further than the bottom of his top thousand. Fauriel, Enfantin, Babeuf, anyone?) And I have no doubt that a contemporary list of the top hundred figures in history, like this one, will look equally strange to a reader a century from now. Just because you made the list once, it seems, doesn’t mean you’ll stay there.

Of mouses and men


Quality is a probabilistic function of quantity.
Dean Simonton

Malcolm Gladwell’s nifty article on the evolution of the computer mouse in this week’s New Yorker is a terrific read—nobody, but nobody, is better at this sort of thing than Gladwell, which has made him deservedly rich and famous. It’s also, somewhat surprisingly, the most valuable take on the creative process I’ve seen in a long time. I’ve always been interested in the affinities between the artistic process and the work of scientists and engineers, and Gladwell makes the useful point that what most creative geniuses in both fields have in common is their extraordinary productivity. His primary example is Gary Starkweather, the legendary Xerox PARC engineer and inventor of the laser printer, whose creativity was directly linked to the sheer number of his ideas. And in a paragraph that I want to clip out and put in my wallet, Gladwell writes:

The difference between Bach and his forgotten peers isn’t necessarily that he had a better ratio of hits to misses. The difference is that the mediocre might have a dozen ideas, while Bach, in his lifetime, created more than a thousand full-fledged musical compositions. A genius is a genius, [Dean] Simonton maintains, because he can put together such a staggering number of insights, ideas, theories, random observations, and unexpected connections that he almost inevitably ends up with something great.

Gladwell concludes with the Simonton quotation cited at the start of this post, which qualifies, to my mind, as one of the great aphorisms—that is, as a startling reminder of something that should be blindingly obvious. Simonton, incidentally, is a professor of psychology at UC Davis and the author of Origins of Genius: Darwinian Perspectives on Creativity, the subtitle of which refers not to the struggles of genius against genius, as one might think, but to the natural selection of ideas. In nature, natural selection is the result of a Malthusian competition within a large population for limited resources, and it stands to reason that the fittest ideas might arise in a similar fashion. As Simonton says:

Even the greatest creators possess no direct and secure path to truth or beauty. They cannot guarantee that every published idea will survive further evaluation and testing at the hands of audiences or colleagues. The best the creative genius can do is to be as prolific as possible in generating products in the hope that at least some subset will survive the test of time. [Italics mine.]

Which seems obvious enough: most of our greatest artists, from Shakespeare to Picasso, were monsters of productivity, as were nearly all of our great scientists, like Newton. But even more interesting is the point to which Gladwell alludes above, and what Simonton elsewhere calls the “equal odds” rule—that the ratio of total hits to total attempts “tends to stay more or less constant across creators.” Which is to say that if a creative individual of any kind wants to generate more good ideas, the solution isn’t to improve one’s hit rate, but to produce more ideas overall. Productivity is the mother of creativity, providing the necessary conditions for lasting ideas to emerge. Which is something, I think, that most artists already intuitively grasp. Thanks to Simonton and Gladwell, we’re a little closer to understanding why.
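The equal-odds rule lends itself to a quick simulation. In the sketch below, every creator draws from the same fixed hit rate—the ten percent figure and the career sizes are invented for illustration, not drawn from Simonton’s data:

```python
import random

def career(n_works, hit_rate=0.1, rng=random):
    """Simulate a career as n_works independent attempts, each a
    'hit' with the same fixed probability (the equal-odds rule)."""
    return sum(1 for _ in range(n_works) if rng.random() < hit_rate)

rng = random.Random(0)
minor = [career(20, rng=rng) for _ in range(1000)]    # a dozen-odd works
major = [career(1000, rng=rng) for _ in range(1000)]  # Bach-scale output

avg_minor = sum(minor) / len(minor)
avg_major = sum(major) / len(major)

# The hit *ratio* is identical for both kinds of creator; only the
# total number of attempts differs, so total hits scale with output.
print(avg_minor, avg_major)
```

Under these assumptions, the prolific creator ends up with dozens of times more masterpieces—and dozens of times more failures—without being any more accurate per attempt, which is Simonton’s point in miniature.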

Written by nevalalee

May 16, 2011 at 9:52 am
