Posts Tagged ‘Sigmund Freud’
The ocean swell and the wave
“A group impresses the individual as being an unlimited power and an insurmountable peril,” Sigmund Freud writes in Group Psychology and the Analysis of the Ego, which was published in 1921. After this apparently paradoxical statement, he continues:
For the moment [the subgroup] replaces the whole of human society, which is the wielder of authority, whose punishments the individual fears, and for whose sake he has submitted to so many inhibitions. It is clearly perilous for him to put himself in opposition to it, and it will be safer to follow the example of those around him and perhaps even “hunt with the pack.” In obedience to the new authority he may put his former “conscience” out of action, and so surrender to the attraction of the increased pleasure that is certainly obtained from the removal of inhibitions. On the whole, therefore, it is not so remarkable that we should see an individual in a group doing or approving things which he would have avoided in the normal conditions of life.
As Peter Gay reminds us in his valuable book Freud for Historians, this was hardly a novel insight: “Freud was by no means the first to note that collective bodies—a mob in action, an army in battle, a nation at war—yield to impulses that their members would normally control, probably disclaim, when they are not enjoying the embracing presence of likeminded believers around them.” And he goes on to note that these speculations became the particular object of study, “for highly visible political reasons,” starting around the middle of the nineteenth century.
It seems safe to say that we’re entering a period in which such questions will soon be pondered again at length, and for equally visible reasons. But it’s worth considering what Freud in particular says about the subject, precisely because his perspective has been so unfashionable for so long. As Gay observes in a book that first came out more than thirty years ago: “The last traces of Freud’s notions about the ‘racial’ mind or inherited collective psychological dispositions have been weeded out by his successors as redundant, almost embarrassing reminders of nineteenth-century scientific superstitions about a ‘group’ soul.” But just as with almost everything else that Freud wrote, his most dated speculations are studded with moments of blinding insight. For instance, he draws an important distinction between two kinds of groups:
A number of very different structures have probably been merged under the term “group” and may require to be distinguished…[Some are] groups of a short-lived character, which some passing interest has hastily agglomerated out of various sorts of individuals. The characteristics of revolutionary groups, and especially those of the great French Revolution, have unmistakably influenced [such] descriptions. The opposite opinions owe their origin to the consideration of those stable groups or associations in which mankind pass their lives, and which are embodied in the institutions of society. Groups of the first kind stand in the same sort of relation to those of the second as a high but choppy sea to a ground swell.
Disruptive social movements, in other words, ride on the back of more established organizations—the church, the military, the marketplace—that already exist, and which in most cases will continue long after the most tumultuous waves have vanished. And few of the violent, destructive, even libidinal forces that can disrupt a society could take shape if such support structures weren’t there to facilitate the process.
And Freud’s great insight is that while these institutions may look rational and orderly on the outside, they also provide a framework that allows for irrational behavior, as soon as enough individuals are willing to surrender most of the qualities that prevent them from joining the herd. Freud writes: “An individual in a group is subjected through its influence to what is often a profound alteration in his mental activity. His liability to affect becomes extraordinarily intensified, while his intellectual ability is markedly reduced, both processes being evidently in the direction of an approximation to the other individuals in the group; and this result can only be reached by the removal of those inhibitions upon his instincts which are peculiar to each individual, and by his resigning those expressions of his inclinations which are especially his own.” The italics are mine. A group becomes most effective when its members transform themselves into approximations of one another, defined by a shared set of rules, which means giving up all the inhibitions and inclinations that we’ve built up to set ourselves apart. This can be exhilarating in the moment, but the aftermath is often devastating, as Gay notes:
Hunting with the pack provides the kind of pleasure that such surrender of inhibitions usually gives; it generates a feeling of safety and skirts the danger of placing oneself into opposition to the powerful. Freud saw this abandonment of adult controls and perspectives as a luxuriant saturnalia of regression. But, for all its seductive pleasures, such an affect-laden moral holiday is rarely destined to be permanent. After prolonged reverses or in moments of panic, the libidinal ties holding the crowd together can weaken and the group may then splinter and disintegrate.
In the meantime, Gay writes, we can postpone this moment of disintegration using “two sets of unconscious identifications” that provide us with the energy that we’ve given up as individuals: “The members of the group identify with one another and, collectively, with the leader.”
This certainly sounds familiar today, at a time when the waves on the surface have grown so violent that it can be hard to make out anything deeper. (One of the first signs is the emergence of indefensible moral positions among those who would police the morality or patriotism of others, who can become shockingly willing to abandon their fundamental values for the sake of winning the fight of the moment. They see only the storm, not the sea. And another sign is the sudden, inexplicable capitulation of men and women who have defined themselves in the past as mavericks.) And perhaps the most useful insight that we can take from Freud is the close connection between anxiety and policy, each of which feeds off the energy of the other. “The pursuit of rational self-interest has its non-rational components,” Gay dryly notes, illustrating his point with the three-class electoral system that was enacted in Prussia in the nineteenth century, which allocated political representation based on the amount of taxes paid. It resulted in what one historian describes as “an outright plutocratic system” that concentrated power in the upper classes, but as Gay notes, it wasn’t just a matter of cold calculation:
This bit of electoral chicanery elevated into a constitutional principle was, at the same time, an astute defensive device. Sensitive to possible threats from self-confident middle-class citizens and the slowly awakening political awareness of the urban working classes, sensitive to intimations of democracy abroad and of revolution at home, the authors of the three-class electoral law helped to exorcise the anxieties of rich and influential Prussians. It will not do to simply dismiss this political stratagem as a cynical, wholly conscious defense of cherished privileges. A way of life, of traditional, once secure domestic and social pleasure, seemed at stake.
And at a time when the surface wave threatens to destroy everything in its path, Gay ends with a warning to historians that applies equally well to the rest of us: “To neglect the policy by concentrating on the anxiety is to reduce history, unduly, to a mere psychodrama; to neglect the anxiety by concentrating on the policy—which is far more likely to happen to historians—is to flatten, unduly, one’s perception of the past.” Or the present.
Quote of the Day
It may perhaps seem to you as though our theories are a kind of mythology and, in the present case, not even an agreeable one. But does not every science come in the end to a kind of mythology like this?
My great books #4: The White Goddess
Note: I’m counting down my ten favorite works of nonfiction, in order of the publication dates of their first editions, and with an emphasis on books that deserve a wider readership. You can find the earlier installments here.
One of the odd but recurrent patterns of intellectual history is that a false hypothesis proposed by a genius is often more rewarding—or at least generates more useful material, almost by accident—than a correct one offered up by an ordinary mortal. James Frazer’s theory about the priestly succession at Nemi has been rejected by most anthropologists, but without it, we wouldn’t have The Golden Bough, which is still the greatest repository of information and insight ever published on magic, ritual, and religion. You could say much the same about the theories of Freud. And while I no longer believe in the details, or even the general outline, of the historical argument that Robert Graves makes in The White Goddess, I wouldn’t give up the resulting book for the world. It reads today like the kind of conspiracy theory we find in a Dan Brown novel, although infinitely more ingenious, and even Graves knew that orthodox scholars were unlikely to embrace his work: “Though they cannot refute it, they dare not accept it.” For the general reader, fortunately, it doesn’t really matter, because The White Goddess is unsurpassed as a lucky bag of lore, ideas, and clues for other writers to take up and pursue. I’ve found myself browsing through it whenever I start a new writing project, if only on the off chance that one of Graves’s asides or digressions will spark a train of thought that never would have occurred to me otherwise.
Read with an appropriately skeptical mind, The White Goddess is still the best entry point for the intelligent reader on a dizzying range of subjects: Celtic mythology, poetic logic, the interpretation or decoding of mythic and religious iconography, the relationship between the poet and the muse, and the role of intuition in the creative process. The difficulty of his hypothesis forced Graves to range further and delve more deeply than a scholar making a more conventional case, and the material that he tosses up casually along the way has stuck with me longer than his primary argument. (I was first attracted to the book by its back cover’s promise to provide practical answers to countless unsolved riddles of the ancient world, including Thomas Browne’s “What song the sirens sang” or “What name Achilles assumed when he hid among women”—not to mention how to untie the Gordian knot, which Graves handles in a single footnote. And his “solution” to the vision of Ezekiel lies at the heart of my novel City of Exiles.) In the end, it stands as an illustration both of intuition’s possibilities and of its limits, although it also makes mere reason seem cramped by comparison. In his poem in praise of the goddess herself, Graves speaks of “tourbillions in Time made / By the strong pulling of her bladed mind / Through that ever-reluctant element.” “Bladed mind” is really a description of Graves himself, and the tourbillions, or whirlwinds, that he created in his intractable material continue to revolve in my imagination, long after more reasonable books have faded away.
Two ways of looking at the goddess
Over the last few days, I’ve been rereading Shakespeare and the Goddess of Complete Being by the poet Ted Hughes, which is one of the strangest books ever published by a major author. Hughes believed that he had uncovered the formula—which he calls the Tragic Equation—that underlies all of Shakespeare’s mature plays, and he introduces his argument in terms that would make any writer sit up and pay attention:
The immediate practical function of this equation is simply to produce, with unfailing success, an inexhaustibly interesting dramatic action…[Shakespeare] was, after all, part theater owner, part manager, part worker, part supplier of raw materials, and full-time entrepreneur in a precarious yet fiercely demanding industry. Whether it was an old play rejigged or a new piece, it had to work. Maybe, under those pressures, it was inevitable that he should do as other hack professionals have always done, and develop one or two basic reliable kits of the dynamics that make a story move on the stage.
Hughes goes on to describe the formula as “the perfect archetypal plot, one that would guarantee basic drive.” And if you regard Shakespeare as our supreme maker of plots—an aspect of his work that has often been neglected—it’s hard not to feel excited by the prospect of a poet like Hughes reducing his method to a tool that can be grasped or reproduced.
Unfortunately, or inevitably, the core argument turns out to be insanely convoluted. According to Hughes, Shakespeare’s archetypal plot arose from the fusion of two of his early poetic works, Venus and Adonis and The Rape of Lucrece. The plot, as far as I understand it, is that the hero is courted by the goddess, either in the form of Aphrodite, the ideal bride, or Persephone, the queen of hell; he rejects her advances; she kills him in the guise of a wild boar; he descends to the underworld; and finally he “pupates” into a form that rises again to slay the goddess in turn, motivated by a horror of her sexuality. (Hughes also relates this myth to the struggle between Catholicism and Puritanism in Shakespeare’s time, to the myth of Osiris, to Rosicrucianism, and to the cabala, all of which only muddy the issue further.) The trouble, at least when it comes to applying this reading to all of Shakespeare’s plays, is that Hughes reassigns and shuffles the elements of the equation so freely that they lose all meaning or specificity. Sometimes the boar is the dark side of the hero himself, or a usurping brother, or even an entire city; in Macbeth, the goddess is Scotland, as well as the witches and Lady Macbeth; in Othello, it’s Desdemona’s handkerchief. And in attempting to make everything fit, Hughes ends up explaining almost nothing.
Yet it’s still a book that I regard with a lot of respect and affection. Isolated insights and metaphors flash forth like lightning on the page, and even if the argument tells us more about Hughes than about Shakespeare, every paragraph pulsates with life. As the title implies, his book is greatly indebted to The White Goddess by Robert Graves, which Hughes elsewhere cited as a major influence on his thinking, and both books offer the fascinating prospect of a learned and intuitive mind—the kind that appears once in a generation—taking on an impossible argument. And if Graves is still read and discussed, while Hughes’s book remains a curiosity, part of it has to do with their subject matter. Graves centers his argument on a medieval Welsh poem, “The Battle of the Trees,” which few nonspecialist readers are likely to have encountered, while Hughes tackles the most famous writer in the English language, of whose works most readers have already formed an opinion. When Graves takes apart his sources and puts them back together like an enormous crossword puzzle, we’re likely to accept it at face value; when Hughes does the same to Hamlet or King Lear, we resist it, or suspect that he’s imposing a reading, albeit with enormous ingenuity, on a play that can sustain any number of interpretations. In the end, neither book can be accepted uncritically, but they still have the power to light up the imagination.
And in their shared aims, they’re agonizingly important, both to poets and to general readers. Reading them both together, I’m reminded of what Janet Malcolm says about a very different subject in Psychoanalysis: The Impossible Profession:
Soon after the Big Bang of Freud’s major discoveries…the historian of psychoanalysis notes a fork in the road. One path leads outward into the general culture, widening to become the grand boulevard of psychoanalytic influence—the multilane superhighway of psychoanalytic thought’s incursions into psychiatry, social philosophy, anthropology, law, literature, education, and child-rearing. The other is the narrow, inward-turning path of psychoanalytic therapy: a hidden, almost secret byway travelled by few (the analysts and their patients), edged by decrepit mansions with drawn shades (the training institutes and the analytic societies), marked with inscrutable road signs (the scientific papers)…As for Freud himself, he travelled both routes, extending the psychoanalytic view to literature, art, biography, anthropology, and social philosophy…as well as sticking to the theoretical and clinical core of psychoanalysis.
Substitute “poetry” for “psychoanalysis”—or one impossible profession for another—and this is a perfect summary of what both Graves and Hughes are attempting to do: taking the intense, private, inexpressible confrontation of the poet with the muse and extending it into a form that can be applied to how we think about art, history, and our own inner lives. I’m not sure either of them succeeded, any more than Freud did. But the effort still fills me with awe.
The divided self
Last night, I found myself browsing through one of the oddest and most interesting books in my library: Julian Jaynes’s The Origin of Consciousness in the Breakdown of the Bicameral Mind. I don’t know how familiar Jaynes’s work remains among educated readers these days—although the book is still in print after almost forty years—but it deserves to be sought out by anyone interested in problems of psychology, ancient literature, history, or creativity. Jaynes’s central hypothesis, which still startles me whenever I type it, is that consciousness as we know it is a relatively recent development that emerged sometime within the last three thousand years, or after the dawn of language and human society. Before this, an individual’s decisions were motivated less by internal deliberation than by verbal commands that wandered from one part of the brain into another, and which were experienced as the hallucinated voice of a god or dead ancestor. Free will, as we conceive of it now, didn’t exist; instead, we acted in automatic, almost robotic obedience to those voices, which seemed to come from an entity outside ourselves.
As Richard Dawkins writes: “It is one of those books that is either complete rubbish or a work of consummate genius, nothing in between! Probably the former, but I’m hedging my bets.” It’s so outrageous, in fact, that its novelty has probably prevented it from being more widely known, even though Jaynes’s hypothesis seems more plausible—if no less shattering—the more you consider his argument. He notes, for instance, that when we read works like the Iliad, we’re confronted by a model of human behavior strikingly different from our own: as beautifully as characters like Achilles can express themselves, moments of action or decision are attributed to elements of an impersonal psychic apparatus, the thumos or the phrenes or the noos, that are less like our conception of the soul than organs of the body that stand apart from the self. (As it happens, much of my senior thesis as an undergraduate in classics was devoted to teasing out the meanings of the word noos as it appears in the poems of Pindar, who wrote at a much later date, but whose language still reflects that earlier tradition. I hadn’t read Jaynes at the time, but our conclusions aren’t that far apart.)
The idea of a divided soul is an old one: Jaynes explains the Egyptian ka, or double, as a personification of that internal voice, which was sometimes perceived as that of the dead pharaoh. And while we’ve mostly moved on to a coherent idea of the self, or of a single “I,” the concept breaks down on close examination, to the point where the old models may deserve a second look. (It’s no accident that Freud circled back around to these divisions with the id, the ego, and the superego, which have no counterparts in physical brain structure, but are rather his attempt to describe human behavior as he observed it.) Even if we don’t go as far as such philosophers as Sam Harris, who argues that free will doesn’t exist at all, there’s no denying that much of our behavior arises from parts of ourselves that are inaccessible, even alien, to that “I.” We see this clearly in patterns of compulsive behavior, in the split in the self that appears in substance abuse or other forms of addiction, and, more benignly, in the moments of intuition or insight that creative artists feel as inspirations from outside—an interpretation that can’t be separated from the etymology of the word “inspiration” itself.
And I’ve become increasingly convinced that coming to terms with that divided self is central to all forms of creativity, however we try to explain it. I’ve spoken before of rough drafts as messages from my past self, and of notetaking as an essential means of communication between those successive, or alternating, versions of who I am. A project like a novel, which takes many months to complete, can hardly be anything but a collaboration between many different selves, and that’s as true from one minute to the next as it is over the course of a year or more. Most of what I do as a writer is a set of tactics for forcing those different parts of the brain to work together, since no one faculty—the intuitive one that comes up with ideas, the architectural or musical one that thinks in terms of structure, the visual one that stages scenes and action, the verbal one that writes dialogue and description, and the boringly systematic one that cuts and revises—could come up with anything readable on its own. I don’t hear voices, but I’m respectful of the parts of myself I can’t control, even as I do whatever I can to make them more reliable. All of us do the same thing, whether we’re aware of it or not. And the first step to working with, and within, the divided self is acknowledging that it exists.
Beethoven, Freud, and the mystery of genius
“The joy of listening to Beethoven is comparable to the pleasure of reading Joyce,” writes Alex Ross in a recent issue of The New Yorker: “The most paranoid, overdetermined interpretation is probably the correct one.” Even as someone whose ear for classical music is underdeveloped compared to his interest in other forms of art, I have to agree. Great artists come in all shapes and sizes, but the rarest of all is the kind whose work can sustain the most meticulous level of scrutiny because we’re aware that every detail is a conscious choice. When we interpret an ordinary book or a poem, our readings are often more a reflection of our own needs than the author’s intentions; even with a writer like Shakespeare, it’s hard to separate the author’s deliberate decisions from the resonances that naturally emerge from so much rich language set into motion. With Beethoven, Joyce, and a handful of others—Dante, Bach, perhaps Nabokov—we have enough information about the creative process to know that little, if anything, has happened by accident. Joyce explicitly designed his work to “keep professors busy for centuries,” and Beethoven composed for a perfect, omniscient audience that he seemed to will into existence.
Or as Colin Wilson puts it: “The message of the symphonies of Beethoven could be summarized: ‘Man is not small; he is just bloody lazy.'” When you read Ross’s perceptive article, which reviews much of the recent scholarship on Beethoven and his life, you’re confronted by the same tension that underlies any great body of work made within historical memory. On the one hand, Beethoven has undergone a kind of artistic deification, and there’s a tradition, dating back to E.T.A. Hoffmann, that there are ideas and emotions being expressed in his music that can’t be matched by any other human production; on the other, there’s the fact that Beethoven was a man like any other, with a messy personal life and his own portion of pettiness, neediness, and doubt. As Ross points out, before Beethoven, critics were accustomed to talk of “genius” as a kind of impersonal quality, but afterward, the concept shifted to that of “a genius,” which changes the terms of the conversation without reducing its underlying mystery. Beethoven’s biography provides tantalizing clues about the origins of his singular greatness—particularly his deafness, which critics tend to associate with his retreat to an isolated, visionary plane—but it leaves us with as many questions as before.
As it happens, I read Ross’s article in parallel with Howard Markel’s An Anatomy of Addiction, which focuses on the early career of another famous resident of Vienna. Freud seems to have been relatively indifferent to music: he mentions Beethoven along with Goethe and Leonardo Da Vinci as “great men” who have produced “splendid creations,” although this feels more like a rhetorical way of filling out a trio than an expression of true appreciation. Otherwise, his relative silence on the subject is revealing in itself: if he wanted to interpret an artist’s work in psychoanalytic terms, Beethoven’s life would have afforded plenty of material, and he didn’t shy from doing the same for Leonardo and Shakespeare. It’s possible that Freud avoided Beethoven because of the same godlike intentionality that makes him so fascinating to listeners and critics. If we’ve gotten into the habit of drawing a distinction between what a creative artist intends and his or her unconscious impulses, it’s largely thanks to Freud himself. Beethoven stands as a repudiation, or at least a strong counterexample, to this approach: however complicated Beethoven may have been as a man, it’s hard to make a case that there was ever a moment when he didn’t know what he was doing.
This may be why Freud’s genius—which was very real—seems less mysterious than Beethoven’s: we know more about Freud’s inner life than just about any other major intellectual, thanks primarily to his own accounts of his dreams and fantasies, and it’s easy to draw a line from his biography to his work. Markel, for instance, focuses on the period of Freud’s cocaine use, and although he stops short of suggesting that all of psychoanalysis can be understood as a product of addiction, as others have, he points out that Freud’s early publications on cocaine represent the first time he publicly mined his own experiences for insight. But of course, there were plenty of bright young Jewish doctors in Vienna in the late nineteenth century, and while many of the ideas behind analysis were already in the air, it was only in Freud that they found the necessary combination of obsessiveness, ambition, and literary brilliance required for their full expression. Freud may have done his best to complicate our ideas of genius by introducing unconscious factors into the equation, but paradoxically, he made his case in a series of peerlessly crafted books and essays, and their status as imaginative literature has only been enhanced by the decline of analysis as a science. Freud doesn’t explain Freud any more than he explains Beethoven. But this doesn’t stop him, or us, from trying.
The dreamlife of artists
Last week, I finished reading Freud’s Interpretation of Dreams, essentially for the first time. I’ve long been familiar with parts of it, but I’d never managed to work through the entire thing from cover to cover, although I suspected that I’d find it useful as a writer. (David Mamet, for one, recommends it to aspiring screenwriters in On Directing Film, noting that the sequence of cuts in movies has affinities to the procession of imagery in dreams.) What I’ve found is that while Freud’s reputation has taken hit after hit in recent years, the caricatured version of his ideas that most of us have absorbed has little to do with his real body of work. Freud was a frighteningly inventive and perceptive thinker—and also an excellent essayist—who was right about most of the big things, even if many of the particulars, as ingenious as they are, no longer stand up to scrutiny. And nowhere is his originality more clearly on display than in The Interpretation of Dreams, which at its heart is a probing, sometimes difficult, but always enlightening act of literary analysis on the most intractable texts imaginable.
What Freud is really proposing, in fact, is nothing less than a general theory of creativity, except that it happens to take place in a part of the brain that we aren’t used to observing. In Freud’s conception, a dream originates as a repressed wish, often from early childhood, and it doesn’t necessarily need to be sexual in nature: it can reflect other physical and emotional needs, as well as such desires as those for power, respect, fulfillment, or the love and safety of those close to us. This primal wish is united with details from the day before—the more trivial and insignificant the better—that happen to provide raw material for the wish to take on a concrete form. The result is then subjected to several additional processes that make the underlying meaning harder to discern. There’s condensation, in which multiple dream thoughts are fused into a single object or image; displacement, in which an unsettling wish is transferred to something else or transformed into its opposite; and the tendency for dreams to depict abstract concepts and feelings in visual terms, often in bewildering ways that owe more to wordplay and association than to waking logic.
Finally, there’s a kind of editing function involved, an attempt by the brain to rework all of this ungainly material into something resembling a coherent narrative. (Freud notes that this interstitial imagery, as the mind stitches together unfiltered components from the unconscious into a sequence of events, is usually the least convincing part of the dream.) And while I don’t intend to get into a discussion of the overall validity of Freud’s ideas, I can’t help but think that this is a surprisingly accurate account of how the creative process works in waking life. A story or work of art generally originates in deep-seated impulses—ideas and feelings that have been percolating in the artist’s inner life for some time—but it builds itself up from more recent pieces, images or fragments of experience that have lately caught the creator’s eye. These elements from the real world are progressively condensed, displaced, and dramatized in tangible ways. And ultimately, they’re edited and revised, often at more than one stage in the process, so that the result has a logic that wasn’t present in earlier drafts, but at the risk, as Freud identified, of ending up with something calculated and unpersuasive.
Whether this means that creative thought really is a kind of dream, as so many artists have suggested, or that creativity and dreaming are two aspects of the same process exercised at different times, is something I won’t try to settle here. I will say, however, that I’ve grown increasingly convinced of the importance of listening to the lessons that dreams present. (Freud points out, for instance, that dreams often express temporal or causal logic in spatial terms, so that instead of showing one event causing another, the two events are simply shown side by side. This seems like a promising area of exploration for writers, who are often called upon to compress long chains of causality into a single scene or image.) As Freud, Jung, and others have pointed out, our conscious mind is there for a reason: it’s what allows us to form societies, build bridges, and write novels that can be understood by more than one person, and none of this would be possible if we didn’t keep the unconscious under control. Like an analyst, however, a writer needs to make incursions into those deeper levels on a regular basis, while always sustained by diligence and craft, and in both cases, we may find that our dreams can point the way.
“The innovator must be discontented…”
The innovator, however, must in the first place be discontented, he must doubt the value of what he is doing or question the accepted ways of doing it. And secondly, he must be prepared to take fresh paths, to venture into fields where he is by no means expert. This is true, at least, of major forms of innovation; they make it possible for other men to be expert, but are not themselves forms of expertise. Freud was not an expert psychoanalyst; before Freud wrote there was no such thing; he created the standards by which psychoanalysts are judged expert. Neither was Marx an expert in interpreting history in economic terms nor Darwin an expert in evolutionary biology. If a man is trained, purely and simply, to be expert and contented in a particular task he will not innovate; Freud would have remained an anatomist, Marx a philosopher, Darwin a field naturalist.
“First I want to get my own ideas into shape…”
First I want to get my own ideas into shape, then I shall make a thorough study of the literature on the subject, and finally make such insertions or revisions as my reading will give rise to. So long as I have not finished my own work I cannot read, and it is only in writing that I can fill in all the details.
The literature which I am now reading is reducing me to idiocy. Reading is a terrible infliction imposed upon all who write. In the process everything of one’s own drains away. I often cannot remember what I have that is new, and yet it is all new. The reading stretches ahead interminably, so far as I can see at present.
Googling the rise and fall of literary reputations
Note: To celebrate the third anniversary of this blog, I’ll be spending the week reposting some of my favorite pieces from early in its run. This post originally appeared, in a somewhat different form, on December 17, 2010.
As the New York Times recently pointed out, Google’s new online book database, which allows users to chart the evolving frequency of words and short phrases over 5.2 million digitized volumes, is a wonderful toy. You can look at the increasing frequency of George Carlin’s seven dirty words, for example—not surprisingly, they’ve all become a lot more common over the past few decades—or chart the depressing ascent of the word “alright.” Most seductively of all, perhaps, you can see at a glance how literary reputations have risen or fallen over time.
Take the five in the graph above, for instance. It’s hard not to see that, for all the talk of the death of Freud, he’s doing surprisingly well, and even passed Shakespeare in the mid-’70s (around the same time, perhaps not coincidentally, as Woody Allen’s creative peak). Goethe experienced a rapid fall in popularity in the mid-’30s, though he had recovered nicely by the end of World War II. Tolstoy, by contrast, saw a modest spike sometime around the Big Three conference in Tehran, and a drop as soon as the Soviet Union detonated its first atomic bomb. And Kafka, while less popular during the satisfied ’50s, saw a sudden surge in the paranoid decades thereafter:
Obviously, it’s possible to see patterns anywhere, and I’m not claiming that these graphs reflect real historical cause and effect. But it’s fun to think about. Even more fun is to look at the relative popularity of five leading American novelists of the last half of the twentieth century:
The most interesting graph is that for Norman Mailer, who experiences a huge ascent up to 1970, when his stature as a cultural icon was at its peak (just after his run for mayor of New York). Eventually, though, his graph—like those of Gore Vidal, John Updike, Philip Roth, and Saul Bellow—follows the trajectory that we’d suspect for that of an established, serious author: a long, gradual rise followed by a period of stability, as the author enters the official canon. Compare this to a graph of four best-selling novelists of the 1970s:
For Harold Robbins, Jacqueline Susann, Irving Wallace, and Arthur Hailey—and if you don’t recognize their names, ask your parents—we see a rapid rise in popularity followed by an equally rapid decline, which is what we might expect for authors who were once hugely popular but had no lasting value. And it’ll be interesting to see what this graph will look like in fifty years for, say, Stephenie Meyer or Dan Brown, and in which category someone like Jonathan Franzen or J.K. Rowling will appear. Only time, and Google, will tell.
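For anyone who would rather tinker with the data than squint at the web interface, a few lines of Python can produce a rough version of these charts. This is only a sketch, not a documented interface: it leans on the unofficial JSON endpoint that the public Ngram Viewer appears to use behind the scenes, so the URL, parameters, and response format below are assumptions that could change at any time, and it assumes the requests and matplotlib libraries are installed.

```python
# A rough sketch: chart how often a few names appear in Google's book corpus.
# The JSON endpoint below is unofficial and undocumented; treat the URL,
# parameters, and response shape as assumptions rather than a stable API.
import requests
import matplotlib.pyplot as plt

AUTHORS = ["Freud", "Shakespeare", "Goethe", "Tolstoy", "Kafka"]
YEAR_START, YEAR_END = 1900, 2008

resp = requests.get(
    "https://books.google.com/ngrams/json",
    params={
        "content": ",".join(AUTHORS),
        "year_start": YEAR_START,
        "year_end": YEAR_END,
        "smoothing": 3,
    },
    timeout=30,
)
resp.raise_for_status()

# Each entry (if the request succeeds) should carry an "ngram" label and a
# "timeseries" of relative frequencies, one value per year in the range.
for series in resp.json():
    years = range(YEAR_START, YEAR_START + len(series["timeseries"]))
    plt.plot(list(years), series["timeseries"], label=series["ngram"])

plt.ylabel("Relative frequency")
plt.legend()
plt.title("Literary reputations, 1900-2008")
plt.show()
```

Swapping in the bestsellers from the last chart, or a future list headed by Stephenie Meyer and Dan Brown, is just a matter of editing the AUTHORS list and the year range.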
How to repeat yourself
Writers are generally advised not to repeat themselves. After I’ve finished the rough draft of a story, one of my first orders of business is to go back through the manuscript and fix any passages where I’ve inadvertently repeated the same word in the same sentence, or within a short run of text. Knowing how often you can use a word is a matter of taste and intuition. Some words are so common as to be invisible to the reader, so you can, and should, use the word “said” exclusively throughout a story, even as dialogue can usually be varied in other ways. Other words or phrases are so striking that they can’t be used more than once or twice in the course of an entire novel, and I’ll sometimes catch myself maintaining a running count of how often I’ve used a word like “unaccountable.” Then there are the words that fall somewhere in the middle, where they’re useful enough to crop up on a regular basis but catch the reader’s eye to an extent that they shouldn’t be overused. Different writers fall back on different sets of words, and in my case, they tend to be verbs of cognition, like “realized,” or a handful of adverbs that I use entirely too often, like, well, “entirely.”
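None of this is meant to be mechanical, but the crude version of that first editing pass is easy enough to automate. Here is a minimal sketch in Python; the window size, stoplist, and watchlist are arbitrary stand-ins for the taste and intuition described above, not anyone's actual editing rules.

```python
# A minimal sketch of the check described above: flag any non-trivial word
# that repeats within a short span of text, and keep a running count of a
# few personal "watchlist" words. All thresholds here are arbitrary choices.
import re
import sys
from collections import Counter

WINDOW = 40  # how many words apart still counts as "too close"
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "that", "he", "she",
             "it", "was", "his", "her", "said", "i", "you", "with", "for"}
WATCHLIST = {"entirely", "realized", "unaccountable", "suddenly"}

def check(text):
    words = re.findall(r"[a-z']+", text.lower())
    last_seen = {}
    repeats = []
    for i, word in enumerate(words):
        # Report a word if it already appeared within the last WINDOW words.
        if word not in STOPWORDS and word in last_seen and i - last_seen[word] <= WINDOW:
            repeats.append((word, last_seen[word], i))
        last_seen[word] = i
    counts = Counter(w for w in words if w in WATCHLIST)
    return repeats, counts

if __name__ == "__main__":
    with open(sys.argv[1], encoding="utf-8") as f:
        repeats, counts = check(f.read())
    for word, first, second in repeats:
        print(f"'{word}' repeats within {second - first} words of itself")
    for word, n in counts.most_common():
        print(f"watchlist: '{word}' appears {n} time(s)")
```

Run against a draft, it prints every close repetition it finds, along with a tally of the watchlist words, which is roughly the running count I find myself keeping in my head.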
Whenever I’m sifting through the story like this, part of me wonders whether a reader would even notice. Some of these repetitions jar my ear to a greater extent than they would for someone reading the story more casually: I’ve often revisited these pages something like fifty times, and I’m acutely aware of the shape of each sentence. (Overfamiliarity can have its pitfalls as well, of course: I’m sometimes shocked to discover a glaring repetition in a sentence that I’ve read over and over until I can no longer really see it.) But I encounter this issue often enough in other authors’ books that I know it isn’t just me. Catching an inadvertent repetition in a novel, as when Cormac McCarthy speaks twice in Blood Meridian of something being “footed” to its reflection, has the same effect as an unintentional rhyme: it pulls you momentarily out of the story, wondering if the writer meant to repeat the same word or if he, or his editor, fell asleep at the switch. And a particularly sensitive eye can pick up on repetitions or tics that even an attentive reader might miss. In his otherwise fawning study U & I, Nicholson Baker complains about John Updike’s overuse of the verb “seemed,” which even I, a massive Updike fan, hadn’t noticed until Baker pointed it out.
But repetitions can also be a source of insight, especially when you’re coming to grips with an earlier draft. A writer can learn a lot from the words he habitually overuses. If you find yourself falling back on melodramatic adverbs like “suddenly,” you might want to rethink the tone you’re taking—it’s possible that you’re trying to drum up excitement in a story that lacks inherent dramatic interest. My own overuse of verbs like “realized” might indicate that I’m spending too much time showing characters thinking through a situation, rather than conveying character through action. You can learn even more from longer phrases that reappear by accident. As John Gardner writes in The Art of Fiction, discussing a hypothetical story about Helen of Troy:
Reading…lines he has known by heart for weeks, [the writer] discovers odd tics his unconscious has sent up to him, perhaps curious accidental repetitions of imagery: The brooch Helen threw at Menelaus the writer has described, he discovers, with the same phrase he used in describing, much later, the seal on the message for help being sent to the Trojans’ allies. Why? he wonders. Just as dreams have meaning, whether or not we can penetrate the meaning, the writer assumes that the accidents in his writing may have significance.
And the comparison to dreaming is a shrewd one. “Repetitions are magic keys,” Umberto Eco writes in Foucault’s Pendulum, and although he’s talking about something rather different—a string of sentences randomly generated by a computer—there’s a common element here. When you write a first draft, you’re operating by instinct: you accept the first words that come to mind, rather than laboriously revising the text, because you’re working in a mode closer to the events of the story itself. At its best, it’s something like a dream, and the words we select have a lot in common with the unmediated nature of dream imagery or word association in psychoanalysis. Later, we’ll smooth and polish the surface of the prose, and most of these little infelicities will be ironed away, but it doesn’t hurt to look at them first with the eye of an analyst, or a critic, to see what they reveal. This isn’t an excuse to fall back on the same hackneyed words or phrases, and it doesn’t help a writer who thinks entirely in clichés. But it’s in our slips or mistakes, as Freud knew, that we unconsciously reveal ourselves. Mistakes need to be fixed and repetitions minimized, but it’s still useful to take a moment to ask what they really mean.
On the novelist’s couch
Recently, I’ve been thinking a lot about Freud. Psychoanalysis may be a dying science, or religion, with its place in our lives usurped by neurology and medication, but Freud’s influence on the way we talk about ourselves remains as strong as ever, not least because he was a marvelous writer. Harold Bloom aptly includes him in a line of great essayists stretching back to Montaigne, and he’s far and away the most readable and likable of all modern sages. His writings, especially his lectures and case notes, are fascinating, and they’re peppered with remarkable insights, metaphors, and tidbits of humor and practical advice. Bloom has argued convincingly for Freud as a close reader of Shakespeare, however much he might have resisted acknowledging it—he believed until the end of his days that Shakespeare’s plays had really been written by the Earl of Oxford, a conjecture known endearingly as the Looney hypothesis—and he’s as much a prose poet as he is an analytical thinker. Like most geniuses, he’s as interesting in his mistakes as in his successes, and even if you dismiss his core ideas as an ingeniously elaborated fantasy, there’s no denying that he constructed the central mythology of our century. When we talk about the libido, repression, anal retentiveness, the death instinct, we’re speaking in the terms that Freud established.
And I’ve long been struck by the parallels between psychoanalysis and what writers do for a living. Freud’s case studies read like novels, or more accurately like detective stories, with the analyst and the patient navigating through many wild guesses and wrong turns to reach the heart of the mystery. In her classic study Psychoanalysis: The Impossible Profession, Janet Malcolm writes:
In the Dora paper, Freud illustrates the double vision of the patient which the analyst must maintain in order to do his work: he must invent the patient as well as investigate him; he must invest him with the magic of myth and romance as well as reduce him to the pitiful bits and pieces of science and psychopathology. Only thus can the analyst sustain his obsessive interest in another—the fixation of a lover or a criminal investigator—and keep in sight the benign raison d’être of its relentlessness.
To “the fixation of a lover or a criminal investigator,” I might also add “of a writer.” The major figures in a novel can be as unknowable as the patient on the couch, and to sustain the obsession that finishing a book requires, a writer often has to start with an imperfect, idealized version of each character, then grope slowly back toward something more true. (Journalists, as Malcolm has pointed out elsewhere, sometimes find themselves doing the same thing.)
The hard part, for novelists and analysts alike, is balancing this kind of intense engagement with the objectivity required for good fiction or therapy. James Joyce writes that a novelist, “like the God of the creation, remains within or behind or beyond or above his handiwork, invisible, refined out of existence, indifferent, paring his fingernails,” and that’s as fine a description as any of the perfect psychoanalyst, who sits on a chair behind the patient’s couch, pointedly out of sight. It’s worth remembering that psychoanalysis, in its original form, has little in common with the more cuddly brands of therapy that have largely taken its place: the analyst is told to remain detached, impersonal, a blank slate on which the patient can project his or her emotions. At times, the formal nature of this relationship can resemble a kind of clinical cruelty, with earnest debates, for instance, over whether an analyst should express sympathy if a patient tells him that her mother has died. This may seem extreme, but it’s also a way of guarding against the greatest danger of analysis: that transference, in which the patient begins to use the analyst as an object of love or hate, can run the other way. Analysts do fall in love with their patients, as well as patients with their analysts, and the rigors of the psychoanalytic method are designed to anticipate, deflect, and use this.
It’s in the resulting dance between detachment and connection that psychoanalysis most resembles the creative arts. Authors, like analysts, are prone to develop strong feelings toward their characters, and it’s always problematic when a writer falls in love with the wrong person: witness the case of Thomas Harris and Hannibal Lecter—who, as a psychiatrist himself, could have warned his author of the risk he was taking. Here, authors can take a page from their psychoanalytic counterparts, who are encouraged to turn the same detached scrutiny on their own feelings, not for what it says about themselves, but about their patients. In psychoanalysis, everything, including the seemingly irrelevant thoughts and emotions that occur to the analyst during a session, is a clue, and Freud displays the same endless diligence in teasing out their underlying meaning as a good novelist does when dissecting his own feelings about the story he’s writing. Whether anyone is improved by either process is another question entirely, but psychoanalysis, like fiction, knows to be modest in its moral and personal claims. What Freud said of the patient may well be true of the author: “But you will see for yourself that much has been gained if we succeed in turning your hysterical misery into common unhappiness.”
Quote of the Day
Happiness comes only with the fulfillment of a childhood wish.