Articles: 1. personal investment in modernism and marxism; 5. college education before the sixties hit
prefacing... Before embarking, let me post a warning that this writing, in fact all my writing, is subject to change over time. It is one reason I have opted for web publication rather than the printed book (even if I had that choice!), since a few keystrokes will rearrange the symbols with little fuss. Any day and any moment, or a year from now, I may replace, eliminate, or add text, on rare occasions even reverse myself on ideas long held. It is a living text, and like the book it is a way of being alive made possible only through technological innovation. I have become unapologetic about such alterability; it is exactly how I like to write. Rather than assimilate to the usual concepts of how thinking must relate to writing, I would proclaim my way as closer to the nature of thinking. As it flows through the mind thought is instantly replaceable, even what later shows itself as clothed in the most formidable rhetoric and encased in hard copy. Thought and rhetoric are vulnerable; to work in a form that recognizes this does not weaken the work but strengthens it. A form adequate to present thinking promises a different kind of relation to the writer than does the finished result of thought, a different ground for trust than what we are used to. In fact this way is closer to face-to-face communication--orality, which is by definition precisely now--than to a text distanced by the lapse of a year or ten years, when it was set in stone and impervious to the elements. I think of myself as by habit culturally conservative, bookish and print-oriented, but when I write, it is this form and not the printed book that gets me to the daily work of writing. You might hesitate to read something when the writer is apparently not convinced enough to put his name to it for all time. But I am more completely convinced than if I were to freeze my thoughts and expression and have to stand on that melting block. I am sixty-seven; the block melts and freezes again. Of that I am absolutely sure and pledge never to recant. This is not a work in progress, the label for tentative presentation by artists and even scholars; no, this is it, the real, finished and re-finished thing. I would not say I correct, edit, or improve the text; I do not seek a conclusive, finally adequate text, nor do I discern behind it a narrative of evolution. Rather I enter it like a craftsman's workshop or artist's atelier, and spend my time fitting things together, taking them apart, painting and painting over and scraping through to earlier layers, using everything I find to fabricate. Much as I have been playing music the last thirty years, I improvise for the sake of improvising, for the activity, the pleasure of working. Rather than exit this room bearing a work with which I have found closure, polished and varnished and bearing my pride, I have constructed a window for the sake of passersby. That window is this webpage. What you see through it is the remains, the leftovers, grist for the next day or next year when I come back and grind my way through it again. As with my music, I'm just a beginner, but that's no excuse! For all the apparent self-indulgence, I am ever aware of possible eyes glancing in through the window. Your pleasure delights me! I try not to be distracted—that is part of my discipline and concentration--yet I would not do this work except for the fantasy of being touched by your fleeting look, if not your smile.
the following is from book 4, a journal written in the mid-nineties and revised in the present. [reading Malcolm Bradbury and James McFarlane, eds., Modernism, 1976] The kind of Marxism I appropriated in the sixties allowed me to dismiss the whole upsetting mishmash of modernist literature by proclaiming it bourgeois territory, destined for the dustbin, weak and womanish in its ambiguities and endless self-questioning, a manifestation of the sterility of paralyzed artists. I would surely never allow myself to become one of them. My attitude was arrogant and contemptuous; my sight was fixed on the inevitable future, when the bourgeois modernist would be a thing of the past. If the fate of all this was settled by History then my contempt was not my judgment, but merely a recognition of the way things really stood. The Marxist "class struggle" always intervened to save me, pulling me away from these worrisome pests with "you don't need to bother with them." Saved from the miasma of disconcerting fiction by the optimistic non-fiction that it would all be overcome by history, I could indulge and even immerse myself in literature. My belief that bourgeois literature lived on borrowed time in fact masked my identification with heroes and their fates, my love of language and the playfulness of fiction. On the one hand I embodied the contempt of hard-faced revolutionaries for the softness of dreamy classicism, for the self-pity of poets mourning the loss of cultural tradition, for the luxury of painful feelings, none of which can be stopped by History. All the while, I was the very one mourning such loss and trapped in those shameful feelings that were attached to the bourgeois and earlier classes, all of them doomed. I have slid far from the proud certainties of that nineteenth century faith in reason and the will to change the world, down to the shifting sands of modernist irony and skepticism, where every faith has experienced its under-mine. I am not the Marxist surveying the anticipated wreckage of others, not even the intellectual bearing a truth for others; rather I'm digging into no one's dirt and stone but my own. In Modernism [by which the editors Bradbury and McFarlane mean a broad category of European fiction and poetry, defined by common themes or style, c. 1890-1930] we experience the conjuncture of a protest against the banality of modern everyday life and the flattening out of traditions, alongside a celebration of the very rush of change that inaugurated that everyday. One word buys both the resistance to the modern and the championing of it. The diverse writers considered modernist were not members of anything so singular and self-conscious as a movement, so these two themes could coexist in the same writers, as if modernity could overcome modernity. This contradiction was not resolved by proclaiming the priesthood of the visionary artist, for this figure celebrated change and also protested the bourgeois. It was as if the bourgeois, the original middle class, were the victim and not a motivator of that upheaval called the modern, or as if change would ultimately overcome the bourgeoisie itself, as Marx predicted. Perhaps, as in Thomas Mann's Buddenbrooks, the artist would inherit the world—a fantasy that has had a long life. Or the writer might aestheticize the dullness and violence of the modern world and embrace intrusive technologies, as the Italian Futurists did.
With their indulgent love of speed and violence, the airplane and big booming guns of war, they would stand a bit apart from the French and German mainstream, yet they represent an important exaggeration. There later evolved a post-war drift towards an aesthetic silence of non-theistic mysticism--Beckett, Blanchot, Cage (earlier, Rilke?) and some of the abstract expressionist painters. Perhaps silence is consequent upon frustration with both political action and the various positive programs of aestheticism, but then it forgets how hinged to social order is any artistic expression. Silence forgets its origins in anger and defeat and becomes exile, grim and downcast, and emitting the superior wisdom of non-engagement. Modernism extends the battle with the world that opens with Rousseau, generalizing his personal complaints as the lament of the writer, who can never stay in his closet for long. It continues with Wordsworth and Goethe, whose own lifetimes were divided along the lines of the later post-revolutionary dilemma, whether to be at war or at peace with life and society. The aesthetic mystique of Blanchot and Beckett finds its way back through Walter Pater to Novalis and Schelling, a thread of lament-retreat-renunciation, suffering the wound of experiencing a crumbling world. Did Nietzsche really love “not knowing the future” or was he trying to convince himself, here as elsewhere? Not that I blame him. I would read this tradition into Cage, so embraced today by those who envy his apolitical dismissal of suffering but can’t imagine saying that themselves in our politicized world. To envision an historical continuity of a broad response to the post-traditional world, from at least Rousseau through Melville to today, opens this art as a present question. Irving Howe, in an essay on literary critic Lionel Trilling and written around the time of Modernism, spoke of modernists as “writers of the oblique, perverse, complex, problematic.” [Celebrations and Attacks, Thirty Years of Literary and Cultural Commentary, 1979, p.213.] I might add “self-questioning” and “alienated”. If this is so, then postmodern theory, which in the eighties drew a firm boundary establishing the end of modernism, would have something of the opposite character: straight, normal, simple, unproblematic, self-referential, and unalienated. Now that postmodernism (and its optimistic neo-liberal neighbor) has had its comeuppance, can we get back to the virtues of the oblique, perverse, and so on? A hard truth for me that should have been obvious long ago: desire for war and for revolution are not so far apart; my Bolshevik spirit encompassed both. That spirit I can still access and will not dismiss to the dustbin as I once did Modernism, nor will I embody or submit to it. However at one time I did choose for Lenin, for whom pacifism was a changeable tactic and war a means to serve greater ends, and I chose against those who became confused and reluctant when they saw where Bolshevism was headed. German Expressionists, more realistically called a "movement" than Modernism, could also be classed with such militancy even if they killed no one; the very images and sense of historical importance of such avant-garde movements were drawn from the military. Art was cultural violence, politics by other means, which for some today it still is. Pre-war (WWI) German Expressionist poets longed for war to break things apart in European society and worldview, an anger and utopianism that could move equally towards fascism or communism. 
A war initially engaged with optimism for social and cultural change, a new life for all! What else but that was my contempt for liberal peaceniks of the late sixties, a scorn I took to be an expression of revolutionary desire. I lifted my model for America roughly from the pages of Bolshevik history--civil war would follow on the heels of the Vietnam war, which the politicians and powers of the day seemed determined to win despite massive opposition. Like the Expressionists, I didn't want things to return to normalcy but to build upwards from the social unrest then mounting. I wasn't looking the other way; I was in that thick soup, a scholar deep into books who became inseparable from rage. My politics, for all its claims to be firmly rooted in the cool analysis of social conditions, fed off a closed fantasy that could not imagine retreat. My will to conform to the side I knew had to win, to share the spoils, was hardly a will capable of parting company with fellow fighters, if ever tested to stand aside and be counted among the enemy. Yet when realism caught up with me, when I saw in the mirror the stupid judgments of those who shared my rage, I could only back out and hope they never had a hand in shaping my or anyone's destiny. The appeal of Russia to Western Europeans was romantic, which means mythical, participatory, working as metaphor and analogy: we can see ourselves joined in the revolutionary parade. I speak as one of those who was drawn to the fantasy; Russia was in some sense "our country". The idealization can be distinguished from the desire for war/revolution, the love of struggle and attacking the "old world" that I speak of above. At least it qualified the energies of struggle, as the goal not to be forgotten in the midst of fighting. Its wider meaning was not the reformist one, which implies adding improvements to the world we already have, but the dream of thinking afresh, starting over. The faith was that we could share in freedom with the world's people and create a new collective life and way of living, so it engaged our own sense of oppression and not just that of others. It was expressive, poetic, even spiritual. Here is the opening title from Eisenstein's film of 1925, "The Battleship Potemkin", the story of the 1905 mutiny in the midst of a widespread revolt: "The spirit of revolution soared over the Russian land. A tremendous, mysterious process was taking place in countless hearts. The individual personality, having hardly had time to become conscious of itself, dissolved in the mass, and the mass itself became dissolved in the revolutionary élan." This movie has now been resurrected as an historical and aesthetic event, without the thought that would have come to earlier revolutionists—"these sailors are us!" It seems an established fact on the left today that sailors and other common people could not actually initiate a political stand that we would be able to join. I watch this film and cannot deny this possibility for myself: "Impotent rage was overflowing." These were loyal military men who had had enough—can we imagine that?—who risked their lives in this mutiny, with many later executed. Watching the scene where the officers line up the rebellious sailors to be executed, I am inspired to demand, as I did in my days as a revolutionist: "Which side are you on?" I confront the present as others do, including the modernists of the last two centuries--trying to find ourselves in the present, to let ourselves loose in our own here and now.
I find in many of those called modernist, as in myself, a consciousness of the juncture of created being (the product of a specific world) and the complicitous being that creates the world with others--the subject called into being by the world and the subject who calls the world into being. That we are both sides of the subject is confusing, head-spinning, yet can give depth to all one’s choices and behavior. In this context, "modern" means: all senses open to the world as it appears in the present moment, and acknowledgement of one’s unavoidable participation. As a young man I refused assent to the modern, perhaps imagining the mask of the Marxist but really fearing the modern as would a traditionalist and refusing to consume and be consumed by it. I did read many of the classic modernist texts, such as Kafka and Mann, but saw them as distanced from the values of modernity, trying to find a way to be human in this strange new world rather than affirming it. So it is now difficult to support the postmodern view that modernists were aligned with and even represent "modernity", over which History is now triumphing, partly through postmodern Theory. This seems close to the crude Marxist dismissal which I had so much difficulty overcoming and do not wish to return to. The difference with my earlier view seems to be that postmodernism celebrates the present culture that is in process of burying modernism, whereas my Marxist persona tended to be skeptical of all present culture. I remember the tottering vertigo I felt whenever I approached the awareness that I was living in the modern age, not understanding why it had to be there. Flood, avalanche, overwhelm, it evoked my painful subjectivity. For instance, I fell into Dostoevsky too easily, I thought, for my own good; it was dangerous for those like myself to get close to him, a pathological attraction. I felt oppressed by the banality of my own life if I couldn't keep some part of my being separate from the world around me. The ancient and pre-modern European past, on the other hand, was me yet not me--it had to be both to attract and sustain me. It allowed me to escape into my own alien dimensions and explore my inalienable roots hidden underground, my unconscious if you will, which I trusted to the imagination. Unlike modern writings which wrapped me in familiar identities and milieux, the ancient past was all on paper, safely confined to a world to which there was no return. For instance, traditionally revolution meant the cycle of history, patterns assimilable to wisdom, and not upheavals that slipped off the edge of meaning, as they seemed to do after the French Revolution. Once past that watershed (1789) I would have to converse with people whose fundamental conditions of instability and insecurity were my own; I would have to look them—and so myself--in the eye. Activist politics in the late sixties was the way out of the past, leaping over all that I had feared and right into the maelstrom. For the positivist goals the university promoted in the humanities and social sciences, the remote past was more appropriate than the present or anything contiguous to it, as a precautionary distance. In general, academic departments encouraged an appearance of isolation and distance between the object of study and the investigator’s political views or psyche, both of which were subjective and contingent and needed to be strictly disciplined. 
Such separation they referred to as scientific objectivity, and science was Truth, which even mainstream religion could not challenge. Any method that could insert statistics, a chart, or reference to a "sound methodology" in the humanities was an automatic plus, assimilating our work to the "exact" or "hard" sciences. Here was the crux of my love/hate with scholarship, which offered me escape from my wayward mind and at the same time a useful proof of my intelligence, even as I clamored at the wall it set against me as a subject. In particular, the symbiotic contract between detached study and the distant past was contained in the admonition, "we are too close to the present to come to conclusions about it," that is, too close for those of us condemned to inexactitude to reach the purity and finality expected of scientific knowledge. This was also preached in the middle-brow press, walling off the recent horrors of the Second World War, and the unsettling effects of our own returning fighters. It had to be absolutely clear who were the heroes and who the murderers—this division was fundamental to the cold war which succeeded the hot one with barely a breather in between. The present, which for us back then meant living through the aftermath, was still too hot with the fear of regression (meaning war with the commies) and the megalomania of the victorious. To coolly extricate the present as an object of knowledge from an environment of passionate commitment seemed impossible, an either/or situation. This contributed to a model that would have knowledge radically cut off from anything messy, ambivalent, and personally motivated by the will to know. Knowledge could only be obtained concerning what we, like the Enlightenment's Deist God, had not experienced in a way specific to the individual. The scholarly episteme of that era often hid a misty humanism (which I both shared and criticized), an envious identification with subjects who didn't have to experience modern confusion, yet a disidentification that escaped the limiting conditions of those in the distant past. The scholar transmuted those conditions into contemporary words, for their sake, as it were—they knew not what they did; we can know the past better than it could ever know itself. Not only was this patronizing, it avoided relevance and analogy to things that might have mattered considerably to the present. Scholarship built a safe haven of ignorance out of its bricks rather than an expanded imaginary for the present. It was profoundly pessimistic to deny the word "knowledge" to things up close, as if to reserve that word for some kind of attainable consensus of a small body of intellectual workers, such as the present could not provide. This was also an ethic protecting the scholar from scrutiny at a time when scrutiny for political commitment was at its height. Partisanship in the contemporary world was cast to a lower level of human worth, the level of what is contingent and unverifiable as a scientific conclusion. Partisans, after all, are people who can expect to get into trouble. I have put this in the past tense, voicing a criticism that could be found on the other side of the sixties watershed; it reflects a tension in me as one who was then on that scholarly road. After the turn of the seventies students and scholars in the humanities and social sciences increasingly had to answer to the charge of political or humanitarian relevance: what have you been doing that is helpful to the world?
This may have first appeared in all those prefaces that included reference to the “turmoil” of the present period, even spoke of “revolutionary change” that was in the air. To argue for the relevance of one’s work was a reaction against the scholarly episteme that avoided the present or claimed a neutrality that concealed political judgments about the past. The challenge became a new rule to live by, whereby those whose minds had wandered in paths of interest to only a handful of others would be required to defend their social worth and function. I got out of the profession [1972] before I had to stand before this new judge. My radical, activist politics had no place in the classroom; I even refused to state my opposition to the Vietnam War, instead promoting questions about what political thinking might require. My political, religious, and personal subjectivity—my psyche—was quite engaged in my study, little as I understood the dynamics of this at the time. But when I saw “relevance” appear on the academic horizon in the early seventies I found it laughable; I couldn’t imagine it was the future. For the university, seeking to attract and placate students, relevance meant to provide courses of personal interest and to eliminate most required courses; to radical faculty it meant to relate every topic to current political concerns, ultimately to politicize the classroom. On the contrary, I didn’t link my work to the world in any practical, functional way; nor was I one of those defending scholarly distance against the “present-minded” trend. My work and my political viewpoint were at the intersection of my subjectivity and the world, their relevance to each other was in question and could not be forced.
Reading in depth [non-fiction] has come to mean that my own consciousness and the book before me reverberate in a way I can at times feel almost physically. I have a hand in creating this effect by parachuting myself into the text, determined to find in it my own thought and being, and then to find how it could somehow be true. My job is to find the mode of translation, the key that makes it my truth. Occasionally the author interferes with this process by obstinately positing his or her own “private” conceptions, as they seem to me--hard and untranslatable. The author asserts the right to resist like a petulant child and breaks my concentration, blocking the pretence I have constructed. A moral figure often appears warning me of the peril of this self-centering. Like all indulgent pleasures, I will have to pay sooner or later; I await the sentence. This unpleasant shift of consciousness always catches me unawares and threatens to throw me back to the search-for-the-real-truth pattern of school and scholar, the truth that has nothing to do with me. This Scholar poses the reader as an autonomous self alongside the author's; he compares the opinions of both, however malleable and subject to influence might be those of the reader, and however one-sided the flow towards the more substantial opinions of the author, fortified by immovable typescript and the publisher’s and editor’s authority. The reader-as-Scholar asks to be persuaded and decides if he has been; submits here and criticizes there, assesses the strength and weakness of a text and makes a judgment. Since I have no quarrel with this figure, the shift over to him brings me to accuse my peculiar approach as the fault of solipsism and thus of immateriality, the lack of a sticking place, of reference outside myself and relevance to others. Let me generalize, since I am surely not the only one unable consistently to come up to the scholarly ideal. Ultimately the question is, what do we retain of our own identity as solid and stable human beings on the perhaps rare occasion that we plunge into another's world and dissolve ourselves in it? Do we re-emerge in the same place we sank or dived in, or is that place impossible to re-locate? If all vocations and specific responsibilities to the world, such as scholarly rationality, can dissolve—not that they should--then the act and very place of re-emergence reveal themselves as some version of fiction, that is, alterable, a work that engages the imagination. Perhaps instead of coming up for the air we think our human identities require, we go down and become fish of the deep. Or we might shift back and forth from surface to depths and allow the turbulence of our movements. This has to do with loyalties, commitments that might last an adult lifetime. Just as each musician has a story of how he or she at one time or period found the courage and inspiration to plunge into a life of music, so every scholar—tell me if I’m wrong!--has a story of immersion, epiphany, sudden light that dawned, and often can name a particular book, author or teacher to whom they give a respect that cannot be completely defined as scholarly. No musician is merely working to perfect technique or to create the best music; no scholar fully embodies the Scholar. We are both more and less than that. At one period of my studies [1990] I self-consciously dropped myself into a broad range of texts and found myself lifting out ideas, molding and pasting them onto the subject most on my mind--free improvisation. 
I found analogy and relevance everywhere, all grist for the mill that churned out proof of the validity of my singular musical form. It awakened my intellect from its slumber to see improvisation reflected in such diverse reading as physics and mythology, art and political theory, as if those writers who knew nothing of improv were voluntarily coming to my doorstep. I had at that time stepped back from the world of organizing and touring fulltime and could see as my next job to turn my research into a book. However, though it felt creative and “right”, I became suspicious of this project and stopped in my tracks. The very pleasure and ease of my endeavor led to frustration and boredom when I thought of it as a book. First of all, I would be defending my form of music as a lawyer defends a client, wronged and wounded by rejection from the world [in the eighties free improvisation was barely considered music at all]. The client in essence was myself; I was party to the dispute. Following the conclusion of the book project I would be the tract-seller pushing his wares, serving The Cause, as I had done in my politically active days. To be defender of the faith was a banal task for which I was well qualified and could even be expected by others to accomplish. Like “having” a career, it was too much like a job. While I enjoyed the imagination gone wild and self-indulgent, I could only imagine the result as a thesis book, which, though I read such regularly, does not interest me to write. It would lack respect for the intent of the writers I was channeling towards improvisation, and would narrow my intellect to a single target. Most importantly, it would present only the positive view of improvisation, when in fact I felt that criticism of improvisation as situated in our culture was equally important. Once I knew the book's necessary conclusion I lost motivation; if it’s a matter of filling in details let George do it—much like my attitude earlier towards the Ph.D. thesis. I squirmed out of the hero’s suit of armor with my name on it, entered the naked, anarchic and pointless expressiveness of painting, and a few years later the writing with which this current text is continuous. Unfortunately, though, it left an aftertaste of self-created failure; here once again was the blocked theorist and advocate with “important things to say”. In the study that followed I took my cue from the epiphanal experience I had of certain Beckett plays [1991]. My relation to the text was then established by the presumption that I already had a place there as co-sufferer with the writer, who had left these traces for me. (By “sufferer” I imply the broad sense of Greek pathos, to have things happen, which is to experience life.) I wasn't judging good or bad writing or even the relative value of competing ideas, I just connected to the writer. Free of profession, including that of author and advocate of worthy ideas, I was free to be affected, pushed around, torn, shorn. I could hold the fiction that myself and writer were in each other's arms, often speechless, and I could stay in that realm as long as it seemed fruitful. It led on and on, to finding all aspects of my experience reflected in study, not just the narrowly musical, and to discovering voices in myself I can imagine trusting and distrusting. In place of the singular, unitary being I often strive to be, I become multiple. 
Granted, Beckett and others I read this way are fiction, and more conventionally appropriate for projection than non-fiction, which is normally written to supply a combination of information, concepts and recommendations intended to convince the reader that they are literally true or at least functional. But I approached this similarly to fiction, asking, what kind of thinking, habits and motivations are at play, for instance in French theory, and why am I attracted to this rather than that? Again I was fitting myself into the shoes of the other, whether they seemed to fit or not. Perhaps after a few weeks I could break them in—my feet and the shoes together. Surprisingly, reading under this sign as I still often do, I find myself paying more attention to my reflections about the actual text and the stance of the writer, to the extent of thinking "maybe I'm right in my judgment" rather than submitting. That is, judgments that are mine belong to my own mind-process and are points of a flux passing before me. If I'm not hired for the job of capturing, arranging, and defending truth statements then I am not obscured by the doubts that trail behind self-assertion and defensiveness. When my mind changes I want it to be because I've truly seen something I hadn't allowed before, and not out of a knee-jerk of doubt; the latter corrodes even the latest change in my understanding without my even noticing what is happening. "Pay attention" (and the rest of my discipline) makes me witness my detachment and re-attachment to my thoughts enough to allow the suggestion to creep in that, after all, there might be some validity to this or that reaction I have to what's in front of me. If thoughts are also reactive feelings then they flex, mine as well as the author's; even behind the ostensibly hard walls of a philosophical system is a mouth that speaks and so also an ear. Arm in arm we talk, whisper, cajole; my reactions are questions of just what I want and need to explore, not tests of my moral or intellectual worth and worthiness in comparison to the writer. We must and will see apparently the same things differently, but we are not out to defeat the other. Maybe we're not even out to convince each other, which is at the core of conventional non-fictional discourse. This is the work of mutuality and communality, lopping off limbs of competitive acquisition and defensiveness as an economy measure so that we can be clear and precise about what it is we choose to offer. I present an idea to discover its flaws, its incompleteness. I ask, why and at what point do I feel the need to defend this? Here I am operating as "thinking being", being which is thinking, and not as the one who represents and must defend his thought. The question of non-fiction has to do with the realness of thought--can it be real in its own way and not by osmosis, or bolted on to the kind of reality of our material perceptions? The range of conflicting ideas, as I see it, is at least comparable to the range of imagination and artistic expression. We can define and refine our particular set of ideas all we want; the trouble comes in making them stick, in convincing ourselves and others of their adequacy to reality. My Marxian materialism was an attempt to give the same materiality and necessity to my thinking as I would to bread, a knife, a revolution: solid, implacable, verified.
As directly referencing the mode of production, for instance, thought would be a form of matter and given the name of either ideology or consciousness, a (real) force in the world, which my own troubling thoughts did not have. Words and concepts name things. “Sticks and stones”, the chant telling us to ignore name-calling, has its place precisely because names and the ideas behind them do hurt us—and in reply can vindicate us. Since they don’t break the visible skin we think them less real; we are then shamed by the hurt and have difficulty understanding how they could. People define and defend the word "real" through this psychology, and scorn the ideal as fanciful, which falls to the metaphorically real knife. To assume the reality of thought in this way, bolted on to the eternal universe, rather than to know we create it each time, with amazing continuity and repetitiveness, removes it from us. In fact as soon as thought is questioned and the continuity broken it annoyingly proves its artificiality.
Modernist experimentalism is somewhat mislabeled, perhaps by its own practitioners; in fact it was one of the labels actually chosen by the artists, as opposed to "impressionism", "fauvism", and "cubism", which were critics' put-downs. The label was first Zola's; what he meant by experiment was basically naturalism, a literature that would tell us what society was really like, and not the self-indulgent fantasy of romantics. Fantasy was the Voltairean "infamous thing" of the late nineteenth century. Judgment, the bourgeois of that era hoped, would be based on fact and not subjective intuition, and facts are determined by observation organized as experiment. Zola already knew the result of his life's work of experimental knowledge: an exposé of the darkest social aspects of industrialism, a vindication of the worker. Ultimately political progress would counter the oppression of industrial progress, so light would follow the dark. Zola has been called the first modern intellectual, combining artist, scientific realist, theorist, and political writer. This figure hides a swarm of contradictions, and contradiction necessitates a ton of work, not to mention a tradition and imitators. His work, for instance, still consists of stories of fictional characters one could identify with, and his novels were not tentative and awaiting an outcome, as with a scientific experiment. Zola put in a claim to a piece of that pie of certain knowledge in an age when legitimation was shifting from religion and humanism to the side of science and materiality.
He thought he could do for literature what Claude Bernard had done to medicine, and like so many artists (and in our day theorists as well) who think their work an extension of science he misunderstood Bernard’s work, twisting it to his imagination. The label of “experimental” that some critics, publicists and artists have appended in the last sixty years to certain literature, dance and music still implies something of Zola’s perspective. Partly to survive in a hostile environment, this art apes science, at least a popular image of it. It is always a positive affirmation, one that avoids recalling the past, as does “avant-garde”, and is immune from attack, for it does not claim to represent anything beyond the moment of its presumed testing. Experimental is nothing so bold as a movement, which would involve commitment and community. I can presumably be interested in a disinterested way rather than dug in to defend something defined as vital to me. As an artist, anxiously hawking my wares in order to survive, I must appear to be merely letting the facts speak for themselves. As in science, the art experiment awaits the moment of decision when it can be legitimately valued, and so it is preliminary, tentative and even ephemeral. In general, artwork today is presented as process rather than achievement. This assimilates art to the relativism that is generic to postmodern culture, whose ideology rejects finality as “essentialist Truth”, while its practice enshrines certain artworks in the permanent gilded frame of achieved and defining Art. In experimental art, which is very much on the ideological side of this contradiction, performance is often framed as a “work in progress”, which in fact rarely becomes “the work” or oeuvre, the piece that would indicate a master. Each artwork is one of a series, the expression of a style lying somewhere behind it, not presented to be judged individually as final. If I as actor-creator-producer am only experimenting then I am detached from it, and the audience-spectator-consumer is under no obligation to get engaged beyond a cool distance. The label indicates modesty, belied by the grandiose transcendental subject of the scientist, which promises impersonal knowledge divorced from the creator. This masks the often intense subjectivity of the creator and hope for acceptance, which will verify the quite immodest but unspoken: “here is where you will find true art”. While as an experimenter I am not responsible or accountable, yet I want to take the credit if it “works”, that is, if it is accepted or if I can envision its success. The test of this is whether artwork can be continually rejected and still be thought to “work”. I think not, which means the market called the audience is the testing ground rather than standards internal to the artist. "Experimental" has ceased to be an adjective modifying the artwork and has become the generic form of high art, in continuity with the historical avant-garde. It is a promotional label, more effective in the marketplace than any other—in fact I myself use it occasionally to advertise my music, even though, like other improvisers, I do not actually play with experimental tentativeness. Part of the reason for the success of the label is the bent of our entire culture towards the idealized version of scientific truth and progress, with its proof in technological change. Art must find a way to share that spotlight. It goes deeper than this, however. 
Progress is cultural self-congratulation, even a vindication for what the high-culture "we" have not actually suffered; we have triumphed in some collective fashion. As our culture of ironic twisting would have it, what is imagined to be held lowest, most despised, outcast and inaccessible is actually (surprise!) the highest and most worthy. Jesus' "And the last shall be first" is finally achieved in the secular and sophisticated art world! Outsider Art, like outsider politics of the left and right, is in fact not the outside at all; it is hegemonic, or at least the spokespeople are. But the "we" of contemporary high culture, rescuing the lowest and despised, needs a "them", an other which is both the unenlightened taste of the past and the philistine artistic reactionary of today. This story recalls the origins of modern science, which was a triumph of what was considered by the philistines of that era—the Church and its scholastic metaphysics—to be beneath the dignity of man. Experimental science mucked around in mere matter, compared to the spirituality which claimed to be focused on higher things. Similarly, capital was once snubbed as filthy lucre, not worthy of an honest man's attentions. This tradition of ironic reversal ultimately became the story of Progress and put the West in the world's driver's seat. In the public mind art fell in nicely with the story of "yes we can" that vindicated the rejected, even if the artists themselves paid little heed to it. They experimented, history proved them right, we the public can identify with that story, and speculators who buy their work certainly hope the story continues to be believed! Experiment is a useful and occasional tool for art and always has been. The question remains whether it should be more than that, whether the persuasive story and metaphor for art should be literalized.
Can the pragmatic approach be the very model for creating art--to try this and that until it “works”, to proceed from problem to solution, to adopt useful “strategies”? Can the artist be the practicing scientist or technologist, tinkering with sound, promoting theories and solutions that can be disproved by tomorrow’s research? Apparently this is the approach of many labeled “avant-garde” and progressive, and what many are trained to do. As I said above, there is resonance for the science metaphor in the art and music world because it resonates in the entire culture. However, I think many are hampered by this and would like to think differently about their way of working, but would lose their credentials if they did not present themselves in the white coat of the experimenter. At least since Zola's time, idealized Science has been engaged in a pas de deux with Art, that is, Culture in the narrow sense and identified with the humanities. Science, as figured in the public imagination, has been setting the pace for this cultural dance, if not for the artists themselves. For the artist molded to this figure I suggest it is a self-deceiving, sly masquerade, the adoption of a persona the practicing scientist would avoid. Isn’t a performance, a book, a painting, an installation, a dance labeled a work in progress, actually presented and received as a finished thing, without apology? In its heyday twenty years ago postmodernist theory feared becoming identified with any conclusions other than the ironic, an irony deprived of the hesitations of modernist ambivalence. As such it was forcefully conclusive, ideological, militant in fact, could not keep an ironic smirk on its own face. As if working from under ground, it expected to undermine the received ideas of an establishment, seen as a tree to be toppled by chewing away at the roots. It needed, and for a time had, such an “old guard” Other it could count on for resistance. Postmodernists felt science bolstered their case, much as Zola did a century before. But when confronted with postmodernism and its American spawn of cultural studies, practicing scientists demurred (the Sokal Hoax of 1996): you are not talking science but indulging fantasy. Experimental art bought into the postmodern project and still clings to it as a strategy of radical challenge to the culture, and the culture’s marketplace, called “the mainstream”. For instance, it presents itself as escaping genre as the cul de sac of a tradition and a consumer group, yet in order to market itself it inescapably becomes a genre which then must lie to itself about its capacity for self-transcendence. Here is the fundamental contradiction, that one cannot be humble about one’s achievements, as artists usually are among themselves, and promotional at the same time. “Experimental” has that touch of appropriate self-sacrifice, but by merely proclaiming itself it hardly risks a thing. Risk only appears when you allow consequences of failure to stick to you, when you can honestly get hurt. Unlike the scientific experiment, which can fail, be forgotten and even ridiculed, the artistic experiment has no way of judging itself as failed—all is process, which never fails, since it is continuous with life itself. But from outside the charmed circle it is quite easy to find ridicule. Behind the image of the scientist is the psychology of the rational ego, which takes the form of both theory and practice, as applied to the artwork and the subjectivity of the artist. 
Art is then a barrier against the dream; "moves" must be depicted in functional terms. Dream-like surrealism, for instance, cannot be accessed apart from a verbal interpretation. As for practice, when artists and critics present artwork in the frame of problem-solving they invoke the down-to-earth approach that gets things done, especially trusted in American culture. One cannot do anything without a flexible strategy, and the genius for art is seen to be located in those who discover problems others have overlooked and devise ingenious strategies for resolving them. This is also the language of Theory in the academic world of cultural studies, which has had such strong partisans in the artworld. Theory was even foreshadowed in Conceptual Art of the early seventies, which replaced painting and sculpture with theory and artist statement. When a theory is wrong the artwork will be wrong: this is the frame of rationality. Artwork today is validated as high or advanced if it is puzzling, and the key to the puzzle is the artist statement which elucidates it in two directions—the subjectivity of the artist, presented as unique but able to be categorized and understood by an empathetic public, and the theory that the artist means to illustrate. "Without a program you won't know the players," hawked at baseball games, holds true of artist statements today pinned onto the artwork and justifying it: art reduced to rational explanation. Art is then the juncture of theory and practice, which is meant to escape the affective consideration of beauty or spectator reaction. The spectator who simply observes and comments on the bare artwork is bound to be uninformed, reacting subjectively and inappropriately, the current philistine. Related to this is also the language of action, as in the critic Harold Rosenberg's depiction of the abstract expressionist as an "action painter", and Sartre's favoring of the "engaged" writer. These appeared shortly after WWII, when trust of the fighter was in danger, and needed to be asserted against the fear of slipping back into the soft indulgence of civilian life. For the public to see the artist "in action" is to go into the studio itself, satisfying the "how" question that would literalize art and remove the fantasy. Hans Namuth's film of Jackson Pollock in the act of painting was the precursor of the how question as it sought ascendancy over the phenomenological "what" of the artwork. By the "what" I mean not the literal description but the spectator left alone to be penetrated, seized, enlightened, swept away by what is right there in front of him or her. Lots of artists' limbs have been chopped off to fit the Procrustean beds of critically correct theory. This is even true of the theory today (especially promoted by Arthur Danto) of an overarching pluralism and eclecticism, as if to say "there are no movements confining the artist, no Hegelian Progress of consciousness; what we have now is just artworks". That view would have us overlook any consciousness present in artworks, not just that presented as experimental. I am getting deeper into this subject than I intended and feel the need to go back into it extensively with certain questions. For instance, the public function of Art requires artists to submit to interviews, which usually means to journalists, critics or gung-ho promoters who must frame art in the most commonplace themes. Why have artists submitted to these paparazzi pounding at their door, who assume they are performing a service for art? Why do they so commonly define what they do in the most understandable terms of our culture, or at least of the cultural left? Is there no conflict within "artist subjectivity", a resistance by artists themselves to having their subjectivity depicted either as a politically left perspective, or as unique and resistant to an oppressive conformity (bucking the trend), or as too crazy to imagine that we could ever access it ourselves? And finally, there is a widespread belief that with all the attention given to training artists we as a culture (American) are finally on the road to overcoming the philistinism of the past and truly valuing creative activity. Can there be no criticism that would negate our culture's self-congratulation? Is there nothing outside Outside Art, no underground that does not aspire to come into the light of approval?
First I ask myself, what is it that I and my partners have been doing the past thirty years, playing a music that has gone under the name of "free improvisation"? This music has been around since the early sixties, now the playing preference of perhaps a few hundred musicians around the world, and known to a commercially insignificant number of listeners. More than any other musical form, this is played within an egalitarian community, despite the ebbs and flows of individual careers. Anyone can play, including those completely untrained, and those who would not even describe themselves as musicians. What interests me here, however, is not to describe its actual features so much as to explore its internal principles, its potential that does not await realization, and its political horizon and implications. Let me try to correct the terminology. The Greek nomos, usually translated as custom or law, also means tune or melody. That means that when we speak of free improvisation we might rather call it a-nomos, anomalous—out of tune or out of melody—which is correct; it is a music without melody. Alternatively it might be called autopoietic music, self-creating, from a concept of Francisco Varela: a machine (as in Deleuze/Guattari) that is "self-contained and cannot be described by using dimensions that define another space". This is a more arrogant version of "anomalous (tune-less) music" but at least sets a good tone for questioning: is a self-contained space possible at all? And what is the perspective for talking about this as opposed to other musics? That its practitioners are also at a loss to answer this would make the discussion fruitful for all and not just a Q and A. Either of these appellations is better than "free" improvisation, the most common name it has gone by over the years. "Free" implies chaos, noise, an unassimilable Real or a mythic totality, a metaphor for what is impossible to conceive as epistemologically "real" and beyond concrete realization. This would be fine, an image of a dream or mystery, but "free music" is then placed alongside categories that are not totalities but discrete and distinguishable objects, which leads to the impression that other musics could not be "free" or could be only partially so. To stamp our music with this word is tendentious, an unfair privileging, a kind of advertising for one's own product that lords it over the others. It is disingenuous to disregard the categorical differences of the adjectives modifying music. That it is unfamiliar to all but a handful of the population, at least in the form of sound improvisation (such as electro-acoustic improvisation, or eai), is no excuse for hyperbole. In fact, to call it "free" has the counter-effect of disqualifying it from consideration, by placing it on a non-comparable level, above the others, which merely point out their particular form as an option to choose. Calling it autopoietic or anomalous gives us the chance to enter a conversation rather than let it fall into a misconception, or make a claim that is obviously ideological. "It is what I call anomalous music." "What do you mean by that?" Certainly anomalous does not, like jazz, call to mind specific, identifiable traits. But to abjure melody is truly, if only partially, descriptive—one of many traits known to music that can be ticked off as missing, and a firm rule of this music.
“Without melody” means that it is not cyclic; it can be melodic but not have a singular, repeatable melody as a component. “Anomalous” as “lacking melody” has the advantage of including the more commonly known free jazz, which is often mistaken for the broader category, which includes improvisation which also lacks rhythm, consistent tempo, “drive”, or rules concerning instrumentation. In fact, this is perhaps the only kind of music that cannot be constructed in imagination from a written text, but must be heard live to begin to understand what is being discussed. By denying nomos as melody this music does not deny nomos as rule, to which musicians and non-playing listeners must conform. All music in some way follows a rule or constellation of rules just in order to be recognized as music even by its practitioners. The nomos of music is not completely relative to culture and history. A Balinese waving a mallet in front of a gong but making no sound is not making music; there might be long and elaborate silences, but they must relate to sound audible to humans in some way. Sound is the least common denominator, the sine qua non. In our day however the converse has also become a rule: when we or anyone makes any sound at all we cannot not make music, no matter what our intention or protest of lacking musical ability. This is because the rules of music now have invaded the listening as well as the making of sound. The listener attending to the rustle of trees or the traffic or the coughing of an audience can hear it as music. This much is a diachronic development, an historical rule of modernity that extends through today, namely, the ever-widening expansion of listening, of the ability to hear music in all sound, the aestheticizing of sound. Music history tells us of one scandal after another, and eventual listener acceptance until the next barrier is reached and breached. The barrier separating music from noise is obviously a rule meant to be broken—and rebuilt somewhere else. As soon as someone says surely that is not music, someone else will pop up and say, I can hear it as music. Similarly, you may think you are making a rational argument in prose; I might hear it as poetry. The argument might be recognized, but an alternative hearing is possible alongside it, just as one may include traditional music and a dripping faucet in the day’s aesthetic enjoyment. The abandonment of melody is just one step in this direction away from the effort to foreground and isolate a signal out of a sea of noise, and a step towards hearing noise itself, what was formerly foreclosed as expendable interference, as the signal. Or rather while appearing to eliminate the signal/noise dichotomy, this music re-establishes it elsewhere. Instruments can be played but they can also be simply used as tools in what are called “extended techniques”; all material is simply the means of making sound. The most radical elimination of all is perhaps the composer; even the player is not a composer but the one who makes sound at that very moment, without plan or forethought. Indeed it is misleading even to call it “improvisation”, since that implies a theme or melody as its base. What is or is not music is in question, and so there can be no complaint that one’s partners have done something wrong. There are no mistakes, but at least one absolute rule: the players must strictly follow what they consider makes musical sense, quite possibly on a level they have not previously imagined. 
There is no rehearsal possible; whatever the musical judgment, it is fully there each time it is played. Performance is by no means necessary; in fact the frame of performance, and all that is necessary for musicians to fit their playing into that form, can get in the way. This music has only artificial endings; it does not truly conclude. In fact one could say the ultimate rule is to keep the playing going, interesting and seductive enough that no one thinks of stopping. Besides the rules common to all music--the inevitably acquired skills and habits of the players, and the ability to hear sound as music--there is also the inescapable rule that determines that there can and must be rules in the first place. This contradicts the claims of conceptual art, anti-art and noise music, which want to deny that they are making art or music. There is no escape from being and playing within this transcendent rule, which opens the door to categorization--genre and species and all the marketing and hierarchy of players that follow. As musicians we want a relation with the world and do not want to be taken literally as outlaws, heroically disobeying “imposed” genre categories—that can only be a promotional posture. We do not want to play from a sovereign position outside of music or even, in the community of players integrated through sound, to think of ourselves as commanding our playing. In this music we surrender to others. Anomalous music is the limit, the horizon of musical possibility, and touches or imagines the limit of social possibility as well. In fact it is arguable that, as a simple communal experience of following what is going on, like a collective conversation, it only barely fits the category of “experimental art”, which is today’s replacement for the historical avant-garde and the music world to which this music has been assigned. It does not have a musical world of its own. That would require a musical object, and there is no object; whatever status and judgment of the art object might be possible is bolted on and can be unbolted and discarded—that in fact is the cyclic story of this music, were one to seek out the details.
The micro-system of the human endeavor called music is paralleled in politics; more than that, the two increasingly mirror each other in the (post)modern world. Both are subsumed under an ontological order in which there can be things and people not easily comprehended or categorized, but no unambivalent outside. Like the ancient “Greek” and “barbarian”, the modern “inside” and “outside” function together, dividing being between them, only today’s border is like the geographical river, thought to separate but actually joining the two. This modern order includes the urge common to art and politics to frame and represent an “outside” as that which it is not. For left culture this self- and other-perception of an inside and an outside assumes politics and art as the mediator, even the intercessor, between them. Music for instance is divided this way, some of it considered “out” while the “inside” is called mainstream or conventional, yet as with its counterpart, the signal/noise dichotomy, this distinction is fragile and tends to break down. For the art of left culture the arrow tends towards noise, just as for the politics of left culture it tends towards the outside, where the attention is directed. We don’t have real art, we don’t have real political value; they do. This dynamic has increasingly become the driving force making Art a component of the broad culture and a common ground for those who perceive themselves as being on the left. Art is the model for “creativity”, with negative liberty (freedom from rules) as its ideology: to be an artist you must first of all lose your chains. Art is freedom incarnate, and “free” democratically opens the doors to all. The historical tendency is for those who think of themselves as included (those who expect their taste or value to be taken into account, who are “the world”) to be symbolically and symbiotically allied with the excluded (the depoliticized and disinherited). If asked, the included may well think of themselves as alienated, but culturally they respond as if the world beyond their family, neighborhood and coworkers is their world. Even in rebellion they bear the tradition of the wider culture and are embarrassed by their inside position, which they think of as privilege, in the face of those on the other side of the wall. They see themselves politically divided by a stone fortification from rebellious slaves and peasants, laborers, recent immigrants, the poor of the world. Culturally their outside is the bohemian, the suffering genius, the mad, obsessive-compulsive artist. In the past few decades this figure has been formulated as the Outsider artist, untouched by the corruptions of the market. On the one hand the suffering multitude, on the other the suffering artist. Since in their imaginary they are divided from suffering, those self-perceived as privileged must ignore, mis-recognize and even deny their own actual suffering. At the same time, however, the wall functions to keep those inside from getting out, which they often find tormenting to acknowledge. Here art and politics diverge. Towards the outside the political inside is guilty for its privilege of not having to suffer, and is obliged to pay politically, whereas the artistic inside, the spectator especially, thinks he is missing something that is outside and envies it. The outside others are what the privileged cannot be or reach, the gap of desire. 
Here then is the paradoxical desire of both politics and art, stronger now in its postmodern development: to include the outside, which is the political project, and to be or at least to validate the outside, which is the envy at the core of the spectator of art. These are two aspects of the same imaginary, both of which involve a vision of progress and irreversibility but are ultimately neither technological nor scientific. The first would be considered the thrust of liberal democracy since the 19th century (Mill and the gradual expansion of the electorate—and now doesn’t the whole world vote in the US presidential election?). The second aspect has gone under the name of avant-garde, at least since the late 19th century, and continues today as a mass phenomenon (since Warhol we’re all artists) rather than encased in the modernist “movement”. The desire which has increased in strength with globalization is to be inside at the same time as outside, at least in imagination, that is, to be the power on that threshold, the generous gatekeeper. Art enables a soft or weak politics, one which does not wish to have Power or be confused with it, and so does not have to be contradictory. Will to power is seen as the masculinist Old Left, a Grand Theory idea, a politics for which art could only be a kind of advertising. We don’t make, we create. The subject of such politics and art is not the self and not the collective in which one finds oneself but, more generously, the other, which then gets framed, colonized, and its lack of privilege reversed. This is the kiss of death, but it is impossible to accept that one has killed what one wanted to keep pure. The presumed inside, in this ontology, can only think of itself as anti-colonial, anti-racist, and necessary for the world in order to correct its ways. To hold such opinions of oneself is considered an act, in itself virtuous, and need not be duplicated by anything more overt. From this point of view, as soon as one interpellates and recognizes the outside, which means to distinguish and value it, the latter is at least in motion toward the inside. Presumably it will, if there is a critical mass of like opinion, eventually cross the threshold. This is as far as
I have gotten as of this writing. I have removed many loose ends (such
as whether left art and politics compensate for each other, which I suspect they do),
and if I look through this one more time I will find much more that
needs re-examining, elaborating, or deleting. It is not mine to do
alone, but calls for our joint effort, at best in face-to-face conversations.
For communications leading to discussion you may email me at jackwri444@aol.com
5. college education before the sixties hit
In establishing the syllabus and curriculum our liberal, secular Christian administrators wanted us to begin our studies asking what it meant to be a man, one who must make choices in a world hard to comprehend. Their attention was directed more to our consciousness than to the shape of the world, hence the humanistic and literary rather than critical bent of our education. We got a moral education, intended to form character through habits of reflection, with study expected to bring meaning to our lives along with qualifications for a profession. We were encouraged to work hard for the sake of self-discipline, such that high grades were rare compared to Ivy League schools, a matter of pride in what we called the “poison ivy league”. The college didn’t promote the moralistic view that the world was wrong, filled with evils (poverty, racism, etc.) we were to combat, as found in many small liberal arts colleges today, but rather suggested that we and the world were both fallible and open to question. I've learned since that this embodied what Lionel Trilling in 1965 called "the whole-man theory" of education. The study of literature was "to have a unique effectiveness in opening the mind and illuminating it, in purging the mind of prejudices, and received ideas, in making the mind free and active." This creed, passed down from Matthew Arnold a century before, sums up what we received. (from Beyond Culture: Essays on Literature and Learning, 1968, p. 212) We were kids, told repeatedly at orientation to think of ourselves as “men”, with the college itself in loco parentis, responsible for shocking us out of childhood naiveté into adulthood. In that affluent age a highly ranked college like this was affordable to the middle class. Almost all of us were fresh out of suburban high schools and lived on campus; very few of us were working or had ever worked more than summer jobs. Besides our homes, the only world we knew was this monkish, exclusive community bursting with male intensity. Here was a bourgeois education for middle-class professionals, preparing us not for fancy, get-rich-quick jobs (the Mad Men of that era) or for moving and shaking the world, but for careers with a lifetime of responsibility to internalized standards. Despite my closet Marxism, I loved every minute of it. All of us--engineers, pre-med, and humanities students like myself--were required to read and discuss books that defined and questioned Western civilization, although in a way that would be considered apolitical today. In freshman English we read the work of troubled spirits from sixty years earlier, the long autobiography of a man sometimes called an anti-modernist, The Education of Henry Adams, and Joseph Conrad's Heart of Darkness, with its play between Kurtz and the narrator, whom we could read ourselves into. Both fin de siècle authors were suspicious of underlying currents beneath an apparently secure Western civilization. Stephen Crane’s The Red Badge of Courage (to us the “red barge of garbage”) was an obvious moral teaching for all of us, since we were enlisted in ROTC and expecting to be drafted sooner or later; the reluctant hero learns to fight. By a huge leap we came to the adolescent doubter of Salinger’s The Catcher in the Rye, almost our contemporary, and close to our subjectivity. We also read James B. Conant’s 1952 lectures, Modern Science and Modern Man, which defended science just at the point when it was beginning to lose its idealistic aura to the Cold War politics of defense spending. 
It also presented us with the problematic status of Newtonian science since Einstein; in fact Conant, besides being an ardent Hitler supporter twenty years earlier, was a mentor of one of the heroes of postmodernism, Thomas S. Kuhn, who introduced relativism to scientific theory. All in all, we had a good mix of the world as it presented itself—solid reality, aimed at progress and goodness—and some of the ways it could be criticized. Available to us were the exact sciences, which included behaviorist psychology and economics; the classical humanities, history, and languages; and one course in non-Western history and culture; but none of the social sciences, such as sociology and anthropology, which would have challenged our conceptions of how the world actually operates. And, most striking in contrast to today, there was no department of art or music, since these were not considered serious professional choices. From the classical and pre-Modernist literary milieu I, and perhaps others, took on a languorous superiority and detachment from the present, which seemed banal by comparison. With Europe then just emerging from ruin I could identify with a nostalgic view of the pre-WWI Old Europe that represented the best of human art and thought, framed by the coming doom. Later I absorbed Matthew Arnold's anxieties about materialism and his "sweetness and light" of the classics, and some of the later aestheticism. I imitated the career of the nineteenth-century scholar, whom I pictured as the scavenger and beachcomber of once-living artifacts. A hands-clean optimism plus the organizing and enterprising ability of the American reinforced me, like Woodrow Wilson saving Europe from itself. By locating my mind in the late Victorian humanities I could make facile judgments on contemporary issues, while avoiding descent into the maelstrom of present life, as my age-peers elsewhere were doing (for instance at the University of Michigan, which engendered the radical Students for a Democratic Society). There was a lack of concrete reality in my enterprise, a romantic and precious fakeness to my pretended manhood. I saw myself as one of the East Coast elite-in-training, cultivating a taste for a world that had fallen apart, and not wanting to engage what had taken its place since. (I went on to graduate school in classical and pre-modern European history and only crashed into the realities of life and the world during the tumult of the late sixties.) We, if I can generalize to a small segment of the school and perhaps of that generation of intellectuals, could haughtily dismiss everyone in the past as blind participants, yet we couldn't stretch our imagination to think that we might have a culturally determined and self-serving stance of our own. Such an awareness would have been confusing, since it would seem to involve oneself in an ever-narrowing, negative self-conception, while the bait of liberal education--at least before the more critical academia of the seventies!--was the High Victorian promise of an ever-expanded sense of self and power. The college provided a tunnel from Teddy Roosevelt, with his muscular optimism for “real men”, past Franklin Roosevelt, mass politics and culture, Modernist ambiguity and the avant-garde, to right where we were; we passed through without a scratch. 
I myself had fortunately absorbed much Dostoevskian and Nietzschean darkness through my private reading, but it sustained itself underground, separated from the "race of life", a nineteenth-century phrase appropriate to us, eagerly bunched at the starting gate. The Victorian conservatism of the college was expected in the end to combat the decadent vertigo of despair and the tragic view of life to which we were at times exposed. If we were to be successful men of the world then our lives would not be cluttered with ambiguity nor tainted with unquenchable rebellion, but would carry us smoothly through stability to respectable careers of moderate wealth. And most of us did just that.