Monday, December 20, 2010

The Liberal Arts as Archimedean Point



The seasons govern us, we professors, just as surely as they governed our agrarian ancestors.  Like farmers, we find the nature of our work determined by the flux of seasons, although we seem somehow to have lost a season in the transition from the cultivation of the soil to the cultivation of the liberal arts.  If we teach on the American semester system, our year divides into spring, summer, and fall.  While those of us in the northern tier of the country certainly shiver from December to March, we don’t recognize winter as a full season, instead relegating it to a brief “winter break.”  Our real seasons consist of the spring and fall semesters — each with an arc of its own from the “here’s a syllabus” moment through the real teaching and on to the terrible grind of end-of-semester grading — and the impossibly glorious summer, when, unless we’ve resigned ourselves to post-tenure intellectual rust, we rev the engines of research as high as they’ll go, and see how far they’ll take us.  So for me, the fall just ended a few days ago, with the essays and exams done and the grades turned in to the registrar.  And I find myself asking some of the same questions I ask every year around this time: what is it we’re doing, anyway, and in what possible way can it matter?

Don’t get me wrong: I’m not cynical.  In fact, my instinct is to believe in what we in the liberal arts do, at least in principle.  But what, exactly, is it, and why is it worth believing in?  I took a stab at an answer in one of my notebooks while my students were writing their exam on Romanticism.  Here, in slightly cleaned-up form, is what I came up with.


*

First of all, there’s no such thing as the liberal arts — or, to put it in a less showy way, there’s no absolute consensus about what the liberal arts are, nor any consensus on how or why they’re taught.  In fact, let’s begin with a moment of dissensus: a faculty meeting many months ago, during which I nearly ground my teeth into a fine white powder while a much-respected colleague held forth passionately on the education of new students.  “With first-year students,” he maintained, “we should remember that we’re teaching skills, not content — so by all means be willing to let the content slide while you concentrate on building fundamentals like the capacity for critical thought.”  He continued in this vein for some time, with the usual response from the end-of-day crowd: a combination of nodding (some ostentatious, some not) and tired-eyed staring into space, followed, at his oratorical terminus, by polite applause.  Maybe it was my background in literary studies, where the simple distinction between form and content is generally rejected; maybe it was my pathological anti-authoritarianism (my inner anarchist punk grabbed me by the shoulder of my dingy corduroy blazer and noted that the speaker was some kind of dean and, therefore, wrong — Q.E.D.); maybe it was an orneriness introduced into my system by the digestive acrobatics required to process a cafeteria calzone.  Whatever the reason, I took quiet exception — loud exception being, as I have slowly learned, a sure way to prolong the very meeting from which one yearns to escape.

The basis of my exception-taking was this: the speaker’s assumption that critical thinking was in some way abstractable from what he dismissed as the mere “content” of study.  This struck me as entirely false, possibly damagingly so, and very much at odds with my personal sense of the liberal arts.  To my mind, there can be no such thing as disembodied or contextless critical thinking.  To think critically, one needs some kind of Archimedean point outside of one’s own inherited assumptions, some introduction to a mode of life or way of thinking that is other and alien, that doesn’t operate by the coordinates we reflexively consult for guidance.  There is no limit to the number of such points, but in whatever version, they all fall into the seemingly inert category of “content.”  Much of the thinking that went into multiculturalism, and into the old triad of race/gender/class, makes an argument of this kind: by understanding, say, the complexities of how gender has been defined, codified, and normalized in different times and places, one can step outside one’s inherited assumptions; by reading the imaginative literature of an ethnic group not one’s own, one can suddenly find oneself seeing the world of one’s understandings from the outside; by looking into the systems by which social groups seek to establish status, one can gain critical insight into one’s own social being.


On a good day, the study of “content” can even give one an Archimedean point outside one’s own categories of analysis, categories like race, gender, and class.  I had the privilege of being in a room where this happened to about a hundred people at once.  It was at a conference on postcolonial literature, where the Tanzanian scholar Joseph Mbele rose up and interrupted the American speaker who’d been impressing us all with talk about how we needed to get beyond generalizations about the postcolonial, and introduce into it the subtleties of race, gender, and class.  “When,” asked Mbele, “will my village matter?”  He continued by explaining, in some detail, the methods of identity-construction and categorization in his hometown, many of which revolved around excruciatingly complex notions of kinship obligation.  His point was that our categories were interesting, but they were definitely Western categories, and we needed to think of them as such, rather than as timeless and placeless universals.  Had Mbele not walked us, however briefly, through the “content” of Tanzanian notions of identity, we wouldn’t have been able to step outside the assumptions of what we’d mistaken for a rather cosmopolitan point of view.

To my mind, historical study of any kind (literary, artistic, scientific, mathematical, political, what have you) is one of the most powerful means of stepping outside one’s inherited assumptions and gaining a critical perspective on them.  The remoteness of things, the utter alienness of the ways people from other periods thought and lived and embodied their subjectivities — this is no inert pile of facts, “content” that one must unfortunately get rid of on the way to gaining the skill of critical thought.  Historical knowledge is, instead, the vehicle by which we may move from where we are, in our understandings of ourselves and the world, to where we might be.  Get to know how people thought about economics before we even had the word “economy” in anything like the modern sense of the term, and you’ll be positioned to understand the contemporary financial crisis in ways you otherwise never could.  Get to understand how an ancient Greek saw his or her relationship to the obligations of the polis, and you’ll see everything about how we operate as a body politic as if for the first time.


I made the mistake, once, of proposing to the faculty that we consider historical difference a category as important as gender or ethnic difference when we give students credit for taking required cultural diversity courses.  Most of my students are American women born within the last quarter century.  Strangely, they can get diversity credit for a course on the modern women’s novel, but not for a course on, say, Beowulf.  And one could reasonably argue that Beowulf comes from a set of cultural assumptions more different from those students’ own than, say, the assumptions informing the work of Joyce Carol Oates.  I think people assumed I was trying to argue for the reinstatement of Ye Olde Literarie Canone, though, and it didn’t fly.  It was probably my fault for not coming at it differently.  I mean, I was young enough to believe that the way to get something done was to introduce the idea in a meeting, then sit back in arm-folded Righteousness and await acceptance by the throng.  Politic I was not.

*

These were the things I thought about, as I watched my students writing their exams.  And I wondered, too: had the students in my seminar on Romanticism been learning critical thinking?  I certainly hadn’t stopped in the middle of the Hazlitt and Keats to announce that it was time to work on our critical thinking skills.  Nor had I started every day with an injunction to critically interrogate the texts before us, seeking out their odiously out-of-date assumptions and holding them up for criticism — I actually attended a course like this when I was a student, a course in which a fellow student once confessed to the fresh-from-Yale-in-the-90s professor that she didn’t know what to say about the text, but she knew “there must be something wrong with it, or we wouldn’t be reading it.”  In fact, I think that kind of “critical thinking” is hardly critical at all: it’s just an imposition of our present views onto the past, in order to find the past wanting.  It’s not much different from the kind of smugness Joseph Mbele exploded at that postcolonial conference I’d attended.  The best kind of critical thinking, I suppose, is the kind that comes when we let the culturally different interrogate us as much as we interrogate the assumptions of the culturally different.  It’s in the meeting of the two different cultural horizons that some of the most powerful critical thinking can occur.  And that’s a matter of content.