Dec 19

Mentors and Intangibles: Remembering Gary and Greg

As applicants and search committees look forward to the upcoming Modern Language Association convention in Seattle, Rosemary Feal has been listing “intangibles, not visible on CV” on her Twitter feed. She means characteristics like integrity, maturity, honesty, and empathy, which, while hard to glean from job materials, are nonetheless very important to employers. I’ve been thinking a lot about these kinds of intangible characteristics recently, but not because of the MLA. When I think of the intangibles I would claim for myself, I trace many of them back to my college swim coach and mentor Gary Troyer, who passed away last week, and to Greg Colomb, who passed away in October. I am a different person — a better person — for having known them, but it is hard to quantify what they taught me. As I reflect on what the world has lost in the last two months, I feel the need to think about how much I learned from these two mentors.

Probably the most specific trait I learned from both is self-confidence. Greg ran the writing program at the University of Virginia, and every graduate student who came through our department worked closely with him. A memory I deeply cherish is from my first time teaching a college course, as a teaching assistant in Greg’s “Academic and Professional Writing.” I led a once-a-week workshop, which complemented the lectures. Around mid-semester, about an hour before my class, I happened to meet Greg in the hallway. “I’m coming to observe you today,” he said. (While some professors give lots of advance notice before coming to observe a TA, I soon learned that wasn’t Greg’s style.) The class went fine, and when I stopped by Greg’s office a few days later to talk about it, he told me, “you’re a born teacher.” This remains one of the greatest compliments anyone has ever paid me, and it meant all the more coming from Greg. Whenever a class doesn’t go well, or I feel I’m not doing my best, I remember Greg’s confidence in me.

Gary had been coaching for thirty-plus years when I first met him, and one way he had learned to keep track of his swimmers was to recruit them to play water polo. I’d never even seen a game before, but I was happy to fill out the bench and stay in reasonable shape. I wasn’t very good at water polo, and probably didn’t try as hard in practice as I could have. After a strong performance in the 100 fly at one of the first swim meets, Gary confessed to me, “a few months ago I wasn’t sure you could even finish a 100 fly.” This was blunt honesty, which some found off-putting but which I appreciated. And because he could be that honest, his excitement, when it came, was genuine. He wasn’t given to large outward displays of emotion, but I remember winning a race and looking over to see him with both arms raised in victory. The image stays with me.

The more I think about it, the more I realize that Gary and Greg had a lot in common, and they would have gotten along well together. Both were inveterate lovers of knowledge: Gary did the crossword every day, and Greg was never shy about sharing something he’d just learned (he taught me everything I know about curling, a sport I never thought could be interesting). Both loved to cook and would have students over to their houses (I can’t begin to guess how many carne asada and tri-tip sandwiches Gary made us). And both were repositories of institutional memory: Gary coached at Pomona for over thirty years, and we loved hearing his stories and meeting alumni he’d coached years before; Greg could draw on decades of experience in the writing program to answer any administrative question that came up (I witnessed this firsthand in weekly meetings).

But most of all, both were great men: happily married, they loved their jobs and were loved by their students. There are many people to whom I look up intellectually and professionally, but when I think of the kind of person I would like to be, I’m hard-pressed to imagine better mentors than Greg and Gary.

Nov 16

The Teachers Who Mattered Most

[This post is cross-posted on the NINES blog]

Following last week’s symposium here at UVA, I found myself recalling Roger Lundin’s essay in Pedagogy from a few years ago: “the teachers who mattered most to me did so because of what they loved,” writes Lundin. “As I taught, in other words, I learned I had come to love what my most effective teachers had loved, and they had taught me how” (137). Lundin, riffing on Wordsworth’s Prelude – “what we have loved, others will love, and we will teach them how” – offers a viewpoint that I think was implicit in many of the discussions. The symposium marked the inauguration of the Institute of the Humanities and Global Cultures, and Steve Ramsay began his talk by praising institutes like this one for giving scholars an opportunity to live an intellectual life in community with others. Community – which is to say, people – is as important to academic fields as the theories and methodologies that were the symposium’s explicit focus.

Lundin again: “For the past several decades in the humanities, our discourse has been theory-rich, perhaps theory-saturated, and we have developed explanations for everything from the nuances of différance to the needs of the subaltern. But when have we thought about love?” (134). Love of our work, Lundin means, in a real, non-theoretical sense. A flurry of recent posts (like Natalie Cecire’s and Jean Bauer’s) has considered the place of theory in digital humanities. And perhaps the most important argument to arise from the symposium (besides the institute itself, of course) will be Bethany Nowviskie’s call to reform graduate training to match the methods and questions that will shape the field’s future. But in the words of the Black Eyed Peas, where is the love?

For digital humanities, the response to the Black Eyed Peas comes from the Troggs: love is all around. At THATCamps, at MLA sessions, on Twitter – digital humanists seem to have a fondness for their work, an emotional connection to their theoretical arguments. Panels play to packed houses in a way that other fields’ panels seem not to. This isn’t to say that everyone always agrees with each other, or that theoretical conversations don’t happen. The teachers who matter to us, Lundin is careful to state, are not necessarily the ones with whom we always agree: “my most influential teachers had religious commitments, political views, or theoretical understanding that differed sharply from my own” (137). Disagreement, of course, fosters insight. Responding to Bethany, Ryan Cordell hopes to reform undergraduate teaching as well. Ted Underwood, though, is “not yet sure about the implications at the undergraduate level. Maybe ten years from now I’ll be teaching text mining to undergrads … but then again, maybe the things undergraduates need most from an English course will still be historical perspective, close reading, a willingness to revise, and a habit of considering objections to their own thesis.” In considering how our pedagogical goals might change, Ted gives what I think is the best and most concise list of what those goals are now (at least the best I’ve heard).

Academics are teachers, and I’m excited to see that teaching has become a center of the conversation in digital humanities, with both graduate students and undergraduates involved in digital scholarship. We can say, to the leaders in the field, what you love we will love: teach us how.

Nov 14

Digital Humanities and Your Brain

[This post first appeared on the NINES blog, 10/12/2011]

Here at the University of Virginia a group of scholars from various disciplines meets periodically to discuss digital scholarship. We began meeting a few years ago, and we call ourselves EELS (it stands for “electronically enabled literary scholarship”). At one of our earliest meetings we read articles by Mark Bauerlein and Nicholas Carr, readings we termed “anti-EELS.” Their arguments are rather played out by now, but a few years ago they were gaining steam: Carr’s article “Is Google Making Us Stupid?” had just appeared in the Atlantic, and Bauerlein asked in the Chronicle whether online literacy is of a lesser kind (both have since expanded their arguments into books). Forget about challenging the place of digital scholarship in the academy: Carr and Bauerlein challenged the very idea that the internet and the digital age have left us any ability to think at all.

Even in 2008, of course, Carr and Bauerlein did not speak for everyone. Bauerlein cites Leah Price, for example, who argues that we need to expand our notions of what “reading” means. The response that most intrigues me, though, is Steven Pinker’s. In a New York Times op-ed Pinker (a cognitive scientist at Harvard) takes issue with Carr’s argument that the internet affects our brains: “cognitive neuroscientists roll their eyes at such talk. Yes, every time we learn a fact or skill the wiring of the brain changes; it’s not as if the information is stored in the pancreas. But the existence of neural plasticity does not mean the brain is a blob of clay pounded into shape by experience.” Pinker goes on to compare Carr to “primitive peoples who believe that eating fierce animals will make them fierce.” By writing “as if the brain takes on the qualities of whatever it consumes” Carr and Bauerlein seem to “assume that watching quick cuts in rock videos turns your mental life into quick cuts or that reading bullet points and Twitter postings turns your thoughts into bullet points and Twitter postings.”

Digital humanists have of course found academic uses for Twitter, and might take exception to this particular example. But Pinker’s larger point is an accusation that I think we must take seriously. His response to Carr is, essentially, “you just don’t understand the brain”: he draws a boundary between the cognitive neuroscientists rolling their eyes and the poor humanist who can’t tell the brain from the pancreas. So while I applaud Pinker’s argument, the tone has me a bit worried.

Many humanists are no strangers to Steven Pinker, who ranks high on a short list of scientists whose work forms the basis of another emerging area of humanistic inquiry: cognitive literary studies. The 1990s were declared “the decade of the brain,” and during the same period in which the digital humanities have become so prominent, a parallel movement has connected humanistic research to brain science. But as Jonathan Kramnick has argued, this scholarship has its risks. While hoping to add a “scientific” basis to humanistic questions, proponents of cognitive approaches sometimes wind up, like Carr, drawing on a field without really understanding it. Literary Darwinism, says Kramnick, might not “bring us any closer to science. At the very least, the substance of the claim fails to represent debates within the sciences themselves.” Some scholars, says Kramnick, have failed to sufficiently immerse themselves in the discipline from which they hope to draw, and in an attempt to become “more scientific” have in fact become less so.

Like cognitive literary studies, digital humanities must draw on other disciplines, using methods and tools that many humanities scholars aren’t comfortable with. And digital humanities has witnessed similar debates about the extent to which we must immerse ourselves in these other disciplines. Do we, as Stephen Ramsay suggests, have to know how to code and build things? Do we have to be trained statisticians so that our text-mining results are “statistically significant”? Are we more or less rigorous than the proponents of culturomics, about whose work many humanities scholars seem skeptical? These are questions about method, and interdisciplinarity, and collaboration. And they’re not particularly new questions. But I do think the comparison between digital humanities and cognitive literary studies is a useful one: how can tools and methods from other disciplines help us answer questions in our own?

As a parting note, I’ll point to Cathy Davidson’s upcoming course, which looks to be a model for interdisciplinary teaching. Perhaps an approach like hers can connect the humanities with both the brain and the internet.