Wednesday, July 28, 2010

Dialectical geekism

Jennie Rothenberg Gritz's article in The Atlantic today, "What's Wrong With the American University System," serves as a nice segue from my last post into some more general comments about liberal arts education in a technocratic information society.
[L]iberal arts, properly conceived, means wrestling with issues and ideas, putting the mind to work in a way these young people will only be able to do for these four years. And we'd like this for everyone. They can always learn vocational things later, on the job. They can even get an engineering degree later—by the way, in two years rather than four.
I'm inclined toward this position, especially since I've found it easy to acquire on my own, for free, the knowledge it took people with CS degrees tens of thousands of dollars to get. Without doubt, there are exceptions; however, it's increasingly the case that the knowledge necessary to get technical jobs is available for free on the Internet. You can learn a lot that is practically useful and lucrative as a passionate hobbyist. It seems far more difficult to pick up a comprehensive liberal arts education that way (though I'm sure that's possible for some people, too).
In our economy, they're not really ready for you until you're 28 or so. They want you to have a number of years behind you. So when somebody comes out of college at 22 with a bachelor's degree, what can that person really offer Goldman Sachs or General Electric or the Department of the Interior? Besides, young people today are going to live to be 90. There's no rush. That's why I say they should take a year to work at Costco, at Barnes & Noble, whatever, a year away from studying, and think about what they really want to do.
What percentage of people really know what they want to do when they're 21 or 22 anyway? I'm 31, and I have to say, I'm overjoyed not to be living out the dreams of 19-year-old me! I couldn't imagine being 40 or 50 and living out the plans I made as a kid. It would mean I hadn't learned anything significant enough in the intervening years to cause me to fundamentally change my life, and that prospect seems equally absurd and sad.
How much really valuable research is being done on cancer? When I was at Cornell, Congress announced that they were going to pour a lot of money into cancer research. So a memo went out to the Cornell professors—not just in the sciences, mind you—saying, "Can you take your current research and cancerize it?" There's a lot of that going on. So sociology professors decided to research cancer communications, and so on.
Despite the received wisdom that the more money we throw at a problem, the more likely it is to be solved, I think the above is closer to the truth. Solving technical problems is not as easy as increasing funding. Often it seems that problem-solving follows its own trajectory, and too much funding can actually hurt an endeavor. This might seem unrelated, but consider the following quote from Ben Goertzel's article on AI in February's H+:
We were also intrigued to find that most experts didn’t think a massive increase in funding would have a big payoff. Several experts even thought that massive funding would actually slow things down because “many scholars would focus on making money and administration” rather than on research. Another thought “massive funding increases corruption in a field and its oppression of dissenting views in the long term.” Many experts thought that AGI progress requires theoretical breakthroughs from just a few dedicated, capable researchers, something that does not depend on massive funding. Many feared that funding would not be wisely targeted.
One might think, "Yeah, but what does AI research have to do with cancer research?" Medical research is becoming more like an information technology every year. And the remarks about excessive funding leading to corruption and unwise administrative decisions is generalizable to any field. I don't know if this is true. Again, it's just a quote from an anonymous source. But it seems plausible enough and echoes what Hacker and Dreifus are saying.
Our view is that the primary obligation belongs to the teacher. Good teaching is not just imparting knowledge, like pouring milk into a jug. It's the job of the teacher to get students interested and turned on no matter what the subject is.
This matches my experience when teaching. I was a difficult teacher with plenty of quirks and arbitrary biases. Nevertheless, the reactions of students at the end of each semester were usually positive. They enjoyed being challenged. Challenging a student is different from giving them a hard time. You challenge a student when you engage their mind and help them elevate themselves to a new, more comprehensive perspective. I won't say I was great at this, but given their reactions, I would say I was better at it than the teachers in the impoverished, inner-city school districts they were coming from.

All of this raises the question: whither liberal arts education in a technocratic info-culture? I think a liberal arts education has a central role to play in our civilization: that of drawing the mind to a more comprehensive, rational perspective on the whole ethical and political project of our civilization. It's the movement from "How?"-type questions (technical solutions to problems) to "Why?"-type questions (having to do with the relative and ultimate worth of what we're doing). If one of the main problems with the geek mentality is a tunnel vision that excludes the relevance of things like fairness, justice, and reciprocity, the remedy is to situate our technical pursuits within the broader context of the project of grounding and building the secular institutions of modern society. In other words, I see the point of liberal arts education as similar to the role Hegel assigned philosophy. For this reason I'll playfully call what I'm suggesting "dialectical geekism". (I also like the kitschy allusion to dialectical materialism.)
