Friday, July 30, 2010

"He doesn't speak English? I got a solution for ya - don't talk to him. How hard is that?"

Immigration is one of the few issues I've moved further to the left on over the past few years, and I've done so for reasons damn similar to those given by this dickhead:



Is there any moral difference between sneaking a job across a border (which is what outsourcing is, by the way) and sneaking a person across a border to work a job? Either capitalism is global, or it's not. Unless your brain is confused by an isolationist delusion, I think it makes sense that a global division of labor is more efficient than just having division of labor within national boundaries. But then don't have a double-standard about it, as though you're entitled to something just because you happened to be born in the U.S.A. I hate the hypocritical whining about it. Either shut up, or do something that moves us toward a more fair system OVERALL, for EVERYBODY. What? You're going to receive all the benefits of a global capitalist system but none of the costs? Stop asking for exceptions for yourself!

Wednesday, July 28, 2010

Dialectical geekism

Jennie Rothenberg Gritz's article in The Atlantic today, What's Wrong With the American University System, serves as a nice segue from my last post into some more general comments about liberal arts education in a technocratic information society.
[L]iberal arts, properly conceived, means wrestling with issues and ideas, putting the mind to work in a way these young people will only be able to do for these four years. And we'd like this for everyone. They can always learn vocational things later, on the job. They can even get an engineering degree later—by the way, in two years rather than four.
I'm inclining toward this position, especially since I've found it easy to acquire on my own, for free, the knowledge it took people with CS degrees tens of thousands of dollars to get. Without doubt, there are exceptions; however, it's increasingly the case that the knowledge necessary to get technical jobs is available for free on the Internet. You can learn a lot that is practically useful and lucrative as a passionate hobbyist. It seems far more difficult to pick up a comprehensive liberal arts education that way (though I'm sure that's possible for some people, too).
In our economy, they're not really ready for you until you're 28 or so. They want you to have a number of years behind you. So when somebody comes out of college at 22 with a bachelor's degree, what can that person really offer Goldman Sachs or General Electric or the Department of the Interior? Besides, young people today are going to live to be 90. There's no rush. That's why I say they should take a year to work at Costco, at Barnes & Noble, whatever, a year away from studying, and think about what they really want to do.
What percentage of people really know what they want to do when they're 21 or 22 anyway? I'm 31, and I have to say, I'm overjoyed not to be living out the dreams of 19-year-old me! I couldn't imagine being 40 or 50 and living out the plans I made as a kid. It would assume I hadn't learned anything significant enough in that time to cause me to fundamentally change my life, and that prospect seems equally absurd and sad.
How much really valuable research is being done on cancer? When I was at Cornell, Congress announced that they were going to pour a lot of money into cancer research. So a memo went out to the Cornell professors—not just in the sciences, mind you—saying, "Can you take your current research and cancerize it?" There's a lot of that going on. So sociology professors decided to research cancer communications, and so on.
Despite the received wisdom that says the more money we throw at a problem, the more likely it is to be solved, I think the above is more likely true. Solving technical problems is not as easy as increasing funding. Often it seems that problem-solving follows its own trajectory, and too much funding can actually hurt an endeavor. This might seem unrelated, but consider the following quote from Ben Goertzel's article on AI in February's H+:
We were also intrigued to find that most experts didn’t think a massive increase in funding would have a big payoff. Several experts even thought that massive funding would actually slow things down because “many scholars would focus on making money and administration” rather than on research. Another thought “massive funding increases corruption in a field and its oppression of dissenting views in the long term.” Many experts thought that AGI progress requires theoretical breakthroughs from just a few dedicated, capable researchers, something that does not depend on massive funding. Many feared that funding would not be wisely targeted.
One might think, "Yeah, but what does AI research have to do with cancer research?" Medical research is becoming more like an information technology every year. And the remarks about excessive funding leading to corruption and unwise administrative decisions is generalizable to any field. I don't know if this is true. Again, it's just a quote from an anonymous source. But it seems plausible enough and echoes what Hacker and Dreifus are saying.
Our view is that the primary obligation belongs to the teacher. Good teaching is not just imparting knowledge, like pouring milk into a jug. It's the job of the teacher to get students interested and turned on no matter what the subject is.
I found this to be my experience when teaching. I was a difficult teacher with plenty of quirks and arbitrary biases. Nevertheless, the reactions of students at the end of each semester were usually positive. They enjoyed being challenged. Challenging a student is different from giving them a hard time. You challenge a student when you engage their mind and help them elevate themselves to a new, more comprehensive perspective. I won't say I was great at this, but given their reactions, I would say I was better at it than the teachers in the impoverished, inner city school districts they were coming from.

All of this raises the question: whither liberal arts education in a technocratic info-culture? I think a liberal arts education has a central role to play in our civilization: that of drawing the mind to a more comprehensive, rational perspective on the whole ethical and political project of our civilization. It's the movement from "How?" type questions (technical solutions to problems) to "Why?" type questions (having to do with the relative and ultimate worth of what we're doing). If one of the main problems with the geek mentality is a tunnel vision which excludes the relevance of things like fairness, justice, and reciprocity, the remedy is to situate our technical pursuits within the broader context of the project of grounding and building the secular institutions of modern society. In other words, I see the point of liberal arts education as similar to the role Hegel assigned philosophy. For this reason I'll playfully call what I'm suggesting "dialectical geekism". (I also like the kitschy allusion to dialectical materialism.)

Of Geek Culture

-----BEGIN GEEK CODE BLOCK-----
GIT/M/MU/P/S/SS/O d-- s-: a C++ UL++>$ P++>$ L++>$ E---
W++ w+ PS++ PE++ GPG++ t-- X R* !tv b++++ G-e+++
h+ r
------END GEEK CODE BLOCK------

I've had a number of discussions over the past few weeks with different people about geeks and geek culture. I'm ambivalent toward geek culture. Many geeks I meet seem clueless about basic aspects of political and social relations that other educated people seem to take for granted. Nevertheless, going by the Wikipedia definition of the word—"One who is perceived to be overly obsessed with one or more things including those of intellectuality, electronics, etc."—I am a geek. The contents of this blog are more than enough testament to that.

Yet it's one thing to have geeky interests and another to be part of a geek subculture. If by "geek" we mean an intense interest in the technical details of some particular subject, to the point of sometimes losing peripheral vision of things outside that subject, then there's an argument to be made (which Neal Stephenson does make) that we're all geeks now. That would owe to the technocratic shape society has taken since the 50s and to the explosion of the internet and IT after the mid-90s. Geekdom then is an epiphenomenon of the intellectual and technical division of labor in the information society.

Just as one's position in the hierarchy of the division of labor in industrial capitalism tended to incline one toward certain social identifications and political positions, so it seems one can also make meaningful generalities about geek subcultures and the political views held by their members and adherents. The problem is that this subculture has become much more complex since the explosion of the Internet and IT in the mid-90s. Before then, but roughly from the late 60s/early 70s onward, to be a geek was largely synonymous with being a tech geek or hacker: the programmer subculture that originated at MIT. In the days before the Macintosh—and for a period of time after—to know that someone was into computers usually meant also knowing, with a high degree of probability, that they were socially awkward, into tabletop roleplaying games, and libertarian; that they had a junk-food diet; that they didn't know when and when not to correct an error in conversation; and that they were probably annoying to be around.

But the geek landscape has become more complex and multidimensional since the mid-90s. People from all walks of life have been drawn into information technology, because IT has become such an integral part of the production process. With this influx of people with such vastly different intellectual and personal backgrounds, the political and cultural flavor of IT has changed drastically. While Silicon Valley CEOs were by and large in favor of Ron Paul, white-collar IT workers in general were pro-Obama. Obama appealed to them because he's part of their generation, and because he's sharp and tech-savvy. They didn't vote for Obama because he's black; I don't think geeks in general would vote on that basis. They voted for him because they thought he represented their interests and deserved to be President. (This is why Hillary Clinton had to start doing shots of whiskey in PA to get votes. Both candidates were equally centrist, but Obama appealed to an entirely different, ascendant portion of the capitalist class system in the U.S.)

The phenomenon is further complicated by the fact that there's more than one way to be a geek now. There aren't just tech geeks. There are arts geeks, punk geeks, metal geeks, history geeks, comics geeks, horror geeks—almost any kind of geek you can imagine. Again, I think this is because of the way capitalist society has become an information capitalist society. Yet it's not simply the case that to know something means to be part of a subculture. This was the case with the original geek culture of the 70s and 80s, but it's not so anymore. And this is what makes analyzing geek culture difficult. It's one thing to be intensely interested in subject x, but it's another thing to be part of the subculture. These were almost identical in early geek subculture, probably because there were so few people who had the knowledge in the first place. It was concentrated in university CS and AI labs.

Now the situation is very different. People can collect interests and hobbies without having to deal directly with another person at all. They can simply spend hours a day researching them on the Internet. Nevertheless, one can draw probabilistic inferences. That I'm a Linux enthusiast doesn't tell you a whole lot, since almost anyone can and does get into Linux. If I tell you I'm into Linux and collecting firearms, well, now you might start to form a picture in your mind. If I tell you that, in addition to firearms and Linux, I'm also passionate about German Idealist philosophy, that's going to mean something very different from if I tell you I'm also into roleplaying games or Joss Whedon or genre fiction or whatever.

The political and social character of geek subcultures has changed drastically since their explosion 15 years ago. One can't simply take it for granted that because a person is a geek, they're libertarian or aren't able to recognize the nuances of gender or race relations. The "Racefail" fiasco of last year illustrates this, as does the 2008 Presidential election. To be a geek doesn't just mean coming from a science/tech background. It could also mean coming from a humanities background, since so many people who major in humanities go on to have IT jobs. Of course people are dense almost no matter where you go, but your odds of encountering politically enlightened individuals jump drastically as soon as you enter a geek field, as opposed to working somewhere where everyone has business or accounting degrees. I don't have numbers on this, but I'm guessing information workers are generally more progressive than those who don't work with information. (This doesn't mean we don't all have a long way to go before we're where we should be.)

There's a lot I haven't dealt with here. I only mentioned non-IT geek subcultures in passing. I've hardly touched on the gender composition of these subcultures (some geek subcultures are predominantly composed of women) and the attendant differences in political perspective. There's also a lot more to say about the complex interaction between being interested in a subject and being part of a subculture. I think there's also a lot to say about the role of the humanities in all of this, and what it means that so many people from the humanities are going on to get jobs in IT (rather than just CS majors). So accept this as an introductory post on the subject, something I'll hopefully add to later.

Sunday, July 18, 2010

Hegel's philosophy of history

While I was on the bike at the gym today, I read the chapter on Hegel's philosophy of history in Herbert Marcuse's book Reason and Revolution. This is probably the most derided and dismissed part of Hegel's philosophy, maybe even more so than his philosophy of nature, owing to what some take to be its political and moral implications. As Marcuse himself puts it:
Hegel's picture of the Reformation is fully as erroneous as his description of the subsequent social development, confusing the ideas by which modern society glorified its rise for the reality of this society. He was thus led to a harmonistic interpretation of history, according to which the crossing to a new historical form is at the same time a progress to a higher historical form—a preposterous interpretation, because all the victims of oppression and injustice are witness against it, as are all the vain sufferings and sacrifices of history. The interpretation is the more preposterous because it denies the critical implications of the dialectic and establishes a harmony between the progress of thought and the process of reality.
I'm not really in a position to refute Marcuse on this point, not being familiar enough with the text he's interpreting. What he's saying sounds true. From the perspective of the current order (whatever it may be), there is going to be strong pressure, just by virtue of being embedded in the culture, to view everything that has come before as not only leading up to the current state of affairs but also justifying it and making it appear rational. It's one thing to say that the basic categories by means of which we understand history—justice, freedom, and emancipation—could not have come about but through the experience of humans of their opposites (injustice, slavery, and oppression). It's quite another thing to say this makes history rational and thereby justified. This is the move that Marcuse calls a "harmonistic interpretation of history". The philosopher's job is not to justify or rationalize what has come before and what exists now. It's the opposite. It's to engage in the act of immanent critique, by means of which the highest categories of civilization are turned against civilization itself. It's only by means of the most ruthless self-criticism that we arrive at the self-consciousness which Hegel sees as constituting the ultimate law of history.

And that law of history is, I believe, more or less correct. It's the idea that universality means something, i.e., is a real historical force, when individuals self-consciously understand themselves as acting in accordance with it. The whole point and purpose of history is so that individuals understand themselves as acting in accordance with notions of freedom, justice, equality, and the like. Hegel's understanding of history is in an important respect a Kantian one, in that there are always two, not one, preconditions of rational action: I must act in accordance with the notion, but I must always also take myself to be acting in accordance with the notion. Anything else is mere mechanism. This is why, according to Hegel, history is the story of man's emancipation from and domination over nature. Again, it's a basic Kantian move. The given (be it nature, the thing in itself, or the local Catholic bishop) has no intrinsic authority for freedom. Whatever authority it gets it gets by means of the free act of reason.

In my opinion the major metaphysical (as opposed to moral or political) weakness in Hegel's philosophy of history (at least as explained by Marcuse) is in Hegel's understanding of the relationship between historical change and natural change. As Marcuse puts it:
Since Aristotle, historical change has been contrasted with changes in nature. Hegel held to the same distinction. He says historical change is 'an advance to something better, more perfect,' whereas mutation in nature 'exhibits only a perpetually self-repeating cycle.' It is only in historical change that something new arises. Historical change is therefore development. (emphasis mine)
As Aristotle puts it in Physics B1, "[W]hen we speak of nature as being a generation, this is a process toward nature [as a form]." From the time of Aristotle until roughly the time of Hegel, the process of nature was understood as roughly cyclical in kind. It was an imperfect replica of the perfect cyclical generation of the heavens. While species occasionally passed into and out of existence (the Greeks knew of fossils), the overall tendency was for nature to "keep with itself". So humans give birth to more humans, deer give birth to more deer, acorns eventually become acorn-bearing oaks, the Nile floods on schedule, etc. This idea was under attack in Hegel's day, but it wasn't really until the publication of Charles Darwin's The Origin of Species that people began to realize the extent to which the opposite position was true. Even Darwin wasn't aware of the extent to which nature generated novelty. He thought the entire process leading up to humans might have transpired over the course of a few million years. We know now the process took several orders of magnitude longer. The Earth is about 4.5 billion years old.

The implications of this for the philosophy of history are staggering. Hegel thought going back to the beginning meant going back to the Greeks or to "oriental" society. We know now that Hegel's history covers approximately 1% of the actual history of mankind. But apart from the mere quantitative adjustment, there is a qualitative one, too. Evolution makes it impossible to regard human history as constituting a realm entirely separate from that of natural history. On the one hand, we now know that nature generates radically new forms. Evolution is the original avant garde modernist. On the other hand, it no longer makes sense to think of the human act of creation (be it artistic or technological or otherwise cultural) in abstraction from the tendency of nature to increase in complexity and in order. The transition from genetics to memetics in human beings does not designate a radical break with nature but rather a smooth continuation of this increase in complexity and order toward an as-yet unrealized terminus.

Historical change and natural change appear to be two species of the same genus, and so neither reduces to the other. The genus appears to be change itself, or time, which is not an empty form as Kant thought, but which has a rational structure. As Marcuse explains:
The work of thought was destroyed by thought. Thought is thus drawn into the process of time, and the force that compelled knowledge in the Logic to negate every particular content is disclosed, in the Philosophy of History, as the negativity of time itself. Hegel says: 'Time is the negative element in the sensuous world. Thought is the same negativity, but it is the deepest, the infinite form of it...'
As Hegel argues repeatedly, negation is not mere destruction but rather determinate negation. It is an annihilation which remembers and builds upon what it annihilates. A community or way of life does not merely come into existence. It kills the parent-society that gave birth to it, and it understands itself and justifies its way of doing things in contrast to the previous way of doing things. Modern society is a rational society that takes itself to be rational for rational reasons! (Enter Nietzsche and Adorno here...) Whether or not you buy into all these trappings about rationality, though, this much is true: people in a society have a shared memory or history which constitutes who they are. Furthermore, it constitutes how they reflect upon that identity—which especially in the United States has huge implications for the critique of racial, gender, and sexual identity.

What I'm suggesting is that there is an analog for this in nature. All life based on DNA "records" everything that has happened to it so far in the evolutionary process. Every species in existence today is a descendant of one common, universal ancestor, the basic chemical traits of which we all still possess. That genetic information gets expressed in an organism which gets acted upon by an environment, which in turn causes the organism's genetic information to change over the long run. Those organisms in turn impact and change the environment in drastic ways (the oxygenation crisis, the snowball earths, etc.), resulting in a dynamic feedback loop. This is precisely why the evolution of life has been accelerating. It's a result of (a) the ability of nature to store and destroy information and (b) the dynamic feedback loop between nature and itself as it acts on this information (blindly carrying out those instructions).

Eventually large brains allow for greater memory and self-conscious, rapid information storage and retrieval. The invention of culture causes an increase in the order and complexity of information in the universe, but it does more than that. For the first time, the instructions don't have to be carried out or destroyed blindly. The information can be used and changed consciously, in accordance with other experiences. Language allows people to share these experiences and this information with one another within a single lifetime. The arrangement of the information is more ordered and complex now than previously, but a qualitative change is also taking place with the introduction of consciousness. Information has now become knowledge. Knowledge allows me to act upon my own experience, rather than waiting for nature to do it over the course of many generations. I direct myself by means of knowledge only if there is a self to mediate the relationship. Thus, individual subjectivity.

Yet it is only agriculture that allows societies to grow much larger than they could before. This in turn sets the stage for the complex social relations that give rise to what Hegel calls Geist: the relationship in and through which individual subjectivities are validated as subjectivities, but which is constituted by those very same subjectivities. And that's where Hegel's story takes off.

The fact that Hegel is blithe about history being a "slaughter bench" and that there are people who are "outside of history" is distracting, to say the least. But I think a proper understanding of the world requires something like Hegel's account. The main problem with it—once you get past the 19th century German-isms in it—really isn't in the account of history so much as it's in the account of nature behind it. The quote about time being the negative element in the natural world comes very close to the truth. The problem is with seeing nature as a less perfect form of thinking. That's a remnant of the old Aristotelian idea that the earth is a less perfect image of the stars, and the stars are an image of pure thought, and therefore life is an imperfect image of pure thought. That's where you get a lot of weird teleology and retroactive justification. I think the correct view is that, as Aristotle said, nature is form [morphe and therefore eidos or "idea"]. But once the last universal ancestor came into existence (and perhaps before), nature developed the ability to act destructively upon itself. It could "negate" itself. (And by the way, this is why Wolfram's cellular automata can't really explain why we have thought and culture and jazz and whiskey. It's because there's no dynamism in cellular automata. You need to introduce evolutionary algorithms. These cause the very laws (the "forms") by means of which the automata act to change. This is precisely the destructive, reflexive ability nature already has after the LUA.) And that's the key right there. That's why you can get from cyanobacteria to Mahler. That's why you can eventually get from the blind destruction and creation of information in "brute" nature to the self-conscious construction and use of knowledge in a human being. It's because nature isn't just "form" or "in-form-ation". It's because—for whatever reason—nature is already reflexive.
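To make that contrast concrete, here's a minimal sketch of the difference between a fixed-rule cellular automaton and one whose rule is itself subject to mutation and selection. All the names and the fitness measure are hypothetical illustrations of mine, not anything from Wolfram's work or from a standard evolutionary-algorithm library; the point is only that the "law" governing the system becomes something the process can rewrite.

```python
import random

def step(cells, rule):
    """Advance one generation of an elementary CA; `rule` is an int in [0, 255]."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(rule, width=64, steps=64):
    """Run the CA from a single live cell and return the history of rows."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

def diversity(history):
    """Crude, illustrative fitness: how many distinct rows the run produced."""
    return len({tuple(row) for row in history})

def evolve_rule(rule=90, generations=200):
    """Hill-climb on the rule table itself, keeping mutants with more varied output.

    Letting the automaton's own 'law' change is the reflexivity that a plain,
    fixed-rule cellular automaton lacks."""
    best_score = diversity(run(rule))
    for _ in range(generations):
        mutant = rule ^ (1 << random.randrange(8))  # flip one bit of the 8-bit rule
        score = diversity(run(mutant))
        if score >= best_score:
            rule, best_score = mutant, score
    return rule, best_score

if __name__ == "__main__":
    random.seed(0)
    final_rule, score = evolve_rule()
    print(f"evolved rule {final_rule}, producing {score} distinct rows")
```

The design choice doing the work here is that evolve_rule treats the rule table as just another piece of mutable information, so the "form" by which the system acts is itself destroyed and rebuilt as the run proceeds.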