Sunday, March 28, 2010

the third culture

I recently read John Brockman's essay The Third Culture. This much I agreed with:
In the past few years, the playing field of American intellectual life has shifted, and the traditional intellectual has become increasingly marginalized. A 1950s education in Freud, Marx, and modernism is not a sufficient qualification for a thinking person in the 1990s.
He wrote this in the early 90s, but it seems just as true today. Of course you can eke out an existence with a degree in the liberal arts, but it mostly entails (a) arguing with other liberal arts people and (b) training other people in the liberal arts.

The problem with Brockman's "third culture" is that it's not a third culture at all. The third culture is just the second culture after the second culture has dismissed the first culture as irrelevant and reactionary. There's no vision here of what a partnership between humanities intellectuals and scientist intellectuals would look like. There's simply a denial that humanities intellectuals have anything of relevance to contribute to the advance of civilization. That's either silly or dangerous, depending on your opinion of the relative worth of the humanities.

So I pose the following question to you for discussion:

What role should humanities intellectuals play in the advance of human civilization?

Saturday, March 27, 2010

the structure of time

From physorg.com today:
The accretion process releases vast amounts of energy, and as a result quasars are among the most powerful energy sources known. No one knows for sure, however, how these objects form, how they develop in time, or how exactly their stupendous energies are produced. Because they are so bright, quasars can be seen even when they are very far away, and this combination of being both highly energetic and located at cosmological distances makes them appealing to astronomers trying to figure out the nature of galactic center black holes (our own Milky Way has one) and the conditions in the early universe that prompt these monsters to form.

There are about forty quasars known to be so far away that their light has been traveling toward us for over twelve billion years; in other words, their black holes were already glowing brightly when the universe was very young - less than one billion years old. The question is: do they look like nearby quasars, or are they different somehow? CfA astronomer Yue Shen is a member of an international team of twelve astronomers that has concluded that some remote quasars are very different indeed.

Using the Spitzer Space Telescope's sensitive infrared cameras, the scientists observed twenty-one distant quasars to see whether or not they could detect evidence for hot dust; such dust would be expected if there really is a hot accreting disk of material around a black hole. Indeed, hot dust is a characteristic feature of quasars in the local universe.

Remarkably, as the team reports in this week's issue of Nature, two of the quasars in their study show no evidence for hot dust. The implication is that these galaxies are so primitive (in cosmic terms) that there has not been time for them to make dust, presumably either because there has not been time to form enough of the required constituent chemical elements, or because there has not been time to assemble them into dust grains. The results suggest that these objects date from an epoch in the universe when dust was first being made. Dust is a key catalyst in turning atomic gas into the molecules that facilitate stellar birth and evolution, and this new result is significant not only for quasar research, but also for helping understand how the first few generations of stars in the universe came to be.
I find this wild for some reason. We're asked to think of what a galaxy might be, not only before stars exist but before there is dust. This is a mere 1 billion years after the creation of the universe, a whole 12 or so billion years before the present.

Few people stop to recognize how strange the universe, the whole of physical existence, is. It's worth thinking about.

The furthest back our concept of time, as understood by contemporary physics, allows us to go is 10^-43 seconds after the "big bang". No one knows what the big bang is or was. It's a theoretical entity derived from the fact that everything in the universe is rushing away from everything else right now, very, very fast. If everything is rushing away from everything else, it stands to reason that if you wind the clock back, everything was very close to everything else at some point in the distant past, right before it started getting very far apart. According to measurements of the light we receive from special supernovae known as "standard candles", cosmologists argue that the big bang occurred approximately 13.75 billion years ago. That's the point at which all the matter in the universe (about 1.6 x 10^60 kilograms, which is about the size of your ass if you attached 60 zeroes to it) occupied less than the space taken up by an atom, which is about 1 ångström, aka 10^-10 meters, which is much smaller even than my ass, which itself is very small.
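
Just to put those figures side by side, here's a quick back-of-the-envelope sketch in Python. The inputs are the rough numbers quoted above, nothing more precise:

```python
# Back-of-the-envelope check of the figures above (rough numbers, not precise values).
import math

M_universe = 1.6e60    # kg, the rough mass figure quoted above
r = 1e-10              # meters, about 1 angstrom

volume = (4.0 / 3.0) * math.pi * r**3     # volume of an angstrom-sized sphere
density = M_universe / volume             # implied density, kg per cubic meter

print(f"volume  ~ {volume:.2e} m^3")      # ~4.2e-30 m^3
print(f"density ~ {density:.2e} kg/m^3")  # ~3.8e+89 kg/m^3
# Water is about 1e3 kg/m^3, so this is roughly 10^86 times denser than water.
```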

Don't bother trying to imagine it. It's just a number. An intellectual entity. It probably happened, so far as we can tell, but there's no reason to believe the typical human imagination is at all adequate to the way the world in fact is.

The universe expanded and cooled a little. Before it cooled, the four forces—electromagnetism, the weak force, the strong force, and gravity—were all basically the same "thing". Again, it's just a mathematical concept. It's not even that, since no one yet has a theory to account for the unity of gravity with the other three forces. Therefore, it's an imagined concept, three steps removed from normal sensuous experience. (Imagination is one step away, mathematical reason another, imagined mathematical theory yet another.)

When it cooled, between 10^-43 and 10^-36 seconds, gravity separated out from the electromagnetic, strong, and weak forces. At this point you have gravity, the Higgs boson (as yet undiscovered), and magnetic monopoles. Magnetic monopoles are so important. We've never discovered one, but inflation theory (see below) predicts they should be especially hard to find. According to Alan Guth, if you could put together about an ounce of magnetic monopoles and a false vacuum, you could branch another universe off from this one. Looks like the creation of the universe, at least in theory, is not a divine act. Thankfully the Large Hadron Collider might create magnetic monopoles! If we do create them, it will probably change the world as we know it, but that's a story for another time.

At this point inflation takes place. Inflation implies that the universe went from being about the size of an atom to being, well, the size of the universe in 10^-7 seconds (.0000001 seconds). It increased by a factor of about 10^78, which is a meaningless number to a human except that it's impossibly large. The discovery that the universe inflated is one of the all-time great discoveries in physics.

There are a few reasons we know it happened. If you look in any direction in the universe, it's all the same. This implies that everything in the universe was at one point very close together, really "touching". Otherwise how would everything on the "east" side of the universe have communicated with everything on the "west" side of the universe? (Since they're way too far apart to do any communicating now.) We also know inflation is true because the universe looks completely flat. The universe before 10^-36 seconds must have been highly curved. This follows from Einstein's general theory of relativity, according to which the more shit you have crammed into a 5 lb bag, the more "curved" the space it occupies will be. Curvature of space is hard to grasp, but if you ever played PacMan, it's kind of like when PacMan goes through the portal on the right side of the screen and appears on the left. There's a very high probability the universe is indeed like an extremely big PacMan board in this respect. What's extremely unlikely is that the universe is flat. And yet all our most careful measurements of it show that it is sublimely flat. Inflation explains why. Early on, the universe just got so big that from any observable point within it, it looks flat. Just like from any observable point on the surface of the earth, the earth looks flat.

Inflation also explains the bit about monopoles. It predicts there should only be about 1 monopole per observable patch of the universe. But the actual universe is about 10^26 times bigger than the observable universe. Hence, we don't see any monopoles. (Doesn't rule out creating them in particle accelerators!)
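
One way to see the flatness point in miniature: on a sphere of radius R, the geometry of a patch of size L deviates from flat geometry by something on the order of (L/R)^2, so stretching R by a huge factor makes any fixed patch look flat. Here's a small Python sketch along those lines; the radii and the 10^26 stretch factor are illustrative stand-ins, not measured cosmological values:

```python
# Why an enormously inflated space looks flat locally: on a sphere of radius R,
# a patch of size L deviates from flat (Euclidean) geometry roughly as (L/R)^2.
# The numbers below are illustrative stand-ins, not measured cosmological values.

def flatness_deviation(patch_size, radius):
    """Order-of-magnitude deviation from flatness for a patch on a sphere."""
    return (patch_size / radius) ** 2

# A surveyor looking at a 1 km patch of the Earth (radius ~6,371 km):
print(flatness_deviation(1.0, 6_371.0))        # ~2.5e-8, i.e. looks flat

# The same fixed patch, before and after stretching the radius by a factor of 1e26:
radius_before = 1.0
radius_after = radius_before * 1e26
print(flatness_deviation(1.0, radius_before))  # 1.0, obviously curved
print(flatness_deviation(1.0, radius_after))   # 1e-52, indistinguishable from flat
```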

Time slows down to a crawl after this. You imagine it would, given how big space becomes. It's not until after 10^-12 seconds that quarks are allowed to form hadrons (the stuff out of which the crap surrounding us is made).

Yes, we're talking about trillionths of a second and smaller here, very short periods of time, but keep in mind we're working on an exponential scale. 10^12 seconds is 1,000,000,000,000 seconds, or about 32,000 years (3.2 x 10^4 years). 10^36 seconds is 1,000,000,000,000,000,000,000,000,000,000,000,000 seconds, or about 3.1 x 10^28 years. So, a difference of about 24 orders of magnitude. There's no difference between fractions of a second and multiples of a second other than the ordinary human level of perception. Therefore, even though it was trillionths of a second, damn near an eternity passed between inflation and the beginning of the creation of matter. An even greater amount of time was to pass before the universe was cool enough to do anything else, like form hydrogen or make dust.
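
If you want to check that arithmetic yourself, here's a minimal Python sketch, using a rough seconds-per-year figure:

```python
# Quick check of the orders-of-magnitude arithmetic above.
SECONDS_PER_YEAR = 3.156e7   # roughly 365.25 days' worth of seconds

for exponent in (12, 36):
    seconds = 10.0 ** exponent
    years = seconds / SECONDS_PER_YEAR
    print(f"10^{exponent} seconds ~ {years:.1e} years")

# 10^12 seconds ~ 3.2e4 years (about 32,000 years)
# 10^36 seconds ~ 3.2e28 years
# The gap is 36 - 12 = 24 orders of magnitude, the same span separating
# 10^-36 seconds from 10^-12 seconds on the other side of one second.
```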

This trend of exponentially greater periods of time between events was to continue. It takes longer to reach the photon epoch (all the light gets released, creating the cosmic microwave background radiation we see today). And it takes longer still to reach the formation of normal matter. Longer than that to create quasars (see above), galaxies, stars, planets, and the rest. The length of time between events increases exponentially.

But finally the exponentially increasing orders of magnitude level out (from our perspective) and form an S-shaped curve, when life comes into existence. Now it begins to go in the other direction (picture a bell-shaped curve here). Life exists for 3 billion years before there are eukaryotes. Then, in a shorter time, the sexes differentiate. In a shorter time there are body plans (Cambrian explosion). A shorter time after that there is life on land. A shorter time after that there are brains. A shorter time after that, creatures who are bipedal and who have opposable thumbs. A shorter time after that, humans. A shorter time after that, art, religion, and thought. A shorter time after that, the death of Socrates. Then the invention of the printing press. Then the invention of the steam engine. Then the computer. Then the internet. Then the sequencing of the human genome.
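
One way to see the shape of this sequence is to list each milestone by how long ago it happened and look at it on a logarithmic scale. A minimal Python sketch, with dates that are my own rough ballpark figures rather than claims made above:

```python
# The accelerating half of the curve: list each milestone by how long ago it
# happened, on a log scale.  The dates below are rough, illustrative figures
# (my own ballpark numbers, not claims made in the post).
import math

events = [
    ("first life",                 3.8e9),
    ("eukaryotes",                 1.8e9),
    ("Cambrian explosion",         5.4e8),
    ("life on land",               4.3e8),
    ("first hominids",             6.0e6),
    ("behaviorally modern humans", 5.0e4),
    ("printing press",             5.7e2),
    ("steam engine",               2.5e2),
    ("internet",                   4.0e1),
]

for name, years_ago in events:
    print(f"{name:28s} ~10^{math.log10(years_ago):.1f} years ago")
# Broadly, the gaps between milestones shrink by orders of magnitude as you
# approach the present, which is the accelerating side of the S-shaped curve.
```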

It might be the case that time is not a neutral, empty medium in which events take place but rather has an intrinsic structure to it. Plato argued that space and matter were the same thing, that the basic background material of the cosmos was something called "chora" which knotted up to form ordinary objects. There are many physicists today who believe that if we probe matter on a fine enough scale, we'll see that Plato's idea is true and point particles are mere configurations of space. Perhaps time is similarly locked in with the events that supposedly unfold in it. In that case the bell-shaped curve does not describe a set of events that happen to occur in time. Instead it describes the structure and perhaps meaning of time itself. In that case human thought is not a mere froth floating on the surface of a vast cosmic ocean—capable of seeing and conceiving of a small part of what is, but incapable of really getting to the bottom of things. Instead, human thought—especially as applied to solving the problems humans encounter, using finite resources—is an efficient cause in the service of a broad cosmological principle. In other words human invention and problem-solving are not mere spandrels, out-of-place with respect to the cold, meaningless cosmos. To think is part of the meaning of time itself, and it is essential to the unfolding purpose of existence.

Saturday, March 20, 2010

Leibniz and information theory

Perhaps Gottfried Wilhelm Leibniz is the patron philosopher of information theorists. Consider the following:
  1. Leibniz invented the first digital mechanical calculator capable of addition, subtraction, multiplication, and division. Called the "Stepped Reckoner", it was an improvement over Schickard's earlier model.
  2. He might have been the first to document the base 2 (binary) system.
  3. He developed formal logic, not as an abstract idea, but rather in connection with his work on calculators.
  4. Bertrand Russell was so impressed by Leibniz that one of Russell's first books was about Leibniz's philosophy (The Philosophy of Leibniz). Russell (with Whitehead) of course went on to write Principia Mathematica, which influenced Gödel and Turing, who actually began the age of programmable computing.
  5. In Leibniz's metaphysics, relational properties reduce to monadic properties. A relational property is expressed in the proposition "The earth and the sun are 93 million miles from one another." A monadic property is expressed in the proposition "Socrates is mortal." Monadic properties follow from the essence of the thing itself. So it's in the nature of Socrates to be mortal. It's not in the nature of the earth or the sun to be such and such distance from one another, or so it seems. According to Leibniz, an infinite intellect, like that possessed by God, could derive all relational properties just by having insight into the nature of the substances themselves. If one knew everything contained in the concept "Caesar", one could analytically derive that he would cross the Rubicon in 49 B.C. Human beings, possessed of finite, not infinite, intellect, must go out and discover things empirically, using the methods of natural science. But brute empirical facts, like the distance between the sun and the earth, or the date Caesar crossed the Rubicon, are only apparently part of the furniture of the universe. In fact, there are only monads and their intrinsic properties, and they are related to one another, not by brute relations of space and time, but rather by their concepts. To put it in philosophical parlance, natural science gives us an ideal picture of the world, but pure thought expresses what is absolutely real. (Compare with Newton and Kant, who basically thought the opposite.)

    What does this have to do with information and computation? One way of describing Leibniz's metaphysics is to say that the world is souls all the way down. Every existing thing is a windowless monad which has no real relationship to anything outside of itself. It might have an ideal or made-up relationship to other monads when viewed from the human perspective, but that's a mere artifact of our finite intellect that would go away as soon as one adopted a divine, infinite perspective.

    But imagine one were to adopt that infinite perspective. Imagine one had a complete account of the world as it really is, and that all of that information were contained in a book. Because our intellects are finite, we can't know what information is contained in that book, but we can know the general form it would take (assuming Leibniz is correct). It would contain a list of substances, things like "Caesar", "Socrates", "Sun", etc., and for each substance there would be given a derivation, using symbols of formal logic which Leibniz devised, of everything contained in the substance's concept.

    So basically what Russell and Whitehead were trying to do with mathematics (ground the whole thing in logic), Leibniz was trying to do with the whole of reality, right down to its very core. Another way to put it: every substance does in space and time what is programmed into the concept which defines what it is. The reason I see you walking toward me is not because either of us is really moving through space and time. It's because the substance that I am is programmed to see you moving toward me at just the same time the substance you are is programmed to walk toward me. In fact, there is nothing more to me being me and you being you than to be programmed exactly like this. To see it any other way is merely an illusion. This is what Leibniz means by "preestablished harmony".
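
Since the post already leans on the programming metaphor, here's a toy Python sketch (my own illustration, not anything drawn from Leibniz's texts) of what preestablished harmony looks like as code: two monads that never exchange messages, but whose scripts were written in advance so that their perceptions agree at every step.

```python
# A toy illustration of preestablished harmony (my own analogy, not Leibniz's text):
# two "monads" that never communicate, yet whose pre-written scripts agree at every step.

class Monad:
    def __init__(self, name, script):
        self.name = name
        self.script = script   # everything the monad will ever "perceive", fixed in advance

    def perceive(self, t):
        # No reference to any other monad: the monad only unfolds its own concept.
        return self.script[t]

# The scripts are authored together so that they harmonize, but at "runtime"
# neither monad ever reads the other's state.
me  = Monad("me",  ["I see you far away", "I see you closer", "I see you beside me"])
you = Monad("you", ["I start walking toward him", "I keep walking", "I arrive beside him"])

for t in range(3):
    print(f"t={t}: me perceives {me.perceive(t)!r}; you perceive {you.perceive(t)!r}")
# The appearance of interaction is produced without any causal link between the two.
```
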
Of course the flaw in my scheme here is that Leibniz might have been the last human being who knew everything there was to know in his time! He was the father of information theory, but he was the father of a lot of things, just by virtue of how widely and deeply he thought about the world. He could be the father of me for all I know. And seeing how in Leibniz's metaphysical universe disparate things are connected to one another in the most unlikely ways, no one should be surprised if it were true.

Saturday, March 13, 2010

the hypothesizing brain

"The brain's main job, like that of a scientist, is to generate hypotheses about what is going on in the outside world, a Max Planck Institute for Brain Research study suggests." (more)

I'm not so sure. It's not the scientific work itself I'm skeptical of but rather the interpretation of it quoted above. It doesn't sound right.

First of all, a brain is a part of a scientist already. Saying a brain's job is like a scientist's job is like saying a CPU's job is like a computer's job, or that an electron's job is analogous to an atom's. I know the point here is deeper than that, and I'm harping on a syntactical error; nevertheless, this subject is murky already without saying confusing things like this.

I suspect the deeper point is that the principal function of the brain is to form hypotheses, and that human beings in general, just by virtue of experiencing the world, are either doing science or are doing some sort of proto-scientific activity.

So there are several questions in here:

What's the relationship of hypothesizing to doing science in general? If someone is hypothesizing, is that enough to say they are "acting like a scientist"?

What's the relationship of hypothesizing to ordinary experience? Is the brain's main function to hypothesize? Is human experience primarily constituted by hypothesizing?

The interpretation seems to suggest that the main way we relate to the world is by means of generating intellectual models of the world (of which hypothesizing is a species). Yet our main way of relating to the world seems non-representational and practical in nature. Sure, if I walk in a room and see someone there I really wasn't expecting, that's going to cause a jump in activity in a part of my brain. From the experiential point of view, I'd have encountered something novel, so I'd have to bring in some conceptual framework to make sense of it. (Though imagine you walk into a room and someone takes a swing at you. A fight/flight response would kick in before a conceptual one did.)

But you can't generate a general understanding of how the mind works on the basis of an extreme example. It's like basing your position on waterboarding on the "there's a ticking bomb about to explode" scenario. You have to go by how the mind acts ordinarily. Ordinarily I'm not drafting hypotheses or any conceptual models of the world; I'm just interacting with it. (And I think we should be skeptical of any claim that we're doing this unconsciously.)

I don't relate to the world as though it's an object before me that I form theories about. I relate to the world as though it were part of my own nervous system. When I have an itch, I don't form a hypothesis or a conceptual model of it. I just scratch it. The itch invites a scratch, and I oblige. That's how most experience is. The dishes are dirty and invite washing, and so (if they're lucky) I wash them. (They were unlucky last night.)

Tuesday, March 2, 2010

the motor of hominid evolution

Yesterday I watched the first part of Nova's Becoming Human documentary. According to the documentary, hominid brain size remained relatively stable for 4 million years, and then suddenly there was a brain size explosion coinciding with the emergence of the genus Homo. Why should brain size suddenly have increased after 4 million years of near-stagnation? The answer, according to the narrator, might be climate change. Evidence from samples taken from the sea bed off the coast of Ethiopia shows that the climate in the Rift Valley was volatile in the period leading up to the emergence of Homo. Rapid alternation between moist verdant periods, dry plains, desert, and back again might have selected for a creature which could deal with novelty. This would have necessitated a larger brain.

It's not the first time I've heard a theory about environmental factors affecting human evolution. The Mount Toba catastrophe theory is used to explain why there is an apparent bottleneck in human evolution. Evidence from mitochondrial DNA shows that every human on earth today is descended from an extremely small ancient population. There's more genetic diversity in a community of chimpanzees than there is in the entire human race. There must have been some point in history where the human population was reduced to perhaps a few hundred individuals. One explanation is that the eruption of the Mt. Toba supervolcano caused a severe drop in global temperatures which led to the near-extinction of humans.

It's possible that these hypotheses are true. However, we should be skeptical of hypotheses according to which environmental factors are the chief engine of evolution. Evolutionary biologists have accepted for a while now that the principal driver of evolution is not the impact of the environment but the impact of species on one another. This was recently demonstrated for the first time in a laboratory experiment. It's not the environment that selects our genes so much as the competition between species.

The idea that competition amongst early hominids might have been so fierce as to impact our genes and cause our brains to grow bigger shouldn't come as a huge surprise. Anthropologists have known for a while now that violence in hunter-gatherer communities is many times greater than the violence we're accustomed to in modern society. Lawrence Keeley has compiled data showing that if the rate of mortality from violence were as high today as it was in hunting and gathering societies, World War II would have resulted in the deaths of 2 billion people, not 100 million. (more) Yet humans were hunters and gatherers for over 3/4 of their existence. It's unlikely we were more peaceful then than contemporary hunters and gatherers are now. It's extremely likely that archaic hominids were less civilized than that.

One can easily imagine a scenario in which extreme, vicious inter- and intra-species conflict selects for those who aren't just brutal but who are exceptionally cunning as well. Not just those who are able to overcome others by means of brute strength, but who are good liars and artificers, too, ones who have a more developed sense of self and so who are able to hide things from others and to anticipate just a few more steps into the future. A fortuitous side-effect of such an ability is that we're able eventually to realize that such gifts are best used toward the ends of culture and contemplation, and that these things are impossible without peace and civil order. But the knife that carved those abilities might have been far more insidious and less accidental than climate change.