History Blinks

Concerning history, Pascal wrote “Le nez de Cléopâtre: s’il eût été plus court, toute la face de la terre aurait changé” (“Cleopatra’s nose: had it been shorter, the whole face of the earth would have changed”).  Numismatists have recently begun to challenge this dictum, however, with the discovery of less flattering profiles of the Egyptian queen.  Earlier this year, academics at the University of Newcastle announced that, by studying an ancient denarius, they had arrived at the conclusion that the Queen of the Nile was rather thin-lipped and hook-nosed.  Looking at pictures of the denarius in question, however, one cannot help but feel that perhaps the coin itself has undergone a bit of distortion over the years.

Given that history distorts, it seems peculiar that we would place so much faith in the clarity of twenty-twenty hindsight.  Perhaps this is intended as a contrast with the propensity to error that befalls us when we attempt proclamations of foresight.  Yet even the clarity of hindsight regarding recent events is often, in turn, contrasted with the objectivity achieved when we put a few hundred years between ourselves and the events we wish to put under the investigative eye.  Is there an appropriate period of time after which we can say that clarity has been achieved, shortly before that counter-current of historical distortion takes over and befuddles the mind, like the last beer too many at the end of a long night?

Looking back is often a reflexive act that allows us to regret, and thus put away, our past choices.  Usually, as Tolstoy opines in his excursus to War and Peace, distance provides a viewpoint that demonstrates the insignificance of individual actions, and the illusory nature of choice.  It is only in the moment that Napoleon appears to guide his armies over the battlefield to certain victory.  With the cool eye of recollection, he is seen to be a man merely standing amid the smoke of battle giving instructions that may or may not reach their destinations, while the battle itself is simply the aggregation of tens of thousands of individual struggles.

And yet, in the cross-currents of history looking forward and historians looking backward, one occasionally finds eddies in which the hand of history casts our collective fates with only a handful of lots.  Such an eddy occurred in late 2000, and, in retrospect, it changed the face of the world.  With an oracular — and possibly slightly tipsy — pen, the late Auberon Waugh was there to capture the moment for posterity:


Many Europeans may find it rather hurtful that the United States has lost all interest in Europe, as we are constantly reminded nowadays, but I think it should be said that by no means all Americans have ever been much interested in us. Only the more sophisticated or better educated were aware of the older culture from which so many of their brave ideas about democracy derived.

Perhaps the real significance of the new development is that Americans of the better class have been driven out of the key position they once held, as happened in this country after the war, leaving decisions to be made by the more or less uneducated. We owe both classes of American an enormous debt of gratitude for having saved us from the evils of Nazism and socialism, and we should never forget that. It is no disgrace that George W. Bush has never been to Europe; 50 per cent of Americans have never been abroad. They have everything they need in their own country, but their ignorance of history seems insurmountable.

Everything will be decided in Florida, but it is too late to lecture the inhabitants about the great events of world history which brought them to their present position in world affairs. Florida is a strange and dangerous place to be. It has killer toads and killer alligators. An article in the Washington Post points out that it is also the state where one is most likely to be killed by lightning. Most recently a man in south central Florida was convicted of animal abuse for killing his dog because he thought it was gay. The state carried out a long love affair with the electric chair which it stopped only recently, and somewhat reluctantly, in the face of bad publicity when people’s heads started bursting into flames.

George W. Bush’s considerable experience of the death penalty in Texas may help him here, but I feel we should leave the Americans to make up their own minds on the point. If we had a choice in the matter, I would like to think we would all choose the most venerable candidate, Senator Strom Thurmond (or Thurman if you follow caption writers in The Times) who is 97 years old. If the other candidates cannot reach a decision by Inauguration Day on January 20, he will swear the oath himself. These young people may have many interesting features, of course, and Al Gore’s hairstyle might give us something to think about, but one wearies of them after a while.


Excursus on Deception


The Renaissance theories about eyes and pneuma depend on a natural relation between the eyes and the underlying physical world.  For instance, a person could not give someone the evil eye simply by painting their own eyes with pigment.  The cause of the affliction must lie in the nature of the person who passes on this curse, and not in some extraneous cause.  Similarly, the eyes of the beloved must really transmit something of her soul through love’s arrows in order to ensorcel the lover.  In this sense, pneumatic theories are natural theories.

In my readings for the preceding blog, however, I came across a curious origin for the name of the belladonna plant.  According to some sources, the belladonna, an herb of the nightshade family, was once used as a cosmetic to dilate women’s pupils, which was believed to make them more attractive to men.  The belladonna’s name, consequently, is ascribed to its association with beautiful women.

Deh, bella donna, che a’ raggi d’amore
ti scaldi, s’i’ vo’ credere a’ sembianti
che soglion esser testimon del core

Ah, fair lady, who warmest thyself in the rays of love,
if I may trust to looks
which are wont to be witnesses of the heart

–Dante, Purgatorio XXVIII

The practical power of the belladonna, in effect, replaces a spiritual theory of love with a psychological one, for the eyes no longer mirror the soul but instead can be manipulated and enhanced by other means.  What is given by nature is transformed by art into something other, and the presuppositions about natural relations are undermined in the process, much as in the modern world, breast augmentation is perceived as displacing natural beauty with an artificial conception of what is beautiful.  The analogy is sometimes drawn with the binding of women’s feet in China, a practice that was propelled by a cultural desire on the part of certain men for small feet as well as by coercion from women who had already undergone the grueling procedure.

The case with foot-binding may be something different, however, since the goal here is not to make one natural thing appear to be another natural thing (for instance, making small eyes look like big eyes), but rather to transform one natural thing into something else that is unnatural, and culturally conditioned.  Exceptionally large eyes, while unusual, do occur in nature, whereas feet folded over on themselves do not.  Thus the former is an act of deception, while the latter, technically, is not.

For Aristotle, the senses can never be deceived.  In On the Soul III:3 he states that “sensations are always true.”  To explain deception, then, he extends the faculty of imagination beyond something that is merely present in reverie, and instead makes it a part of everyday experience.  To make this distinction between sense and imagination at the end of III:3, Aristotle draws on a distinction he made previously between special objects of sense and incidental objects of sense.  As an illustration (which he then uses in several other works), Aristotle contrasts the patch of white that we might see in the distance with the son of Diares (the son of Cleon sometimes also appears in these illustrations).  The son of Diares is the incidental object of sense, while the patch of white is what we actually see.  While we might be in error about the former, we cannot be so about the latter.

Perception of the special objects of sense is never in error or admits the least possible amount of falsehood.  That of the concomitance of the objects concomitant with the sensible qualities comes next: in this case certainly we may be deceived; for while the perception that there is white before us cannot be false, the perception that what is white is this or that may be false.

Aristotle makes the imagination an intermediary between sensation and thought, functioning both as a high-level kind of sensation, or as something that often accompanies sensation, as well as a low-level kind of thinking.  Most interestingly, he ascribes this faculty of pseudo-thought to animals.

And because imaginations remain in the organs of sense and resemble sensations, animals in their actions are largely guided by them, some (i.e. the brutes) because of the non-existence in them of mind, others (i.e. men) because of the temporary eclipse in them of mind by feelings or disease or sleep.

tr. J.A. Smith

Contemporary biology supports the belief that animals not only have the faculty of imagination, and so are capable of being deceived, but goes further in suggesting that they also have the capacity to be deceivers.  In their book How Monkeys See the World, Dorothy Cheney and Robert Seyfarth provide empirical evidence about the mental lives of monkeys, apes, and other species, including their ability to mislead one another.  They do so, remarkably, without the ability to introspect, the core faculty that allows humans to form notions about the inner lives of other people and, in turn, to present themselves in ways that manipulate those inner lives.

The behavior and vocal signals of many different species often function to deceive or mislead others.  A review of the evidence, however, raises doubts about the flexibility of animal deception and provides little evidence for the attribution of mental states to others.  Great tits, for example, give apparently deceptive alarm calls at feeding perches, and they are skillful enough to vary their false alarm calls depending upon who is nearby.  If the birds at the feeding perch are lower ranking than the signaler, false alarm calls are rarely given, presumably because the caller can simply supplant his rivals by approaching.  When higher-ranking birds are present and a supplant is not possible, however, lower-ranking birds do give false alarm calls (Moller 1988).  There is, then, some flexibility in the use of deceptive alarms by great tits; however, the limits of great tit deception are equally striking.

This behavior suggests Nietzsche’s analysis of the origins of ressentiment, through the exercise of which Nietzsche’s mass men are able to overcome his nobles, since the latter are incapable of duplicity or even of understanding it.  Ressentiment not only allows for the levelling of society, but also allows the weak of Nietzsche’s philosophy to overcome the strong using mendacity and illusions.  The power of ressentiment comes from the ability to shape the minds of others as well as the drive to do so.  In animals, however, this special faculty seems to be absent.  According to Cheney and Seyfarth, manipulations of this sort only affect behavior, not thoughts.

We have no evidence, for example, that the birds use any other signals to deceive each other or that they use deceptive signals in any other social context.  Even in the case of nonhuman primates, there is little evidence that individuals ever act to manipulate each others’ beliefs, as opposed to each others’ behavior.

Perhaps the power of the belladonna, unlike that of Dante’s bella donna, is of a similar kind for, as Cheney and Seyfarth point out, the limits of great tit deception are striking.

Of Zombies (Part II)

Before using zombies as a metaphor for the dehumanizing treatment of prisoners at Guantanamo, Slavoj Zizek drew attention to a significant but often overlooked characteristic of zombies [1992]:

To a connoisseur of Alfred Hitchcock, this image instantly recalls The Birds, namely the corpse with the pecked-out eyes upon which Mitch’s mother (Jessica Tandy) stumbles in a lonely farmhouse, its sight causing her to emit a silent scream.  When, occasionally, we do catch the sparkle of these eyes, they seem like two candles lit deep within the head, perceivable only in the dark: these two lights somehow at odds with the head’s surface, like lanterns burning at night in a lonely, abandoned house, are responsible for the uncanny effect of the “living dead.”

The eyes of the undead are typically turned up so the irises are hidden and only white is shown (or sometimes the irises are even blotted out completely by the noxious fluid that animates the zombie).  This blankness of expression emphasizes the lack of an inner fire, as well as an incongruence between what zombies once were and what they have become. 

Contrast this with the eyes of the Afghan girl above, captured by a National Geographic photographer’s camera in 1985, which seem to overflow with the story of her life.

Zizek plays on this common association of the eyes with the soul to draw a connection between the empty eyes of the undead and the windows of an abandoned house.  The origin of this perceived affinity between eyes and souls is difficult to track down, however.  William Blake observed that “This life’s dim windows of the soul / Distorts the heavens from pole to pole.”  This in turn appears to be a reference to an older English folk saying, The eyes are the windows of the soul or, alternatively, The eyes are the windows to the soul, which the OED traces back to the sixteenth century.  Yet we also find a variation of this proverb in French, Les yeux sont le miroir de l’âme, “The eyes are the mirror of the soul.”  de.wikiquote.org turns up Das Auge ist ein Fenster in die Seele (“The eye is a window into the soul”) as a German proverb, but erroneously ascribes it to the Bible.

Rather than the Bible, the connection may lead back to ancient Greek psychology.  In the Timaeus, Plato propounds a theory of vision involving both an inner fire and an outer fire created by the Demiurge.  Following Empedocles, Plato states that the inner fire lies behind the eyes, and in the act of perceiving emits rays that reach out, Superman-like, to touch the object being perceived.  At the object, the rays carrying the inner fire commingle with the light around the thing perceived and return this mixed light to the eyes and to the perceptive soul.

In On Sense and the Sensible, Aristotle rejects his master’s notion of an inner fire, among other reasons because he finds it unnecessary.  Rather than a fire going out and then coming back in, Aristotle proposes that light from the object simply enters the eye, as we believe today.  He points out that the mistaken notion that the visual organ is made of fire (natural science in the ancient world always revolved around the four elements) has its source in the bright lights one sees when one presses a finger against the eye.  Centuries later, Isaac Newton describes a similar experiment he performed on himself, pressing a bodkin against his own eye to see what would happen.

Aristotle proposes that the eye, in particular the pupil, is made of water rather than fire, for it has this particular characteristic of water: it is transparent.  Instead of serving as an active organ of attention, shooting out rays towards the world, the eye is a passive organ that receives impressions of color and magnitude which it passes to the soul, forming an impression of the sensible forms upon the soul as a signet ring forms an impression upon a piece of wax.

There must, therefore, be some translucent medium within the eye, and, as this is not air, it must be water.  The soul or its perceptive part is not situated at the external surface of the eye, but obviously somewhere within: whence the necessity of the interior of the eye being translucent, i.e. capable of admitting light.

In On the Motion of Animals and On the Generation of Animals, Aristotle outlines a physical theory of pneuma, a fine substance which permeates the body and carries sense impressions to the heart, which is the organ of the common sense, a sixth sense whose organ he had earlier denied exists in On the Soul.  This pneumatic theory was further developed by Aristotle’s disciples, then by the Stoics, and eventually made its way into Renaissance psychology.

In his 1984 study of Renaissance phantasmic pneuma, Eros and Magic, Ioan Couliano surveys the problem of pneumatic infection through the eyes.  On the one hand, this takes the form of the evil eye, in which a diseased eye or an eye filled with malice can infect a person through the sensory organ and pneuma, thus taking over the sensus communis and causing a wasting away of the infected victim.  On the other, it takes the form of romantic infatuation, in which the beloved’s image takes over the lover’s soul and, when the love is unrequited, causes a similar wasting away of the victim.  This erotic phenomenon led the poet Giacomo da Lentino to ask, “How can it be that so large a woman has been able to penetrate my eyes, which are so small, and then enter my heart and my brain?”  Following the Platonic theory of igneous optical rays, French poets identified this with flèches d’amour, an image which still persists in modern culture, though out of context, as Cupid’s arrows.  In its proper context, we can better understand Leonardo da Vinci’s observation “that the eyes of virgins have the power to attract the love of men.”

Circulating through the same pneumatic passage in which contagion of the blood is spread are images that, in the mirror of common sense, are changed into phantasms.  When Eros is at work, the phantasm of the loved object leads its own existence, all the more disquieting because it exerts a kind of vampirism on the subject’s other phantasms and thoughts.  It is a morbid distension of its activity which, in its results, can be called both concentration and possession: concentration, because the subject’s entire inner life is reduced to contemplation of one phantasm only; possession, because this phantasmic monopoly is involuntary and its collateral influence over the subject’s psychosomatic condition is highly deleterious.

— Eros and Magic in the Renaissance, tr. Margaret Cook

All the foregoing has assumed that the affinity between eyes and souls is a cultural artifact.  An alternative case can be made that the cultural function of the eyes is actually a side-effect of how we see the world.  Studies of the brain indicate that the interpretation of other people’s emotional states tends to concentrate on the eyes, and that a great deal of our brain capacity is devoted to this particular task.  The amygdala, a part of the brain connected to the visual cortex and responsible for regulating fear reactions, has been shown to respond more strongly to larger (fearful) eye whites than to smaller (happy) eye whites.  At the opposite end of the spectrum, Arthur Aron, a New York psychologist, has performed experiments demonstrating that simply encouraging people to stare into each other’s eyes for a length of time can instill feelings of attraction.

The proverb the eyes are the windows to the soul may mask a physicalist truth: that the eyes are not a metaphor for the soul, but rather the soul is a metaphor for the eyes.  In the eyes we see the essence of another person: their emotions, which over time become a model of our expectations of how they will respond to us.  The eyes are a touchstone allowing us to project thoughts and beliefs upon other people.  We introspect to triangulate our beliefs, eye expressions, and emotions, and from this matrix try to determine whether another person responds as we would, or as we would like.  We look to the eyes to determine a person’s depth of emotion, and consequently their depth of spirit.  And when those eyes are empty, there is no longer anything present to project upon or interpret.


As no part, if it participate not in soul, will be a part except in an equivocal sense (as the eye of a dead man is still called an ‘eye’), so no soul will exist in anything except that of which it is soul….

On the Generation of Animals, tr. Arthur Platt

Of Zombies (Part I)

Asking the question “What is a zombie?” raises methodological issues which must be addressed before any attempt to answer the question may proceed.  For instance, it must be determined what kind of zombie we are trying to define: voodoo zombies, movie zombies, philosophical zombies, or some other kind of thing called a “zombie”.  We might also want to arrive at a definition that covers all these various sorts of zombie.  Additionally, we need to concern ourselves with how we should go about determining what a zombie is.  We might follow a natural language philosophy, in which case we would replace the question “What is a zombie?” with the semantic question “What do we mean when we say zombie?”  We might, on the contrary, decide that we want to determine the deep meaning of the phenomenon of zombies, in which case we replace “What is a zombie?” with the structuralist question “What is the cultural function of the zombie?”  Both of these questions come with empirical, hence verifiable, procedures for answering them.  We might also pursue a non-verifiable manner of determining what a zombie is.  To find out what a movie zombie is, we might ask George A. Romero what he intended his zombies to be.  We might also take the tack that the author is unreliable in matters such as this, and so a true revelation of the deep meaning of zombies would require that we ask anyone but the auteur what zombies represent.

One tendency in evaluations of the undead is to discover a political meaning in the zombie phenomenon.  In doing so, the intent isn’t simply to show that there is a political dimension to zombies, but rather that the political exhausts all the deep meaning inherent in zombies.  For a survey of the political analyses of zombie-hood, see Reason Magazine’s survey of zombie literature, which covers interpretations of zombies as alienated labor, Vietnam vets, white supremacists, consumer culture, and a few more.  This follows a tendency in certain circles to see all deep meaning as ultimately political.

David Chalmers goes in a different direction with his discussion of philosophical zombies.  Chalmers makes clear that he is not trying to reinterpret the phenomenon of zombies, but rather is merely appropriating the language of zombies to describe something technically different.  Thus, while there may be overlaps between philosophical zombies, movie zombies and voodoo zombies, these are not necessarily relevant to the study of zombies that he is pursuing.  This is to some extent unfortunate, since the relationship between philosophical zombies and political zombies is a rich one.  There is an apparent connection between zombies as a manifestation of alienated labor, zombies as a manifestation of alienated man, and zombies as beings without interior lives.

Zombies can be defined provisionally as empty vessels into which any sort of meaning may be poured.  Something of the sort is at work in the Meditations, where Descartes resolves the problem of other minds which he initially poses.  Early in this work, Descartes wonders how he can know that the people around him are indeed real people rather than automata, devices created to emulate human behavior but which have no being other than that of a seeming-nature.  Only after proving his own existence, which serves as a ground from which to prove the existence of God, is he able to return to the original problem and declare that other persons most likely do have an interior life like his, because they outwardly behave as he does, and because God would not create a world in which an appearance such as this is not accompanied by a similar reality.  God is not a deceiver.

God has been pronounced dead in the intervening years, and so we are left with various problems we once thought resolved.  The notion of a natural political order upon which democracies such as the United States were founded has fallen aside in His wake.  Without a ready repository of pre-determined meanings founded on religion, modern man is left unmoored and in search of relevance.  Once apparently settled by Descartes, the problem of other minds rises from the dead to trouble us once more, and the attempt to unravel the meaning of the Zombie is entangled with the attempt to unravel the meaning of our own existence.

Concerning Facts

In the early chapters of A Study In Scarlet, John H. Watson observes of his friend Sherlock Holmes:

His ignorance was as remarkable as his knowledge.  Of contemporary literature, philosophy and politics he appeared to know next to nothing.  Upon my quoting Thomas Carlyle, he inquired in the naivest way who he might be and what he had done.  My surprise reached a climax, however, when I found incidentally that he was ignorant of the Copernican Theory and of the composition of the Solar System.  That any civilized human being in this nineteenth century should not be aware that the earth travelled round the sun appeared to me to be such an extraordinary fact that I could hardly realize it.

There is a similar strain of incredulity, both within the United States and without, when it is observed that a vast number of Americans claim they do not believe in Evolution.  It is a source of such consternation that the beliefs of presidential candidates on this matter are speculated upon and discussed as a sort of key that will reveal the secret heart of these men and women.  Are people who do not believe in Evolution simply of lower intellectual abilities than the rest of us?  Or is it rather that the decision not to believe is an indication of other values, tied together in a web of beliefs, that hinge on certain characteristics which make these people ultimately alien in their thought patterns, radically other in their perception of reality?  Do these people pose a threat to the homogeneity of world view that we take for granted in public discourse?

“You appear to be astonished,” he said, smiling at my expression of surprise.  “Now that I do know it I shall do my best to forget it.”

“To forget it!”

“You see,” he explained, “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things, so that he has a difficulty in laying his hands upon it…. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones.”

If they deny one fact, what else might they deny?  If they hold this set of beliefs, what else might they cleave to?  As a Scientific American article put it,

“Embarrassingly, in the 21st century, in the most scientifically advanced nation the world has ever known, creationists can still persuade politicians, judges and ordinary citizens that evolution is a flawed, poorly supported fantasy. They lobby for creationist ideas such as “intelligent design” to be taught as alternatives to evolution in science classrooms.

“In addition to the theory of evolution, meaning the idea of descent with modification, one may also speak of the fact of evolution.  The NAS defines a fact as “an observation that has been repeatedly confirmed and for all practical purposes is accepted as true …. All sciences frequently rely on indirect evidence. Physicists cannot see subatomic particles directly, for instance, so they verify their existence by watching for telltale tracks that the particles leave in cloud chambers. The absence of direct observation does not make physicists’ conclusions less certain.”

The atheism entry on About.com puts it more baldly:

“Evolutionary theory is the organizing principle for all modern biology – denial of it is like denying relativity in modern physics. The fact of evolution — the fact that allele frequencies change in populations over time — is as undeniable as are the actions of gravity or continental shifts. Despite this, only a third of Americans actually think that evolution is supported by the evidence…. People who don’t “accept” evolution are guilty of very unfortunate ignorance, but it’s probably an understandable ignorance. I wouldn’t be surprised if people were similarly ignorant of other aspects of science. It’s a sign of the great scientific illiteracy of American culture.”

“But the Solar System!” I protested.

“What the deuce is it to me?” he interrupted impatiently: “you say that we go round the sun. If we went round the moon it would not make a pennyworth of difference to me or to my work.”

Alasdair MacIntyre in After Virtue marks this conflation of theory and facts (evolution is a fact, but it is also a theory) as a product of the 17th and 18th centuries.  Empiricism is based on the notion that what we see is what there actually is.  It depends on the belief (or is it fact?) that our experiences are reliable.  Natural science, on the other hand, depends on tools of measurement as the arbiter of what is real.  The telescope tells us more than our unreliable eyes can. 

“…[I]n the measurement of temperature the effect of heat on spirits of alcohol or mercury is given priority over the effect of heat on sunburnt skin or parched throats.”

Just as theory is dependent upon measurement to verify it, so measurement is dependent on theory to justify it.  We require a theory of how heat affects mercury in order to be able to rely on our thermometer.  Yet this is far removed from the notions of common sense and perception which undergird empiricism.

“There is indeed therefore something extraordinary in the coexistence of empiricism and natural science in the same culture, for they represent radically different and incompatible ways of approaching the world.  But in the eighteenth century both could be incorporated and expressed within one and the same world-view.  It follows that that world-view is at its best radically incoherent….”

Out of this notion of the fact, as something both self-evident and obscure at the same time, Max Weber formulated the opposition central to his theorizing, and still central to the modern world view: the fact-value distinction.  Just as a fact has a dual nature, a value also has an inherent ambiguity. It is both a choice as well as something imposed upon us by society.  In its second form, it is something that can be studied by the social sciences, and consequently can be analyzed to some degree as a fact.  In the first form, it is radically subjective, and as indeterminate as the swerve of Lucretius.

The matter can be framed as something even stranger than that.  In existentialist terms, the choice is something we are always obliged to make, so that the notion of a “factual” value is ultimately false, or worse, inauthentic.  From a scientific viewpoint, on the other hand, choice is illusory, and merely a stand-in for facts we do not yet know.

It is these two terms, as vague as they are, that inform our public discourse.  On the one hand, facts are something we should all agree upon; the substitution of values for facts is considered an act of civil disobedience.  If we can’t agree on the facts, then what can we agree on?  On the other hand, we should not be driven by facts alone.  Is it enough to say that a market economy in the long run is the most efficient way to distribute goods?  What about social justice?  What about idealism?  What about values?

I was on the point of asking him what that work might be, but something in his manner showed me that the question would be an unwelcome one. I pondered over our short conversation, however, and endeavoured to draw my deductions from it. He said that he would acquire no knowledge which did not bear upon his object. Therefore all the knowledge which he possessed was such as would be useful to him.

— Arthur Conan Doyle, A Study in Scarlet

It is the state of mind of Evolution-deniers I find most fascinating.  The more I think about them, the more I long to be one.  They hold the strange position that, while they want to leave room in public science education for creationism — hence the insistence on the public avowal that evolution is “only a theory” — they appear to have no desire to actually displace the teaching of evolutionary biology.  Perhaps this is merely strategic, a camel’s nose under the tent.

But what if we take them at their word?  In that case, they want to find a way to make the fact of evolution and the value of creationism exist side-by-side.  They want to take nothing away from evolution to the extent that it is a practical tool that provides technology for them and extends their lives, but they also want to take nothing away from faith to the extent that it provides a reason to live and a way to go about it.  It is a world-view only possible with the construction of the fact-value distinction.  It is a beautiful attempt to reconcile the irreconcilable, and to make possible a plurality of beliefs that should not co-exist.  Still, it is a world-view that is at its best radically incoherent. 

“I don’t like to talk much with people who always agree with me. It is amusing to coquette with an echo for a little while, but one soon tires of it.”

— Thomas Carlyle

The Horned Man

I first met Dean not long after my wife and I split up.  I had just gotten over a serious illness that I won’t bother to talk about, except that it had something to do with the miserably weary split-up and my feeling that everything was dead.  With the coming of Dean Moriarty began the part of my life you could call my life on the road.  Before that I’d often dreamed of going West to see the country, always vaguely planning and never taking off. 

— Jack Kerouac, On The Road


The opposite of the dream we savor but never fulfill, a frailty of both the great and the small (how long did Richard Feynman plan his trip to Tuva?), is the fate we dread but never find the courage to face.  Instead we tell ourselves that it is not really so bad.  We curse our own weakness and blame ourselves for not taking it better.  We try to find the bright side, and in the end justify our circumstances in terms of what we gain through our sacrifice.

This is how I think of my daily commute, about which the current New Yorker has an article.  Normally I read The New Yorker when I arrive home from work.  It is my quiet time, during which I shrug off the cares of the day and escape into a fantasy world of cosmopolitan effeteness and intellectual escapism.  A case in point is John Colapinto’s article The Interpreter, also in this issue, about a tribe in the Amazon whose idiosyncratic language, which lacks a feature of most tongues known as recursion, has turned the world of linguistics upside-down and pitted neo-Whorfians against Chomskyites.

When I got home today and opened up my newly arrived magazine, however, I found an article about my own life.

Atlanta is perhaps the purest specimen of a vexed commuter town, a big-fridge paradise.  Los Angeles, the country’s most sprawling megalopolis, may boast a more dizzying array of horrible commutes, but many of them are the result of a difficult landscape — ocean restricting growth on one side, mountains on another.  Chicago, Washington, D.C., and the Bay Area are worthy candidates, but they, too, owe a degree of complication to bodies of water.  But Atlanta, like Houston, sprawls without impediment in all directions, and an inordinate number of the commutes range from one edge of the sprawl to the opposite side.  People live and work on the outskirts.  For them, the city itself is little more than an obstacle and an idea.

Atlanta is a beltway town — it is defined by the interstate, known as the Perimeter, that encircles it.  It has a notoriously paltry system of public transportation.

Road-building doesn’t much help.  Atlanta is a showcase for a phenomenon called “induced traffic”: the more highway lanes you build, the more traffic you get.  People find it agreeable to move further away, and, as others join them, they find it less agreeable (or affordable), and so they move farther still.  The lanes fill up.

— Nick Paumgarten, There and Back Again: The Soul of the Commuter

The lure of Atlanta is the plenitude of jobs and cheap housing.  Our house, as my West Coast relatives frequently remind me, would cost a million dollars in the Los Angeles suburbs.  I paid a fraction of that for our four-bedroom house on an acre-and-a-half lot, sheltered from our neighbors by a thick copse of pine trees.  What they never tell you is that the jobs and the houses are frequent but far between.  The good areas are quickly overdeveloped and become congested.  To get away from them, you have to move farther out.  To afford them, you have to get a higher-paying job, which more often than not is farther away.  I drive from the east side of town to the west side, while others drive from the west to the east.  Why this should be I do not know, but it is a central feature of living in Atlanta.  Still others drive from the north to the south, or from south to north.  Atlanta is a city shaped like a doughnut, and driving from one edge to another is still better than driving from the edge to the center.  Very few people live near where they work.

The article makes further frightening observations about the phenomenon of commuting.  For instance, Robert Putnam, a political scientist at Harvard, points out that “[e]very ten minutes of commuting results in ten per cent fewer social connections.  Commuting is connected to social isolation, which causes unhappiness.”  According to economists Alois Stutzer and Bruno Frey at the University of Zurich, “if your trip is an hour each way, you’d have to make forty per cent more in salary to be as satisfied with life as a noncommuter is.”  They call this The Commuting Paradox.

“People with long journeys to and from work are systematically worse off and report significantly lower subjective well-being,” Stutzer told me. According to the economic concept of equilibrium, people will move or change jobs to make up for imbalances in compensation.  Commute time should be offset by higher pay or lower living costs, or a better standard of living.  It is this last category that people apparently have trouble measuring.  They tend to overvalue the material fruits of their commute — money, house, prestige — and to undervalue what they’re giving up: sleep, exercise, fun.

To this list of things given up should be added simpler things, such as time spent with one’s children, time with one’s spouse, time in church and other communal activities.  The average American male is overweight, has few to no close friends, and leans right-wing.  Sociologists and pundits often attempt to tie these features to such things as the American gun culture, video games, a laxity in contemporary cultural values, an increased commoditization and objectification of sex — but couldn’t these be merely the superstructure imposed by the infrastructure of the daily commute?  Take, for instance, the rise of conservatism in America.  Nominally it was inspired by Reaganism, but isn’t Reaganism just the ideology fomented in the frustration of the Carter era, dominated by and remembered for cars lined up waiting for gas?

The gas line has grown into the commuter jam.  As we sit in our cars, we listen to shock jocks on the car radio who reflect back to us the frustration and resentment we feel as we make our way through unpredictable and ultimately irrational choices on the road — Don Imus in D.C. (until recently), Neal Boortz in Atlanta, and Howard Stern just about anywhere since he is on satellite radio.  The lonely driver feels a false sense of control and liberty as only being locked in a small space can inspire.  The lone commuter hates his fellow man and is determined to cut him off before he can be cut off.  Who needs a political franchise when he has a horn, strategically positioned before him like a lab rat’s food bar, with which he can instantly articulate his feelings about the state of the nation?  Has a study ever been done correlating a person’s daily commute with his political leanings? 

Yet this isn’t exactly my life, since my life isn’t nearly so bad as all this.  About six months ago, I switched from a job for which I drove an hour each way to the job I currently have, for which I drive only about 40 minutes each way.  My company also has a progressive policy about occasionally working from home, which is nothing less than a reprieve and release from the obligation of meeting my fellow man on the road.  An extra 40 minutes each day probably doesn’t seem like a lot, but it is enough time for me to get home after work and open my New Yorker and read for a while.  It is enough so that, after reading, I am relaxed and can play with my children rather than turn on the news.  I can put my children to bed and talk with my wife rather than watch Grey’s Anatomy plus whatever comes after that until I fall asleep.  I am no longer the sort of man who beats his horn in anger against the world, and perhaps even my politics have mellowed a bit.

Of course, were I still at my old job, my commute would be longer, and I would not have had the time to read this article or the leisure to reflect on homo commutus.  A different sort of commuting paradox.

Concerning Civility

There are currently two conversations going on over the Internet and in other media about civility. The Don Imus discussion tends to be interesting (for instance here and here), but it has hardly reached the heart of the matter and seems to skim the surface of many issues. Another, concerning anonymous attacks on Kathy Sierra and what to do about them, tends to be unsophisticated and rather silly (try this out: http://radar.oreilly.com/archives/2007/04/code_of_conduct.html). Yet both touch on the same matter: what is civility and how do we get some?

I like civility, in the right measure. I especially like it between friends. It also seems like a useful thing in public discourse, because better conversations occur when people restrain themselves a bit and don’t go off on the first thing they disagree with. But most of all, I like civility because it makes transgression possible. When everyone curses all the time, it dilutes the whole endeavor. But when people generally restrain themselves, then the properly timed mal mot can be a wonderful and liberating thing.

In all of these discussions, the notion of a “line being crossed” keeps surfacing, with no real investigation of what that line is.  Instead, the discussion about civility tends to break down into are you with us or not, ’cause I know it when I smell it, and if you can’t smell it, then you’re not with us.  And of course I can smell it, and I am always with us, so where I stand should be clear.

But as at a dinner party when someone lets off a fart, I find that, despite myself, after a first whiff I always end up taking a follow-up whiff — to see if it is gone? to test whether it was merely imagined? to try to identify the culprit? or perhaps simply out of the perverse habit of the connoisseur attempting to pick out the colors that make up the current palette.

What are these lines of civility that we must not cross? And why does the mere existence of the line make me not only want to cross it, but also to vandalize it a bit?

The Open Internet and Its Enemies

Crazyfinger makes an interesting comment on Jeff Jarvis’s blog.

Deadwood. The blogosphere of today feels like that town, with its own version of Swearengens, E. B. Farnums…

There is a lot of background to this, worth unpacking; it can all be distilled, however, to the observation that people are sometimes mean on the internet.

The long version goes something like this.  Kathy Sierra, who is an admired web design guru, Web 2.0 advocate, and co-author of the immensely popular Head First series of technical books, has a blog.  Recently people started making obnoxious comments on her blog, obnoxious comments on other blogs about her, Photoshopped images involving her, and finally death threats.  She is now considering getting out of the blogosphere altogether, a dramatic instance of Gresham’s law at work.  In the meantime, however, it turns out she has some notable friends who are now trying to use their influence to do something about the netnasties.  Tim O’Reilly, who runs a successful technical press and also helped coin the term Web 2.0, proposes a blogger code of conduct to which bloggers can sign on as a mark of their bona fides.

In other milieus, this mild suggestion of self-regulation would seem perfectly reasonable, but the internet is not just any milieu.  It has mythic origins as an unregulated medium for the transmission of ideas and great hopes — democratic ideals, anarchic utopias, freedom of speech, freedom of expression — are tied to it.  The wildness of the internet contributes to its appeal.  Like the American frontier, it is a terrain where anyone can re-create themselves, and build a new culture in which they can happily dwell.

This conjoining of freedom, the Internet, the blogosphere, Web 2.0, and the Open Source movement was at one time promoted by the same people who are finding problems with it now.  In a 2006 commencement speech at UC Berkeley’s School of Information, Tim O’Reilly said:

The internet has enormous power to increase our freedom. It also has enormous power to limit our freedom, to track our every move and monitor our every conversation. We must make sure that we don’t trade off freedom for convenience or security.

In his own explication of what his neologism Web 2.0 meant, O’Reilly wrote:

If an essential part of Web 2.0 is harnessing collective intelligence, turning the web into a kind of global brain, the blogosphere is the equivalent of constant mental chatter in the forebrain, the voice we hear in all of our heads. It may not reflect the deep structure of the brain, which is often unconscious, but is instead the equivalent of conscious thought. And as a reflection of conscious thought and attention, the blogosphere has begun to have a powerful effect.

Kathy Sierra has been more consistent in her view of the openness of the Internet.  In 2005 she discussed the enforcement of “be-nice” rules on a forum she started.

Enforcing a “be nice” rule is a big commitment and a risk. People complain about the policy all the time, tossing out “censorship” and “no free speech” for starters. We see this as a metaphor mismatch. We view javaranch as a great big dinner party at the ranch, where everyone there is a guest. The ones who complain about censorship believe it is a public space, and that all opinions should be allowed. In fact, nearly all opinions are allowed on javaranch. It’s usually not about what you say there, it’s how you say it.

And this isn’t about being politically correct, either. It’s a judgement call by the moderators, of course. It’s fuzzy trying to decide exactly what constitutes “not nice”, and it’s determined subjectively by the culture of the ranch.

At the same time, it was also she who pointed out, quite accurately, this principle of the Internet:

If we want our users (members, guests, students, potential customers, kids, co-workers, etc.) to pay attention, we have to be provocative. We can moan all we want about how the responsible person should pay attention to what’s important rather than what’s compelling. But it’s not about responsibility or maturity. It’s not even about interest.

Provocation is in the eye of the provoked, obviously, so there’s no clear formula. But there’s plenty we can try, depending on the circumstances….

This notion of the Internet age as the herald of a new form of social interaction even permeates seemingly unrelated movements like the Agile Methodology for software development, which promotes:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

Even the Open Source movement, which promotes a particular way of distributing software, includes these interesting stipulations in the Open Source Definition it promulgates:

5. No Discrimination Against Persons or Groups

The license must not discriminate against any person or group of persons.

6. No Discrimination Against Fields of Endeavor

The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

By open, they truly mean open.  Given this emphasis on ideals of individuality, freedom, and equality in Internet culture, one can see why any suggestion that it might be in everyone’s interest to curtail any of these is treated as anathema.  It also explains the strange bind those working on O’Reilly’s proposed blogger’s code of conduct find themselves in.  Once it was agreed that some sort of action should be taken to deal with the netnasties, it was discovered that nothing could really be done without enforcement, and no one wants enforcement, since it is a form of coercion.  Consequently, the code of conduct has turned out to be a document that offends half of the Internet by suggesting mild coercion in the first place, and then draws the derision of the other half by having no teeth.  The draft code of conduct is currently in a state of flux, and may change radically over the next several weeks.  At one point, however, it included an article stating that bloggers don’t take themselves seriously.  The intent of this was somewhat lost on me, but the irony was not.  Bloggers don’t take themselves seriously, and yet they feel they need a code of conduct to explicate what they believe in, including the tenet that they don’t take themselves seriously.

As it is shaping up, though, the code of conduct resembles to a remarkable degree the bylaws of various community forums across the Internet.  What separates forums from blogs is, primarily, that forums are composed of people who consent to the oversight of moderators as a mechanism for regulating discussions.  Blogs, on the other hand, are visible and generally accessible to everyone.  Forums usually have mechanisms in place to eject members who repeatedly behave badly.  The Internet has no such mechanism.  Finally, forums usually provide an audience of only a hundred to a thousand people.  Blogs offer a potential audience of hundreds of millions.

It is likely for this last reason that many people have turned to blogs, rather than forums, as their main outlet for Internet discourse.  If gold was the main currency of the Old West, attention is the main currency of the new frontier — or, as advertisers like to call it, eyeballs.  The public life of many people on the Internet involves acquiring eyeballs, which can then be converted to real money if one chooses to advertise on one’s site, or else may simply be used as a mode of social promotion.  Returning to moderated forums is akin to returning to the towns back East where laws were more stringent, and safety more assured, but opportunities for advancement and transformation were limited.  The blogosphere holds out the promise that anyone can be famous if they turn the right phrase, capture the right attention, come up with the next big idea.

For these very reasons, however, the rules of a community cannot be enforced where no laws exist.  How then does justice get enforced on the digital frontier?

As Crazyfinger (who has an interesting blog of his own about Adam Smith’s Theory of Moral Sentiments) suggests, an analogy can be drawn between the TV drama Deadwood and the current state of the Internet.  Had Deadwood survived another season, we might even have received a definitive answer, but as it is, we only have suggestions.

Deadwood, in the show of the same name, is a gold town in South Dakota marching slowly toward incorporation and civilization.  Alma Garret (a proxy for Kathy Sierra) has accidentally struck it rich when her lot is discovered to contain one of the richest gold veins in the region.  As a consequence, she is a victim of unscrupulous persons anxious to take hold of her, ermm, eyeballs.  Having neither reputation to lose nor character to restrain them, they act provocatively in their efforts to raise their own social status.

Throughout the series, three main ways are provided to afford Alma Garret the protection of civilization she requires, in a place without civilization.  The first is Wild Bill Hickok, who, through the authority granted him by his reputation, is able to coerce people to behave appropriately.  This is analogous to the attempt by Kathy Sierra’s friend Tim O’Reilly, as well as others, to use their reputations to shame people into agreeing to some sort of blogging standards.  Sadly, the attempt is also analogous to the letter the town fathers wrote in the local paper in Season Three to turn sentiment against George Hearst, who has designs on Alma’s gold.  Wild Bill, of course, is shot at the end of Season One.  Best character on TV ever.  Nuff said.

Alma Garret’s second line of defense is Sheriff Seth Bullock.  Bullock is of heroic proportions.  Through the exercise of precise and barely restrained violence, Bullock is able to herd and intimidate those who would upset the peace of Deadwood.  Bullock is the equivalent of the sort of hero we occasionally encounter in forums and message boards, who through wit, knowledge, and force of character is able both to inspire people to behave better and to punish with an acid tongue those who do not.  Alas, on the Internet, there are too few of these, and the few there are tend to retreat into their own preoccupations over time.  A case again, perhaps, of Gresham’s Law: bad money drives good money out of circulation.

The last option Alma Garret has at her disposal is to accede to the wishes of the villainous and violent George Hearst on the best terms she is able.  This is what Alma Garret does at the end of Season Three, rather than force a confrontation that would likely see many of the main characters killed off.  This, as far as I can see, is the only way to bring civilization to the blogosphere, and it is an unhappy turn.  Civilization, once we accept that there will always be netnasties, is only possible when we turn a monopoly of coercive power over to a single entity.  If, as O’Reilly, Sierra and others have argued, it is necessary to make the blogosphere, and the Internet by extension, obey the rules of a community forum, then something like this must occur.  The most likely way for this to happen is if one of the main social networking sites joins forces with one of the major blogging hosts, such as Typepad or LiveJournal, and compels everyone who wants to blog to sign on to their service, following the other principle of the Internet: that growth engenders growth.  Having acquired a majority share of the blogosphere, such a monopolistic regime could then enforce community rules such as the ones Tim O’Reilly is attempting to formulate.  This entails the victory of Hearst and of civilization, and the closing of the frontier.

And, I think, it is also the moral of Deadwood, if there is a moral to be found.  The frontier gives us heroes, but it also engenders monsters.  The idealized vision of the frontier must at some point be confronted with the ugliness it foments: a place of greed, corruption, misogyny, pornography and guys who say “cocksucker”.  If we can’t find it in ourselves to embrace the ugliness along with the heroism of the frontier, then we must make the great compromise and bear the yoke which assures civility, and which, Rousseau promises, is only a light yoke, after all.


The Clementinum Baroque Library, Prague

I have been looking through my library shelves, picking out books I once read for signs of what I once saw in them.  I no longer remember.  All I find are rare page corners delicately turned over.  I have never been able to pick up the habit of actually touching ink to my books, which has always struck me as a sort of desecration of the text.  Nevertheless, I do find chocolate and nicotine stains on the edges, occasionally, and from books I have not opened since college, I am often rewarded with the foul whiff of stale cigarettes smoked long ago, a scent no doubt mirrored internally as a black smear on my lungs.  As I peruse these pages with their foreshortened corners, a sign from my past self to my future self that here there be something of note, I wonder what I used to know that I no longer know, and what I thought I knew but now know better.  This is one such passage.


In ancient times the painters were supplied with their themes by the poets, though at liberty to indulge in as much decorative play as was decent within the limits of a given theme; later, the failure of the poets to keep their position at the head of affairs forced painters to paint whatever their patrons commissioned, or whatever came to hand, and finally to experiment in pure decoration; now affectations of madness in poets are condoned by false analogy with pictorial experiments in unrepresentational form and colour.  So Sacheverell Sitwell wrote in Vogue (August, 1945):

Once again we are leading Europe in the Arts …

He lists the fashionable painters and sculptors and adds:

The accompanying works of the poets are not hard to find … Dylan Thomas, whose texture is as abstract as that of any modern painter … There is even no necessity for him to explain his imagery, for it is only intended to be half understood.

It is not as though the so-called surrealists, impressionists, expressionists and neo-romantics were concealing a grand secret by pretended folly, in the style of Gwion; they are concealing their unhappy lack of a secret.

For there are no poetic secrets now, except of course the sort which the common people are debarred by their lack of poetic perception from understanding, and by their anti-poetic education (unless perhaps in wild Wales) from respecting.  Such secrets, even the Work of the Chariot, may be safely revealed in any crowded restaurant or cafe without fear of the avenging lightning-stroke: the noise of the orchestra, the clatter of plates and the buzz of a hundred unrelated conversations will effectively drown the words — and, in any case, nobody will be listening.

— Robert Graves, The White Goddess

Today, the Work of the Chariot has its own flash-enabled web page, as well as a Wikipedia entry.  There was even an X-Files episode in which it served as a narrative device. 

May Bertram

“So that I’m the only person who knows?”

“The only person in the world.”

“Well,” she quickly replied, “I myself have never spoken.  I’ve never, never repeated of you what you told me.”  She looked at him so that he perfectly believed her.  Their eyes met over it in such a way that he was without a doubt.  “And I never will.”

She spoke with an earnestness that, as if almost excessive, put him at ease about her possible derision.  Somehow the whole question was a new luxury to him—that is from the moment she was in possession.  If she didn’t take the sarcastic view she clearly took the sympathetic, and that was what he had had, in all the long time, from no one whomsoever.  What he felt was that he couldn’t at present have begun to tell her, and yet could profit perhaps exquisitely by the accident of having done so of old.  “Please don’t then.  We’re just right as it is.”

“Oh I am,” she laughed, “if you are!”  To which she added: “Then you do still feel in the same way?”

It was impossible he shouldn’t take to himself that she was really interested, though it all kept coming as a perfect surprise.  He had thought of himself so long as abominably alone, and lo he wasn’t alone a bit.  He hadn’t been, it appeared, for an hour—since those moments on the Sorrento boat.  It was she who had been, he seemed to see as he looked at her—she who had been made so by the graceless fact of his lapse of fidelity.  To tell her what he had told her—what had it been but to ask something of her? something that she had given, in her charity, without his having, by a remembrance, by a return of the spirit, failing another encounter, so much as thanked her.  What he had asked of her had been simply at first not to laugh at him.  She had beautifully not done so for ten years, and she was not doing so now.  So he had endless gratitude to make up.  Only for that he must see just how he had figured to her.  “What, exactly, was the account I gave—?”

“Of the way you did feel?  Well, it was very simple.  You said you had had from your earliest time, as the deepest thing within you, the sense of being kept for something rare and strange, possibly prodigious and terrible, that was sooner or later to happen to you, that you had in your bones the foreboding and the conviction of, and that would perhaps overwhelm you.”

“Do you call that very simple?” John Marcher asked.

She thought a moment.  “It was perhaps because I seemed, as you spoke, to understand it.”

“You do understand it?” he eagerly asked.

Again she kept her kind eyes on him.  “You still have the belief?”

— Henry James, The Beast in the Jungle

Authentically Virtual