The Self-Correcting Process


Science is all about making proposals that can be tested (especially after Karl Popper’s formulation of the Falsifiability Criterion), and then undergoing the experience of having those proposals rejected.  This is the essence of any successful process: not that it eliminates errors altogether, but rather that it is able to make corrections despite these errors, so that the target need never shift.

Professor Alain Connes recently gave his opinion of Xian-Jin Li’s proposed proof of the Riemann Hypothesis, a proof which relies in part on Professor Connes’ own work, in a comment on his own blog (reached by way of Slashdot):

I dont like to be too negative in my comments. Li’s paper is an attempt to prove a variant of the global trace formula of my paper in Selecta. The "proof" is that of Theorem 7.3 page 29 in Li’s paper, but I stopped reading it when I saw that he is extending the test function h from ideles to adeles by 0 outside ideles and then using Fourier transform (see page 31). This cannot work and ideles form a set of measure 0 inside adeles (unlike what happens when one only deals with finitely many places).

 

Self-correction extends to other professions, as well.  Scott Hanselman recently posted to correct an opinion he discovered here, one which he felt required some testing.  Through his own tests, he discovered that nesting a using directive inside a namespace declaration provides no apparent performance benefit over placing it outside the namespace.
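
For reference, here is a minimal C# sketch of the two placements being compared (the namespace and class names are hypothetical).  Since a using directive only affects compile-time name resolution, both versions should compile to the same IL, which is consistent with Hanselman’s finding of no runtime difference:

// Placement A: using directive outside the namespace
using System.Text;

namespace OutsideExample
{
    public class Greeter
    {
        public string Greet(string name)
        {
            // StringBuilder is resolved through the file-level using directive
            var sb = new StringBuilder();
            sb.Append("Hello, ").Append(name);
            return sb.ToString();
        }
    }
}

// Placement B: using directive inside the namespace
namespace InsideExample
{
    using System.Text;  // scoped to this namespace only

    public class Greeter
    {
        public string Greet(string name)
        {
            // identical code; only the scope of name resolution differs
            var sb = new StringBuilder();
            sb.Append("Hello, ").Append(name);
            return sb.ToString();
        }
    }
}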

This leads him to draw these important lessons:

  • Don’t believe everything you read, even on a Microsoft Blog.
  • Don’t believe this blog, either!
  • Decide for yourself with experiments if you need a tiebreaker!

 

The sentiment recalls Ralph Waldo Emerson’s memorable words:

 

There is a time in every man’s education when he arrives at the conviction that envy is ignorance; that imitation is suicide; that he must take himself for better, for worse, as his portion; that though the wide universe is full of good, no kernel of nourishing corn can come to him but through his toil bestowed on that plot of ground which is given to him to till. The power which resides in him is new in nature, and none but he knows what that is which he can do, nor does he know until he has tried.

Trust thyself: every heart vibrates to that iron string. Accept the place the divine providence has found for you, the society of your contemporaries, the connection of events.

 

A similar sentiment is expressed in Hobbes’ Leviathan, though with a wicked edge:

 

And as to the faculties of the mind, setting aside the arts grounded upon words, and especially that skill of proceeding upon general and infallible rules, called science, which very few have and but in few things, as being not a native faculty born with us, nor attained, as prudence, while we look after somewhat else, I find yet a greater equality amongst men than that of strength. For prudence is but experience, which equal time equally bestows on all men in those things they equally apply themselves unto. That which may perhaps make such equality incredible is but a vain conceit of one’s own wisdom, which almost all men think they have in a greater degree than the vulgar; that is, than all men but themselves, and a few others, whom by fame, or for concurring with themselves, they approve. For such is the nature of men that howsoever they may acknowledge many others to be more witty, or more eloquent or more learned, yet they will hardly believe there be many so wise as themselves; for they see their own wit at hand, and other men’s at a distance. But this proveth rather that men are in that point equal, than unequal. For there is not ordinarily a greater sign of the equal distribution of anything than that every man is contented with his share. [emphasis mine]

 

We find it again expressed in Descartes’ Discours de la méthode. Descartes, it might be remembered, occasionally exchanged letters with Hobbes:

 

Le bon sens est la chose du monde la mieux partagée; car chacun pense en être si bien pourvu, que ceux même qui sont les plus difficiles à contenter en toute autre chose n’ont point coutume d’en désirer plus qu’ils en ont.

[Good sense is the best distributed thing in the world, for everyone thinks himself so well supplied with it that even those who are the hardest to please in every other matter do not usually desire more of it than they already have.]

 

Both Hobbes and Descartes formulate their defense of common sense somewhat ironically.  In a recent post, Steve Yegge takes out the irony (or perhaps takes out the kernel of truth and leaves nothing but the irony) in his argument against Joel Spolsky’s widely acknowledged criteria for a desirable employee: "smart, and gets things done."

According to Yegge, the crux of the problem is this:

 

Unfortunately, smart is a generic enough concept that pretty much everyone in the world thinks [he’s] smart.

So looking for Smart is a bit problematic, since we aren’t smart enough to distinguish it from B.S. The best we can do is find people who we think are smart because they’re a bit like us.

So, like, what kind of people is this Smart, and Gets Things Done adage actually hiring?

 

And yet the self-correcting process continues, on the principle that we are all smart enough, collectively, to solve our problems in the aggregate, even if we can’t solve them as individuals.

Presidential candidate Barack Obama recently held a news conference to correct a misunderstanding that had arisen a few hours earlier about his stance on the Iraq War.  According to CNN:

 

Obama on Thursday denied that he’s shying away from his proposed 16-month phased withdrawal of all combat troops from Iraq, calling it "pure speculation" and adding that his "position has not changed."

However, he told reporters questioning his stance that he will "continue to refine" his policies as warranted.

His comments prompted the Republican National Committee to put out an e-mail saying the presumed Democratic nominee was backing away from his position on withdrawal.

Obama called a second news conference later Thursday to reiterate that he is not changing his position.

 

This is, of course, merely a blip in the history of self-correction.  A more significant one can be found in Bakhtin’s attempt to interpret the works of Rabelais, and to demonstrate (convincingly) that everyone before him misunderstood the father of Gargantua. 

Bakhtin’s analysis of Rabelais in turn brought to light one of the great discoveries of his career: The Carnival — though a colleague once found an earlier reference to the concept in one of Ernst Cassirer’s works.  Against the notion of a careful and steady self-correcting mechanism in history, Bakhtin introduced the metaphor of the Medieval Carnival:

 

The essential principle of grotesque realism is degradation, that is, the lowering of all that is high, spiritual, ideal, abstract; it is a transfer to the material level, to the sphere of earth and body in their indissoluble unity.

Degradation and debasement of the higher do not have a formal and relative character in grotesque realism. "Upward" and "downward" have here an absolute and strictly topographical meaning….Earth is an element that devours, swallows up (the grave, the womb) and at the same time an element of birth, of renascence (the maternal breasts)….Degradation digs a bodily grave for a new birth….To degrade an object does not imply merely hurling it into the void of nonexistence, into absolute destruction, but to hurl it down to the reproductive lower stratum, the zone in which conception and a new birth take place.

 

The Carnival serves to correct inequalities and resentments in society and its subcultures not by setting society upon a surer footing, but rather by affording us an opportunity to air our grievances publicly in a controlled ceremony which allows society and its hierarchical institutions to continue as they are.  It is a release, rather than an adjustment.  A pot party at a rock festival rather than a general strike.

As for the Internet, it is sometimes hard to say what is actually occurring in the back-and-forth that occurs between various blogs.  Have we actually harnessed the wisdom of crowds and created a self-correcting process that responds more rapidly to intellectual propositions, nudging them over a very short time to the correct solution, or have we in fact recreated the Medieval Carnival, a massive gathering of people in one location which breaks down the normal distinctions between wisdom and folly, knowledge and error, competence and foolhardiness? 

Concerning Civility



There are currently two conversations going on over the Internet and in other media about civility. The Don Imus discussion tends to be interesting (for instance here and here), but has hardly reached the heart of the matter and seems to skim the surface of many issues. Another, concerning the anonymous attacks on Kathy Sierra and what to do about them, tends to be unsophisticated and rather silly (try this out: http://radar.oreilly.com/archives/2007/04/code_of_conduct.html). Yet both touch on the same matter: what is civility and how do we get some?


I like civility, in the right measure. I especially like it between friends. It also seems like a useful thing in public discourse, because better conversations occur when people restrain themselves a bit and don’t go off on the first thing they disagree with. But most of all, I like civility because it makes transgression possible. When everyone curses all the time, it dilutes the whole endeavor. But when people generally restrain themselves, then the properly timed mal mot can be a wonderful and liberating thing.


In all of these discussions, the notion of a “line being crossed” keeps surfacing, with no real investigation of what that line is. Instead, the discussion about civility tends to break down in terms of are you with us or not, ’cause I know it when I smell it, and if you can’t smell it, then you’re not with us. And of course I can smell it, and I am always with us, so where I stand should be clear.


But as at a dinner party when someone lets off a fart, I find that, despite myself, after a first whiff I always end up taking a follow-up whiff — to see if it is gone? to test whether it was merely imagined? to try to identify the culprit? or perhaps simply out of a perverse habit of the connoisseur attempting to pick out the colors that make up the current palette.

What are these lines of civility that we must not cross? And why does the mere existence of the line make me not only want to cross it, but also to vandalize it a bit?

The Open Internet and Its Enemies


Crazyfinger makes an interesting comment on Jeff Jarvis’s blog.




Deadwood. The blogosphere of today feels like that town, with its own version of Swearengens, E. B. Farnums…


There is a lot of background to this, worth unpacking; it can all be distilled, however, to the observation that people are sometimes mean on the internet.


The long version goes something like this.  Kathy Sierra, an admired web design guru, Web 2.0 advocate, and co-author of the immensely popular Head First series of technical books, has a blog.  Recently people started making obnoxious comments on her blog, posting obnoxious comments about her on other blogs, circulating Photoshopped images involving her, and finally issuing death threats.  She is now considering getting out of the blogosphere altogether, a dramatic instance of Gresham’s law at work.  In the meantime, however, it turns out she has some notable friends who are now trying to use their influence to do something about the netnasties.  Tim O’Reilly, who runs a successful technical press and also helped coin the term Web 2.0, proposes a blogger code of conduct to which bloggers can sign on as a mark of their bona fides.


In other milieus, this mild suggestion of self-regulation would seem perfectly reasonable, but the internet is not just any milieu.  It has mythic origins as an unregulated medium for the transmission of ideas and great hopes — democratic ideals, anarchic utopias, freedom of speech, freedom of expression — are tied to it.  The wildness of the internet contributes to its appeal.  Like the American frontier, it is a terrain where anyone can re-create themselves, and build a new culture in which they can happily dwell.


This conjoining of freedom, the Internet, the blogosphere, Web 2.0, and the Open Source movement was at one time promoted by the same people who are finding problems with it now.  In a 2006 commencement speech at UC Berkeley’s School of Information, Tim O’Reilly said:




The internet has enormous power to increase our freedom. It also has enormous power to limit our freedom, to track our every move and monitor our every conversation. We must make sure that we don’t trade off freedom for convenience or security.


In his own explication of what his neologism Web 2.0 meant, O’Reilly wrote:




If an essential part of Web 2.0 is harnessing collective intelligence, turning the web into a kind of global brain, the blogosphere is the equivalent of constant mental chatter in the forebrain, the voice we hear in all of our heads. It may not reflect the deep structure of the brain, which is often unconscious, but is instead the equivalent of conscious thought. And as a reflection of conscious thought and attention, the blogosphere has begun to have a powerful effect.


Kathy Sierra has been more consistent in her view of the openness of the Internet.  In 2005 she discussed the enforcement of “be-nice” rules on a forum she started.




Enforcing a “be nice” rule is a big commitment and a risk. People complain about the policy all the time, tossing out “censorship” and “no free speech” for starters. We see this as a metaphor mismatch. We view javaranch as a great big dinner party at the ranch, where everyone there is a guest. The ones who complain about censorship believe it is a public space, and that all opinions should be allowed. In fact, nearly all opinions are allowed on javaranch. It’s usually not about what you say there, it’s how you say it.


And this isn’t about being politically correct, either. It’s a judgement call by the moderators, of course. It’s fuzzy trying to decide exactly what constitutes “not nice”, and it’s determined subjectively by the culture of the ranch.


At the same time, it was also she who pointed out, quite accurately, this principle of the Internet:




If we want our users (members, guests, students, potential customers, kids, co-workers, etc.) to pay attention, we have to be provocative. We can moan all we want about how the responsible person should pay attention to what’s important rather than what’s compelling. But it’s not about responsibility or maturity. It’s not even about interest.



Provocation is in the eye of the provoked, obviously, so there’s no clear formula. But there’s plenty we can try, depending on the circumstances….


These notions of the Internet age as the herald of a new form of social interaction even permeate seemingly unrelated movements like the Agile methodology for software development, which promotes:




Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan


Even the Open Source movement, which promotes a particular way of distributing software, includes these interesting stipulations in the Open Source Definition it promulgates:




5. No Discrimination Against Persons or Groups

The license must not discriminate against any person or group of persons.


6. No Discrimination Against Fields of Endeavor

The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.


By open, they truly mean open.  Given this emphasis on ideals of individuality, freedom, and equality in Internet culture, it is easy to see why any suggestion that it might be in everyone’s interest to curtail any of these is treated as anathema.  It also explains the strange bind those working on O’Reilly’s proposed blogger’s code of conduct find themselves in.  Once it was agreed that some sort of action should be taken to deal with the netnasties, it was discovered that nothing could really be done without enforcement, and no one wants enforcement, since it is a form of coercion.  Consequently, the code of conduct has turned out to be a document that offends half of the Internet by suggesting mild coercion in the first place, and then draws the derision of the other half by having no teeth.  The draft code of conduct is currently in a state of flux, and may change radically over the next several weeks.  At one point, however, it included an article stating that bloggers don’t take themselves seriously.  The intent of this was somewhat lost on me, but the irony was not.  Bloggers don’t take themselves seriously, and yet they feel they need a code of conduct to explicate what they believe in, including the tenet that they don’t take themselves seriously.


As it is shaping up, though, the code of conduct resembles to a remarkable degree the bylaws of various community forums across the Internet.  What separates forums from blogs is, primarily, that forums are composed of people who consent to the oversight of moderators as a mechanism for regulating discussions.  Blogs, on the other hand, are visible and generally accessible to everyone.  Forums usually have mechanisms in place to eject members who repeatedly behave badly.  The Internet has no such mechanism.  Finally, forums usually provide an audience of only a hundred to a thousand people.  Blogs offer a potential audience of hundreds of millions of people.


It is likely for this last reason that many people have turned to blogs, rather than forums, as their main outlet for Internet discourse.  If gold was the main currency of the Old West, attention is the main currency of the new frontier — or, as advertisers like to call it, eyeballs.  The public life of many people on the Internet involves acquiring eyeballs, which can then be converted to real money if one chooses to advertise on one’s site, or else may simply be used as a mode of social promotion.  Returning to moderated forums is akin to returning to the towns back East where laws were more stringent, and safety more assured, but opportunities for advancement and transformation were limited.  The blogosphere holds out the promise that anyone can be famous if they turn the right phrase, capture the right attention, come up with the next big idea.

For these very reasons, however, the rules of a community cannot be enforced where no laws exist.  How then does justice get enforced on the digital frontier?


As Crazyfinger (who has an interesting blog of his own about Adam Smith’s Theory of Moral Sentiments) suggests, an analogy can be drawn between the TV drama Deadwood and the current state of the Internet.  Had Deadwood survived another season, we might even have received a definitive answer, but as it is, we only have suggestions.


Deadwood, in the show of the same name, is a gold town in South Dakota marching slowly toward incorporation and civilization.  Alma Garret (a proxy for Kathy Sierra) has accidentally struck it rich when her lot is discovered to contain one of the richest gold veins in the region.  As a consequence, she is a victim of unscrupulous persons anxious to take hold of her, ermm, eyeballs.  Having neither reputation to lose nor character to restrain them, they act provocatively in their efforts to raise their own social status.


Throughout the series, three main ways are provided to afford Alma Garret the protection of civilization she requires, in a place without civilization.  The first is Wild Bill Hickok, who, through the authority granted him by his reputation, is able to coerce people to behave appropriately.  This is analogous to the attempt by Kathy Sierra’s friend Tim O’Reilly, as well as others, to use their reputations to shame people into agreeing to some sort of blogging standards.  Sadly, the attempt is also analogous to the letter written by the town fathers in Season Three in the local paper to turn sentiment against George Hearst, who has designs on Alma’s gold.  Wild Bill, of course, is shot at the end of Season One.  Best character on TV ever.  Nuff said.


Alma Garret’s second line of defense is Sheriff Seth Bullock.  Bullock is of heroic proportions.  Through the exercise of precise and barely restrained violence, Bullock is able to herd and intimidate those who would upset the peace of Deadwood.  Bullock is the equivalent of the sort of hero we occasionally encounter in forums and message boards, who through wit, knowledge, and force of character is able both to inspire people to behave better and to punish with an acid tongue those who do not.  Alas, on the Internet there are too few of these, and the few there are tend to retreat into their own preoccupations over time.  A case again, perhaps, of Gresham’s Law: bad money drives good money out of circulation.


The last option Alma Garret has at her disposal is to accede to the wishes of the villainous and violent George Hearst on the best terms she is able.  This is what Alma Garret does at the end of Season Three, rather than force a confrontation that would likely see many of the main characters killed off.  This, as far as I can see, is the only way to bring civilization to the blogosphere, and it is an unhappy turn.  Civilization, once we accept that there will always be netnasties, is only possible when we turn a monopoly of coercive power over to a single entity.  If, as O’Reilly, Sierra and others have argued, it is necessary to make the blogosphere, and the Internet by extension, obey the rules of a community forum, then something like this must occur.  The most likely way for this to happen is if one of the main social networking sites joins forces with one of the major blogging hosts, such as Typepad or LiveJournal, and compels everyone who wants to blog to sign on to their service, following the other principle of the Internet: that growth engenders growth.  Having acquired a majority share of the blogosphere, such a monopolistic regime can then enforce community rules such as the ones Tim O’Reilly is attempting to formulate.  This entails the victory of Hearst and of civilization, and the closing of the frontier.


And, I think, it is also the moral of Deadwood, if there is a moral to be found.  The frontier gives us heroes, but it also engenders monsters.  The idealized vision of the frontier must at some point be confronted with the ugliness it foments: a place of greed, corruption, misogyny, pornography and guys who say “cocksucker”.  If we can’t find it in ourselves to embrace the ugliness along with the heroism of the frontier, then we must make the great compromise and bear the yoke which assures civility, and which, Rousseau promises, is only a light yoke, after all.

Secret Societies and the Internet


While driving with my family to visit an old friend the other day, I caught a bit of Harry Shearer’s wrap-up of the year 2006 on Weekend Edition.  During the interview Shearer was asked what happened with the 2006 Democratic election victory, and Shearer said yeah, what happened?  What must be going through the minds of all the people who believed that the elections were stolen in 2000 and again in 2004, the people who can point out the series of minuscule irregularities that cumulatively disenfranchised the American people of their right to vote those two previous times?  Did evil take a holiday in 2006?


Conspiracy theories are, of course, the opiate of the masses, but what happens when they are real?  And what must be happening when they disappear?  The most truly worthy conspiracies control not only the mechanisms of power, but also the perception of power, and in doing so undermine the very Enlightenment notion that truth will set us free, since the conspirators control our perception of the truth. They are everything we like to accuse post-modernists and deconstructionists of being, with one difference — they are effective.  Any conspiracy worthy of being treated as a conspiracy, then, cannot simply disappear any more than it can make itself appear.  Everything we know about conspiracies is, a priori, false, managed, and inauthentic.  An elaborate cover story.


In the very awareness that there is falsity in the world, however, one also becomes aware that there is something being hidden from us, and behind it, eventually, truth. Or as Descartes said in the Meditations:


…hence it follows, not only that what is cannot be produced by what is not, but likewise that the more perfect, in other words, that which contains in itself more reality, cannot be the effect of the less perfect….


One assumes, perhaps erroneously, that those who feed us lies must therefore possess the Truth.  Here, then, is the dilemma for truth-seekers.  What if knowing the truth entails speaking falsehoods to the rest of the world?  We would like it not to be so, but what if the truth is so striking, so peculiar, so melancholic that the truth-seeker, despite herself, will ultimately be obliged to be mendacious once she is brought before the Truth itself, if only to protect others from what she has come to know?  And if this were not the case, then wouldn’t someone have explained the Truth to us long ago?


One solution is to step back into a sort of pragmatic stance, and judge the pursuit of conspiracy theories ultimately to be delusional in nature.  But — and here’s the rub — doesn’t this go against the evidence we have that conspiracies do in fact occur?  Worse, isn’t this the sort of delusion, isn’t this the sort of lie, that prevents people from trying to unmask these conspiracies in the first place?  Or as Baudelaire informed us,


Mes chers frères, n’oubliez jamais, quand vous entendrez vanter le progrès des lumières, que la plus belle des ruses du diable est de vous persuader qu’il n’existe pas!

[My dear brothers, never forget, when you hear the progress of enlightenment praised, that the devil’s finest trick is to persuade you that he does not exist!]


If the conspiratorial nature of the world cannot be revealed as a truth, then it must first be revealed as a falsehood.  This is how Leo Strauss, one of the architects of modern neoconservatism, put it in his short but revealing essay, Persecution and the Art of Writing.  Because not everyone is morally, intellectually, or constitutionally prepared to receive the Truth, truth should only ever be alluded to.  Hints should be dropped, intentional errors and contradictions presented, which will lead the astute and prepared student to ask the correct questions that will eventually initiate him into the company of the elite.  The obvious question rarely raised, then, is whether Strauss ever got the students he felt he deserved.  Or were the allusions too obscure, and the paradoxes too knotted, for anyone to follow him along the royal path?


Worse yet, could Pynchon’s suggestion from The Crying of Lot 49 be correct, that the pursuers of conspiracy are in the end the ones who make conspiracies come to life, taking on the task of hiding the truth that no one initially gave to them, protecting a truth that in the end does not exist?


Yet this denies what we all know in our hearts to be true.  Conspiracies do exist, though not always in the form we imagine them to take.  Take, for instance, the recent excerpts in the poetry journal Exquisite Corpse from Nick Bromell’s upcoming “The Plan” or How Five Young Conservatives Rescued America.


 



Until now, “The Plan” has been merely a rumor. In the late 1980s,  young conservatives spent hours reverently speculating about it over drinks at “The Sign of the Indian King” on M Street, while across town frustrated young liberals in the think tanks around Dupont Circle darkly attributed every conservative victory to this mythic document.

By the mid-1990s, the myth started to fade as each succeeding triumph of the conservative movement made it increasingly improbable that any group, however brilliant, could have planned the whole campaign. Eventually people referred to “The Plan” as one might refer to the Ark or to the gunman on the grassy knoll: intriguing but fantastical.


Bromell, however, was allowed to view the notes of a historian originally commissioned to write a history of The Plan — a project eventually discarded by the people who hired him — and publishes them for the first time in this online journal, revealing both the inspiration for and the details of the secret manifesto that has guided the conservative movement for the past forty years.


There are even anachronisms and contradictions that, for me at least, do much to confirm the veracity of the source.  One that has been mentioned by other commentators on the article is the fact that the included link to the National Enterprise Initiative (the organization founded by the authors of “The Plan” and which initially commissioned the aborted history) either doesn’t work or points to a bogus search site.  For many, this indicates that the original reporting is bogus.  But the obvious question remains: why would such a supposedly elaborate fiction fail on so minor a detail as a web link?  Who doesn’t know how to post a weblink anymore? On the other side, is it really so remarkable that an organization that wishes to remain hidden should suddenly disappear, along with all traces of it, once an unmasking piece of journalism is published concerning it?  We are, after all, talking about the Internet, the veracity of which we all know to be dubious and mercurial, a vast palimpsest conspiracy.


Does the fact that something is absent from the Internet prove that it does not exist?


Or is it rather the case, as Neuhaus wrote in his 1623 Advertissiment pieux et utile des freres de la Rosee-Croix, a tract demonstrating the true existence of the Rosicrucian Brotherhood — a secret society which claimed to guide the history of Europe, which had first been heard of only in 1614 in Germany, and whose existence or non-existence quickly became the main topic of European discussion for the next quarter century:



By the very fact that they change and alter their name and that they mask their age, and that, by their own confession, they come and go without making themselves known, there is no Logician that could deny the necessity that they exist.

Technological Similes (e.g., Ajax is like…)


As mentioned in a previous post, programmers typically explain one technology by referencing another, more familiar technology.  What sometimes happens, however, is that the technology that was thought to be more familiar, and consequently believed to have explanatory power, was itself originally explained by referencing some third, vaguely understood technology; but time made the simile comfortable and vanity made it acceptable.  We only become aware of the semantic web we programmers weave when we are finally forced to use one of the referenced technologies and discover, once again, what a strange and incomprehensible thing programming is.  The experience is a bit like the shock felt by the woman who brought home a stray dog from Paris only to discover that it was really a large tailless rat.


To find out what one of the trendier new technologies is really like, I recently consulted Google.  A Google search on “Ajax is like…” turns up the following results:



“Comparing Java and AJAX is like comparing apples and blue.”


“[U]sing Ajax is like consuming alcohol in public.”


“[A]jax is like instant messaging….”


“Customers asking for AJAX is like a prospective homeowner walking over to the contractors hired to do the building and handing them a saw.”


“AJAX is like Flash or HTML.”


“AJAX is like a javascript.”


“AJAX is like Javascript on steroids.”


“AJAX is like web services.”


“[A]jax is like everything else on line, it will be abused by various low lifes.”


“Ajax is like to partial update in Intraweb I am wrong?”


“Ajax is like ‘roller skates for the web.'”


“Ajax is like shell, Perl, Ruby. Ajax is like UNIX.”


“AJAX is like a Hooker turn School Teacher, it has a dirty secret and unless you get it alone and play with it, you won’t pickup on it’s secrets until it’s too late.”


“Ajax is, like stated in the essay, a new way to think about user interfaces on the web….”


“AJAX is like wearing 70’s djeans with an Hugo Boss Shoes….”


“AJAX is like Dinosaur cloning in Jurassic park.”


“AJAX is like folding a web page origami-style into a Lego brick….”


“AJAX is like a house of cards, and when a browser vendor screws up on a revision it’ll all come tumbling down.”


“AJAX is like putting a tiny bandage on a gaping wound the size of a grapefruit.”


“Ajax is like DHTML was 4 years ago, like javascript was 6 years ago, like applets were 8 years ago.”


“AJAX is like the killer buzzword.”


IE7 CSS Rendering Problem for DasBlog Project84 Theme

This site uses DasBlog as its blogging engine.  One of the themes that comes with DasBlog, Project84, has a rendering problem in IE7 that causes the footer to bleed into the main page a few inches from the top, when it should, of course, display at the bottom of the page.


The problem turns out to be only one line in the style.css file:



html, body{height:100%;}

Comment this out and the page should render correctly.  This also affects the Project84Green theme.
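
If you prefer to keep the original line around for reference, the commented-out rule in style.css would look like this:

/* html, body{height:100%;} */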


I viewed the corrected style in Firefox 1.5, IE6 and IE7 and did not note any issues. 

Giulio Camillo, Father of the Personal Computer


I am not the first to suggest it, but I will add my voice to those who want to claim that Giulio Camillo built the precursor of the modern personal computer in the 16th century.  Claiming that anyone invented anything is always a precarious venture, and it can be instructive to question the motives of such attempts.  For instance, trying to determine whether Newton or Leibniz invented calculus is a simple question of who most deserves credit for this remarkable achievement.


Sometimes the question of firsts is intended to reveal something that we did not know before, such as Harold Bloom’s suggestion that Shakespeare invented the idea of personality as we know it.  In making the claim, Bloom at the same time makes us aware of the possibility that personality is not in fact something innate, but something created.  Edmund Husserl turns this notion on its head a bit with his reference in his writings to the Thales of Geometry.  Geometry, unlike the notion of personality, cannot be so easily reduced to an invention, since it is eidetic in nature.  It is always true, whether anyone understands geometry or not.  And so there is a certain irony in holding Thales to be the originator of Geometry since Geometry is a science that was not and could not have been invented as such.  Similarly, each of us, when we discover the truths of geometry for ourselves, becomes in a way a new Thales of Geometry, having made the same observations and realizations for which Thales receives credit. 


Sometimes the recognition of firstness is a way of initiating people into a secret society.  Such, it struck me, was the case when I read as a teenager in Stephen Jay Gould that Darwin was not the first person to discover the evolutionary process, but that it was in fact another naturalist named Alfred Russel Wallace, and suddenly a centuries-long conspiracy to steal credit from the truly deserving Wallace was revealed to me — or so it had seemed at the time.


Origins play a strange role in etymological considerations, and when we read Aristotle’s etymological ruminations, there is certainly a sense that the first meaning of a word will somehow provide the key to understanding the concepts signified by the word.  There is a similar intuition at work in the discussions of ‘natural man’ to be found in the political writings of Hobbes, Locke and Rousseau.  For each, the character of the natural man determines the nature of the state, and consequently how we are to understand it best.  For Hobbes, famously, the life of this kind of man is rather unpleasant.  For Locke, the natural man is typified by his rationality.  For Rousseau, by his freedom.  In each case, the character of the ‘natural man’ serves as a sort of gravitational center for understanding man and his works at any time. I have often wondered whether the discussions of the state of the natural man were intended as a scientific deduction or rather merely as a metaphor by each of these great political philosophers.  I lean toward the latter opinion, in which case another way to understand firsts is not as an attempt to achieve historical accuracy, but rather an attempt to find the proper metaphor for something modern.


So who invented the computer?  Was it Charles Babbage with his Difference Engine in the 19th century, or Alan Turing in the 20th with his template for the Universal Machine?  Or was it Ada Lovelace, as some have suggested, the daughter of Lord Byron and collaborator with Charles Babbage, who possibly did all the work while Babbage received all the credit?


My question is a simpler one: who invented the personal computer, Steve Jobs or Giulio Camillo?  I award the laurel to Camillo, who was known in his own time as the Divine Camillo because of the remarkable nature of his invention.  And in doing so, of course, I am merely attempting to define what the personal computer really is — the gravitational center that is the role of the personal computer in our lives.


Giulio Camillo spent long years working on his Memory Theater, a miniaturized Vitruvian theater still big enough to walk into, basically a box, that would provide the person who stood before it with the gift most prized by Renaissance thinkers: the eloquence of Cicero.  The theater itself was arranged with images and figures from Greek and Roman mythology.  Throughout it were Christian references intermixed with Hermetic and Kabbalistic symbols.  In small boxes beneath various statues inside the theater, fragments and adaptations of Cicero’s writings could be pulled out and examined.  Through the proper physical arrangement of the fantastic, the mythological, the philosophical and the occult, Camillo sought to provide a way for anyone who stepped before his theater to discourse on any subject no less fluently and eloquently than Cicero himself.


Eloquence in the 16th century was understood as not only the ability of the lawyer or statesman to speak persuasively, but also the ability to evoke beautiful and accurate metaphors, the knack for delighting an audience, the ability to instruct, and mastery of diverse subjects that could be brought forth from the memory in order to enlighten one’s listeners.  Already in Camillo’s time, mass production of books was coming into its own and creating a transformation of culture.  Along with it, the ancient arts of memory and of eloquence (by way of analogy, we might call them literacy today), whose paragon was recognized to be Cicero, were in decline.  Thus Camillo came along at the end of this long tradition of eloquence to invent a box that would capture all that was best of the old world that was quickly disappearing.  He created, in effect, an artificial memory that anyone could use, simply by stepping before it, to invigorate himself with the accumulated eloquence of all previous generations.


And this is how I think of the personal computer.  It is a box, occult in nature, that provides us with an artificial memory to make us better than we are, better than nature made us.  Nature distributes her gifts randomly, while the personal computer corrects that inherent injustice.  The only limitation to the personal computer, as I see it, is that it can only be a repository for all the knowledge humanity has already acquired.  It cannot generate anything new, as such.  It is a library and not a university.


Which is where the internet comes in.  The personal computer, once it is attached to the world wide web, becomes invigorated by the chaos and serendipity that is the internet.  Not only do we have the dangerous chaos of viruses and Trojan horses, but also the positive chaos of online discussions, the putting on of masks and mixing with the online personas of others, the random following of links across the internet that ultimately leads us to make new connections between disparate concepts in ways that seem natural and inevitable.


This leads me to the final connection I want to make in my overburdened analogy.  Just as the personal computer is not merely a box, but also a doorway to the internet, so Giulio Camillo’s Theater of Memory was tied to a Neoplatonic worldview in which the idols of the theater, if arranged properly and fittingly, could draw down the influences of the intelligences, divine beings known variously as the planets (Mars, Venus, etc.), the Sephiroth, or the Archangels.  By standing before Camillo’s box, the spectator was immediately plugged into these forces, the consequences of which are difficult to assess.  There is danger, but also much wonder, to be found on the internet.

I and Thou

Few of us are ever afforded the opportunity to read so thoroughly and deeply as we once did in college.  I am occasionally nostalgic for that period of my life, thinking fondly and even enviously of the boundless amount of leisure time I once had, but did not seem to appreciate.  In fact, the overwhelming mood that pervaded my experience of that time was not one of leisure, but rather one of impatience.  A fellow student remarked to me, while we were both participants in a seminar with a longish reading list, that she wished deeply to be free of the burden of having all these books still to read.  And I suggested that while the reading would indeed be difficult, it would be a great pleasure to one day know that we had mastered these books.  She corrected me.  Her desire was peculiar.  She wanted through some miraculous conveyance to have already read these books, and to have already mastered them, rather than to have them before her as a large unknown that only emphasized, by contrast, our state of relative ignorance.  She wanted, as many people have in the long history of man, to possess instant wisdom.  Upon hearing her describe this desire, I realized that I wanted this also.


But all that was said before Google and Wikipedia.  The internet has become, for many, the royal road to wisdom — or at least a simulacrum of wisdom.  In virtual communities across the world wide web, arguments are now settled by appealing to Wikipedia.  Despite the well-publicized problems with its reliability, Wikipedia is still more convenient than looking things up in a book or relying on a shared education.  When one is in want of an argument, political blogs are often the best place to go to find out how one should best defend a given viewpoint.  Not only are arguments provided for any topical issue, but even rhetorical flourishes are provided.  Nor does one have to cite the source of one’s arguments.  Instead one can simply state them as one’s own original viewpoint — they are, in a way, open source opinions.  Finally, when one needs to appear wiser than one actually is, for the sake of one’s virtual community, Google readily points to facts, viewpoints, timelines, and interpretations on almost any subject, generally in an easily digestible form.  And again, citations are completely unnecessary.


These wonderful tools (the internet, Google, Wikipedia, and the blogosphere) can be used as if they were an extension of our memories.  They offer that for which my college friend longed: the ability to possess knowledge, as well as the perspectives and opinions which come with that knowledge, without the laborious effort once required actually to gain it.  While there may still be some bragging rights associated with having gained knowledge through one’s own effort, it is fair to ask what the real difference is between this new form of wisdom and the traditional kind that had previously been so difficult to attain.  Is it anything more than a difference of locus: whether the wisdom is contained within our souls or accessed from outside our skulls?


One can already feel impatience with the state of the technology.  As fast as the internet is, it still does not match the speed with which one can recall information from internal memory.  And while a search of one’s recollections is often as uncanny as Plato’s aviary, it still tends to be less tendentious, if also less extensive, than a Google search.  The technology, however, much as we once said of the mind, can be improved.  As the interface between our minds and the cavernous stores of memory available on the internet improves, the separation we perceive between the two may eventually become a mere figment, a virtual oddity that future generations will read about on a future Wikipedia to better understand how people used to think.