The Beatles: Rock Band


Scott Hanselman, perhaps the current reigning rock star in the Microsoft development world with an incredibly popular blog, Computer Zen, has approximately 17.5 thousand followers on Twitter.  William Shatner, a television actor currently up for an Emmy, has 114 thousand followers.  Colin Meloy, lead singer of a band I like, The Decemberists, has 910 thousand Twitter minions.

As involved as I tend to be in the life-world of software development – and despite its significance in the technological transformation of business and society –  I sometimes have to admit that it is a bit marginal.  Not only are my rock stars different from other people’s.  They are also less significant in the grand scheme of things.  By contrast, the biggest rock stars in society are, in fact, rock stars.

While it would be nice if we treated our teachers, our doctors, our nurses like rock stars, I am actually missing President Obama’s speech on healthcare tonight in order to play the just-released The Beatles: Rock Band with my family.  According to this glowing review in The New York Times, it is not only the greatest thing since sliced bread – it is possibly better.  [Warning: the phrases cultural watershed and transformative entertainment experience appear in the linked article.]

The game is indeed fun and traces out The Beatles’ career if one plays in story mode.  We had in fact gotten to 1965 before my 12-year-old noticed the chronology and exclaimed, “Oh my Gawd.  They are so old.  I thought they were from the 80’s or something.”

This got me thinking incoherently about the fickle nature of fame which quickly segued into a daydream about sitting in the green room after a concert while my roadies picked out groupies at the door to come in and engage me in stimulating conversation.

Sometime in the 1990s my philosophy department was trying to lure Hubert Dreyfus, then the leading American interpreter of continental philosophers like Heidegger and Foucault, to our university.  Apparently everything was going swimmingly until the haggling started and we discovered that not only did he want the chairmanship of the department but he also wanted a 300K salary and merchandising rights to any action figures based on his work.  300K is a lot of money in any profession, but it is an uber-rock star salary when you consider that most American academics supplement their meager incomes by selling real estate and Amway.  Negotiations quickly deteriorated after that.

I’m not saying, of course, that Hubert Dreyfus doesn’t deserve that kind of scratch.  He had his own groupies and everything.  The problem is simply that our society doesn’t value the kind of contributions to the common weal provided by Professor Dreyfus.

Perhaps a video game could change all that.  I could potentially see myself playing an XBOX game in which I kiss-butt as a graduate student (as I recall, I in fact did do that) in a foreign country, write a marginal dissertation, get a teaching position somewhere and then write a counter-intuitive thesis in a major philosophy journal (the kind with at least a thousand subscribers, maybe more) such as “Why Descartes was not a Cartesian”, “Why Spinoza was not a Spinozist”, “Why Plato was not a Platonist” (true, actually) or “Why Nietzsche was not a Nihilist” (at the beginner level).  With the success of that article, the player would then ditch his teaching position at a state college for a big-name university and gather graduate students around himself.  He would then promote his favorite graduate students to tenure track positions and they would in turn write glowing reviews of all the player’s books as well as teach them in all their classes.  It’s called giveback, and the game would be called Academic Rock Star.  I really could potentially see myself playing that game, possibly.

There are rock stars in every field, and one might offer suggestions for other titles such as Financial Rock Star, Accounting Rock Star, Presidential Candidate Rock Star, Microsoft Excel Rock Star, Blogging Rock Star.

Perhaps the reason Microsoft has not picked up on any of these ideas is that – just as we all secretly believe that we will one day be rich – we all secretly believe that becoming a rock star in our own industry or sub-culture is attainable.

No one really believes, however, that he can ever become like The Beatles.  Consequently we settle for the next best thing: pretending to be The Beatles in a video game.

The problem with computer literacy

A recent post on Boing Boing is titled Paper and pencil better for the brain than software?  The gist of the article and its associated links is that software, in guiding us through common tasks, actually makes us dumber.  The Dutch psychologist Christof van Nimwegen has performed studies demonstrating the deleterious effects of being plugged in.  From the post:

“Van Nimwegen says much software turns us into passive beings, subjected to the whims of computers, randomly clicking on icons and menu options. In the long run, this hinders our creativity and memory, he says.”

This certainly sounds right to me, from personal experience.  About a year ago, my company gave away GPS navigation devices as Christmas gifts to all the consultants.  The results are twofold.  On the one hand, we all make our appointments on time now, because we don’t get lost anymore.  On the other, we have all lost our innate sense of direction — that essential skill that got the species through the hunter-gatherer phase of our development.  Without my GPS, I am effectively as blind as a bat without echolocation.

In Charles Stross’s novel about the near future, Accelerando, this experience is taken a step further.  The protagonist Manfred Macx is at one point mugged on the street, and his connection to the Internet, which he carries around with him hooked up to his glasses, is taken away.  As a man of the pre-singularity, however, his personality has become so distributed over search engines and data portals that without this connection he is no longer able to even identify himself.  This is the nightmare of the technologically dependent.

Doctor van Nimwegen’s study recalls Plato’s ambivalence about the art of writing.  His mentor Socrates, it may be remembered, never committed anything to writing, which he found inherently untrustworthy.  Consequently, most of what we know of Socrates comes by way of his disciple Plato.  Plato, in turn, was a poet who ultimately became distrustful of his own skills, and railed against poetry in his philosophical writings.  From the modern viewpoint, however, whatever it is that we lose when we put “living” thoughts down in writing, surely it is only through poetry that we are able to recover and sustain it.

It is through poetic imagery that Plato explains Socrates’s misgivings about letters in the Phaedrus:

At the Egyptian city of Naucratis, there was a famous old god, whose name was Theuth; the bird which is called the Ibis is sacred to him, and he was the inventor of many arts, such as arithmetic and calculation and geometry and astronomy and draughts and dice, but his great discovery was the use of letters. Now in those days the god Thamus was the king of the whole country of Egypt; and he dwelt in that great city of Upper Egypt which the Hellenes call Egyptian Thebes, and the god himself is called by them Ammon. To him came Theuth and showed his inventions, desiring that the other Egyptians might be allowed to have the benefit of them; he enumerated them, and Thamus enquired about their several uses, and praised some of them and censured others, as he approved or disapproved of them. It would take a long time to repeat all that Thamus said to Theuth in praise or blame of the various arts. But when they came to letters, This, said Theuth, will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit. Thamus replied: O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

We can certainly see aspects of Manfred Macx’s experience of disorientation in our dependence on tools like Google and Wikipedia, which provide us all with the same degree of wisdom, or at least the same show of wisdom.  In tracking down the above quote about Theuth, I had to rely on a vague reminiscence that this memory passage occurred in either the Timaeus or the Phaedrus. I then used my browser search functionality to track down the specific paragraph.  Very handy, that search feature.  But how much more wonderful it would have been had I been able to call that up from my own theater of memory.

My only stand against the steady march of progress (from which I make my living, it should be remembered) is that I turn my spell-checker off when I write emails and articles.  A consulting manager recently chastised me for this practice, which he found error-prone and somewhat irresponsible.  To this I could only reply, “But I already know how to spell.”

I should have added, “…for now.”

Reflection


Like many others, I recently received the fateful email notifying me that Lutz Roeder will be giving up his work on .NET Reflector, the brilliant and essential tool he developed to peer into the internal implementation of .NET assemblies.  Of course the whole idea of reflecting into an assembly is cheating a bit, since one of the principles of OO design is that we don’t care about implementations, only about contracts.  It gets worse, since one of the main reasons for using .NET Reflector is to reverse engineer someone else’s (particularly Microsoft’s) code.  Yet it is the perfect tool when one is good at reading code and simply needs to know how to do something special — something that cannot be explained, but must be seen.

While many terms in computer science are drawn from other scientific fields, reflection appears not to be.  Instead, it is derived from the philosophical “reflective” tradition, and is a synonym for looking inward: introspection.  Reflection and introspection are not exactly the same thing, however.  This is a bit of subjective interpretation, of course, but it seems to me that unlike introspection, which is merely a turning inward, reflection tends to involve a stepping outside of oneself and peering at oneself.  In reflection, there is a moment of stopping and stepping back; the “I” who looks back on oneself is a cold and appraising self, cool and objective as a mirror.

Metaphors pass oddly between the world of philosophy and the world of computer science, often giving rise to peculiar reversals.  When concepts such as memory and CPUs were being developed, the developers of these concepts drew their metaphors from the workings of the human mind.  The persistent storage of a computer is like the human faculty of memory, and so it was called “memory”.  The CPU works like the processing of the mind, and so we called it the central processing unit, sitting in the shell of the computer like a homunculus viewing a theater across which data is streamed.  Originally it was the mind that was the given, while the computer was modeled upon it.  Within a generation, the flow of metaphors has been reversed, and it is not uncommon to find arguments about the computational nature of the brain based on analogies with the workings of computers.  Isn’t it odd that we remember things, just like computers remember things?

The ancient Skeptics had the concept of epoche to describe this peculiar attitude of stepping back from the world, but it wasn’t until Descartes that this philosophical notion became associated with the metaphor of optics.  In a letter to Arnauld from 1648, Descartes writes:

“We make a distinction between direct and reflective thoughts corresponding to the distinction we make between direct and reflective vision, one depending on the first impact of the rays and the other on the second.”

This form of reflective thought also turns up at an essential turning point in Descartes’ discussion of his Method, when he realizes that his moment of self-awareness is logically dependent on something higher:

“In the next place, from reflecting on the circumstance that I doubted, and that consequently my being was not wholly perfect, (for I clearly saw that it was a greater perfection to know than to doubt,) I was led to inquire whence I had learned to think of something more perfect than myself;”

Descartes uses the metaphor in several places in the Discourse on Method.  In each case, it is as if, after doing something, for instance doubting, he is looking out the corner of his eye at a mirror to see what he looks like when he is doing it, like an angler trying to perfect his cast or an orator attempting to improve his hand gestures.  In each case, what one sees is not quite what one expects to see; what one does is not quite what one thought one was doing.  The act of reflection provides a different view of ourselves from what we might observe from introspection alone.  For Descartes, it is always a matter of finding out what one is “really” doing, rather than what one thinks one is doing.

This notion of philosophical “true sight” through reflection is carried forward, on the other side of the Channel, by Locke.  In his Essay Concerning Human Understanding, Locke writes:

“This source of ideas every man has wholly in himself; and though it be not sense, as having nothing to do with external objects, yet it is very like it, and might properly enough be called internal sense. But as I call the other Sensation, so I call this REFLECTION, the ideas it affords being such only as the mind gets by reflecting on its own operations within itself. By reflection then, in the following part of this discourse, I would be understood to mean, that notice which the mind takes of its own operations, and the manner of them, by reason whereof there come to be ideas of these operations in the understanding.”

Within a century, reflection becomes so ingrained in philosophical thought, if not identified with it, that Kant is able to talk of “transcendental reflection”:

“Reflection (reflexio) is not occupied about objects themselves, for the purpose of directly obtaining conceptions of them, but is that state of the mind in which we set ourselves to discover the subjective conditions under which we obtain conceptions.

“The act whereby I compare my representations with the faculty of cognition which originates them, and whereby I distinguish whether they are compared with each other as belonging to the pure understanding or to sensuous intuition, I term transcendental reflection.”

In the 20th century, the reflective tradition takes a peculiar turn.  While the phenomenologists continued to use it as the central engine of their philosophizing, Wilfrid Sellars began his attack on “the myth of the given” upon which phenomenological reflection depended.  From an epistemological viewpoint, Sellars questions the implicit assumption that we, as thinking individuals, have any privileged access to our own mental states.  Instead, Sellars posits that what we actually have is not a clear vision of our internal mental states, but rather a culturally mediated “folk psychology” of mind that we use to describe those mental states.  In one fell swoop, Sellars sweeps away the Cartesian tradition of self-understanding that informs the cogito ergo sum.

In a sense, however, this isn’t truly a reversal of the reflective tradition but merely a refinement.  Sellars and his contemporary heirs, such as the Churchlands and Daniel Dennett, certainly dealt a devastating blow to the reliability of philosophical introspection.  The Cartesian project, however, was not one of introspection, nor was the later phenomenological project.  The “given” was always assumed to be unreliable in some way, which is why philosophical “reflection” is required to analyze and correct the “given.”  All that Sellars does is to move the venue of philosophical reflection from the armchair to the laboratory, where it no doubt belongs.

A more fundamental attack on the reflective tradition came from Italy approximately two hundred years before Sellars.  Giambattista Vico saw the danger of the Cartesian tradition of philosophical reflection as lying in its undermining of the given of cultural institutions.  A professor of oratory and law, Vico believed that common understanding held a society together, and that the dissolution of civilizations occurs not when those institutions no longer hold, but rather when we begin to doubt that they even exist.  On the face of it, this sounds like the rather annoying contemporary arguments against “cultural relativism”, but it is actually a bit different.  Vico’s argument is rather that we all live in a world of myths and metaphors that help us to regulate our lives, and in fact contribute to what makes us human, and able to communicate with one another.  In the 1730 edition of the New Science, Vico writes:

“Because, unlike in the time of the barbarism of sense, the barbarism of reflection pays attention only to the words and not to the spirit of the laws and regulations; even worse, whatever might have been claimed in these empty sounds of words is believed to be just.  In this way the barbarism of reflection claims to recognize and know the just, what the regulations and laws intend, and endeavors to defraud them through the superstition of words.”

For Vico, the reflective tradition breaks down those civil bonds by presenting man as a rational agent who can navigate the world of social institutions as an individual, the solitary cogito who sees clearly, and coolly, the world as it is.

This begets the natural question: does reflection really provide us with true sight, or does it merely dissociate us from our inner lives in such a way that we only see what we want to see?  In computer science, of course (not that this should be any guide to philosophy), the latter is the case.  Reflection is accomplished by publishing metadata about a code library, and that metadata may or may not be true.  It does not allow us to view the code as it really is, but rather provides us with a mediated view of the code, which is then associated with the code.  We assume it is reliable, but there is no way of really knowing until something goes wrong.
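For the programmers in the audience, here is a minimal sketch of what that mediated view looks like in practice.  Nothing below is specific to Reflector; it is plain System.Reflection usage, asking the runtime for the metadata an assembly publishes about one of its types.  What comes back is a description of the code, one step removed from the code itself.

```csharp
// A minimal sketch of reflection in .NET: what we see is the metadata an
// assembly publishes about itself, not the code "as it really is".
using System;
using System.Reflection;

class ReflectionSketch
{
    static void Main()
    {
        // The assembly that contains System.String.
        Assembly assembly = typeof(string).Assembly;
        Console.WriteLine("Inspecting: " + assembly.FullName);

        // Enumerate the public instance methods that the metadata claims
        // String declares -- a mediated view, one step removed from the source.
        foreach (MethodInfo method in typeof(string).GetMethods(
                     BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly))
        {
            Console.WriteLine("  " + method.ReturnType.Name + " " + method.Name);
        }
    }
}
```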

PBS Sprout and The Hated


So this is how I start the day when I work from home.  I wake up at 5 in the morning, after which I have about 3 hours before anyone else is up.  At 8, the kids start filtering down from upstairs, so I turn PBS Sprout on for them and move from the living room to my office.  PBS Sprout is PBS’s lineup of children’s shows, and our cable provider gives us On Demand access to the episodes, which allows the kids to watch their shows without commercials (oh yes, PBS does have commercials).  My children (at least the youngest) have a fondness for a bald toddler named Caillou.  According to the official site, "the Caillou website and television series features everyday experiences and events that resonate with all children."   I think most parents find him a bit disturbing — but not as disturbing as Teletubbies, of course.

Before Caillou came on today there was a brief intro for PBS Sprout, and in the background was an interesting rendition of Bob Marley’s Three Little Birds which brought back a flood of memories.  The version of the song played by PBS Sprout is by Elizabeth Mitchell.  No, not that Elizabeth Mitchell.  This Elizabeth Mitchell.

Elizabeth Mitchell is married to Daniel Littleton, and in fact Daniel and their son perform on that particular Marley track.  Dan Littleton, in turn, used to play in a punk rock band in Annapolis, Maryland, where I went to college.  For my first few years on campus, I used to find chalk drawings of The Hated all over the sidewalks of Annapolis without knowing what they meant.  Then Dan Littleton ended up going to my college (he was a faculty brat, after all) and it all became clear.

Not only that, but I used to hang out with Mark Fisher, who had played guitar and vocals for The Hated, though by the time I met him he was wearing tweed jackets and translating Greek (I think I did the Philoctetes with him), so I never suspected.

And mutatis mutandis, now not only has Bob Marley been gentrified for daytime cartoons, but the founder of The Hated has helped to make it possible.  Is this what middle-age feels like? 

Hear for yourself.

Bob Marley and the Wailers 

Elizabeth Mitchell and family

Impedance Mismatch


Impedance mismatch is a concept from electronics that is gaining some mindshare as an IT metaphor.  It occupies the same social space that cognitive dissonance once did, and works in pretty much the same way to describe any sort of discontinuity.  It is currently being used in IT to describe the difficulty inherent in mapping relational structures, such as database tables, to the object structures common in OOP.  It is shorthand for a circumstance in which two things don’t fit.
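To make the metaphor concrete, here is a small, hypothetical sketch (the Order and OrderLine classes are invented for illustration, not taken from any particular framework).  The object model thinks in terms of containment and object graphs; the relational model thinks in flat rows joined by foreign keys; the mapping between the two is where the friction lives.

```csharp
// A hypothetical illustration of the object-relational impedance mismatch:
// an object graph on one side, flat rows and foreign keys on the other.
using System;
using System.Collections.Generic;

// The object world: an Order contains its lines.
class Order
{
    public int Id { get; set; }
    public readonly List<OrderLine> Lines = new List<OrderLine>();
}

class OrderLine
{
    public string Product { get; set; }
    public decimal Price { get; set; }
}

class Program
{
    static void Main()
    {
        var order = new Order { Id = 1 };
        order.Lines.Add(new OrderLine { Product = "Widget", Price = 9.99m });
        order.Lines.Add(new OrderLine { Product = "Gadget", Price = 19.99m });

        // The relational world: the same data flattened into rows, the
        // containment relationship demoted to a repeated foreign key.
        Console.WriteLine("OrderLines (OrderId, Product, Price):");
        foreach (var line in order.Lines)
        {
            Console.WriteLine("  (" + order.Id + ", " + line.Product + ", " + line.Price + ")");
        }
    }
}
```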

Broadening the metaphor a bit, here is my impedance mismatch.  I like reading philosophy.  Unfortunately, I also like reading comic books.  I’m not a full-blown collector or anything.  I pick up comics from the library, and occasionally just sit in the bookstore and catch up on certain franchises I like.  I guess that in the comic book world, I’m the equivalent of someone who only drinks after sun-down or only smokes when someone hands him a cigarette, but never actually buys a pack himself.  A parasite, yes, but not an addict.

The impedance mismatch comes from the sense that I shouldn’t waste time reading comics.  They do not inhabit the same mental world that the other things I like to read do.  I often sit thinking that I ought to be reading Schopenhauer, with whom I am remarkably unfamiliar for a thirty-something, or at least reading through Justin Smith’s new book on WCF Programming, but instead find myself reading an Astro City graphic novel because Rocky Lhotka recommended it to me.  The problem is not that I feel any sort of bad faith about reading comic books when I ought to be reading something more mature.  Rather, I fear that I am actually being true to myself.

A passage from an article by Jonathan Rosen in the most recent New Yorker nicely illustrates this sort of impedance mismatch:

Sometime in 1638, John Milton visited Galileo Galilei in Florence. The great astronomer was old and blind and under house arrest, confined by order of the Inquisition, which had forced him to recant his belief that the earth revolves around the sun, as formulated in his “Dialogue Concerning the Two Chief World Systems.” Milton was thirty years old—his own blindness, his own arrest, and his own cosmological epic, “Paradise Lost,” all lay before him. But the encounter left a deep imprint on him. It crept into “Paradise Lost,” where Satan’s shield looks like the moon seen through Galileo’s telescope, and in Milton’s great defense of free speech, “Areopagitica,” Milton recalls his visit to Galileo and warns that England will buckle under inquisitorial forces if it bows to censorship, “an undeserved thraldom upon learning.”

Beyond the sheer pleasure of picturing the encounter—it’s like those comic-book specials in which Superman meets Batman—there’s something strange about imagining these two figures inhabiting the same age.

The Aesthetics and Kinaesthetics of Drumming


Kant’s Critique of Judgment, also known as the Third Critique since it follows the first on Reason and the second on Morals, is a masterpiece in the philosophy of aesthetics.  With careful reasoning, Kant examines the experience of aesthetic wonder, The Sublime, and attempts to relate it to the careful delineations he has made in his previous works between the phenomenal and noumenal realms.  He appears to allow in the Third Critique what he denies us in the First: a way to go beyond mere experience in order to perceive a purpose in the world.  Along the way, he passes judgments on things like beauty and genius that left an indelible mark on the Romanticism of the 19th century.

Taste, like the power of judgment in general, consists in disciplining (or training) genius.  It severely clips its wings, and makes it civilized, or polished; but at the same time it gives it guidance as to how far and over what it may spread while still remaining purposive.  It introduces clarity and order into a wealth of thought, and hence makes the ideas durable, fit for approval that is both lasting and universal, and hence fit for being followed by others…

Kant goes on to say that where taste and genius conflict, a sacrifice needs to be made on the side of genius.

In his First Critique, Kant discusses the "scandal of philosophy" — that after thousands of years philosophers still cannot prove what every simple person knows — that the external world is real.  There are other scandals, too, of course.  There are many questions which, after thousands of years, philosophers continue to argue over and, ergo, for which they have no definitive answers.  There are also the small scandals which give an aspiring philosophy student pause, and make him wonder if the philosophizing discipline isn’t a fraud and a sham after all, such as Martin Heidegger’s Nazi affiliation.  Here the question isn’t why he didn’t realize what every simple German should have known, since even the simple Germans were quite taken up with the movement.  What leaves a bad taste, however, is the sense that a great philosopher should have known better.

A minor scandal concerns Immanuel Kant’s infamous lack of taste.  When it came to music, he seems to have had a particular fondness for martial music, das heißt, marching bands with lots of drumming and brass.  He discouraged his students from learning to actually play music because he felt it was too time-consuming.  We might say that in his personal life, when his taste and his genius came into conflict, Kant chose to sacrifice his taste.

I think I will, also.  In Rock Band, the drums are notoriously the most difficult instrument to play well.  They are also the faux instrument that most resembles the real thing, and it is claimed by some that if you become a good virtual drummer, you will also in the process become a good real drummer.  I’ve tried it but I can’t get beyond the Intermediate level.  I can sing and play guitar on hard, but the drums have a sublime complexity that exceeds my ability to cope.  With uncanny timing, Wired magazine has come out with a walkthrough for the drums in Rock Band (h/t to lifehacker.com).  It mostly concerns the kick pedal and two alternative techniques for working with it, heel-up and heel-down (wax-on/wax-off?).  It involves a bit of geometry and a lot of implicit physics.  I would have liked a little more help with figuring out the various rhythm techniques, but according to Wired, I would get the best results by simply learning real drum techniques, either with an instructor or through YouTube.

I wonder what Kant would say about that.

Slouching Towards The Singularity


This has been a whirlwind week at Magenic.  Monday night Rocky Lhotka (the ‘h’ is silent and the ‘o’ is short, by the way), a Magenic evangelist and Microsoft Regional Director, came into town and presented at the Atlanta .NET User’s Group.  The developers at Magenic got a chance to throw questions at him before the event, and then got another chance to hear him at the User’s Group.  The ANUG presentation was extremely good, first of all because Rocky didn’t use his PowerPoint presentation as a crutch, but rather had a complete presentation in his head for which the slides — and later the code snippets (see Coding is not a Spectator Sport) — were merely illustrative.  Second, in talking about how to put together an N-Layer application, he gave a great ten-year history of application development and the ways in which we continue to subscribe to architectures that were intended to solve yesterday’s problems.

He spoke about N-Layer applications instead of N-Tier (even though it doesn’t roll off the tongue nearly as well) in order to emphasize the point that, unlike in the COM+ days, these do not always mean the same thing, and in fact all the gains we used to ascribe to N-Tier applications — and the reason we always tried so hard to get N-Tier onto our resumes — can in fact be accomplished with a logical, rather than a physical, decoupling of the application stack.  One part of the talk was actually interactive, as we tried to remember why we used to implement solutions using Microsoft Transaction Server and later COM+.  It was because that was the only way we could get an application to scale, since the DBMS in a client-server app could typically only serve about 30 concurrent users.  But back then, of course, we were running SQL Server 7 (after some polling, the audience agreed it was actually SQL Server 6.5 back in ’97) on a box with a Pentium 4 (after some polling, the audience concluded that it was a Pentium II) with 2 Gigs (it turned out to be 1 Gig or less) of RAM.  In ten years’ time, the hardware and software for databases have improved dramatically, and so the case for an N-Tier architecture (as opposed to an N-Layer architecture), in which we use two different servers in order to access data, simply is not there any more.  This is one example of how we continue to build applications to solve yesterday’s problems.
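By way of illustration, and purely as a toy sketch with invented class names rather than anything Rocky presented, a logically layered (N-Layer) application keeps the presentation, business, and data access concerns behind clean interfaces even when everything runs in a single process.  Whether those layers are later deployed to separate physical tiers becomes a deployment decision rather than a design requirement.

```csharp
// A toy sketch of logical (N-Layer) separation: three layers with clean
// boundaries, all running in one process. Physical tiers are optional.
using System;
using System.Collections.Generic;

// Data access layer: the only code that knows where the data lives.
interface ICustomerRepository
{
    IEnumerable<string> GetCustomerNames();
}

class InMemoryCustomerRepository : ICustomerRepository
{
    public IEnumerable<string> GetCustomerNames()
    {
        return new[] { "Contoso", "Fabrikam", "Northwind" };
    }
}

// Business layer: rules and logic, ignorant of both storage and UI.
class CustomerService
{
    private readonly ICustomerRepository _repository;

    public CustomerService(ICustomerRepository repository)
    {
        _repository = repository;
    }

    public IEnumerable<string> GetCustomers()
    {
        return _repository.GetCustomerNames();
    }
}

// Presentation layer: talks only to the business layer.
class Program
{
    static void Main()
    {
        var service = new CustomerService(new InMemoryCustomerRepository());
        foreach (var name in service.GetCustomers())
        {
            Console.WriteLine(name);
        }
    }
}
```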

The reason we keep building for yesterday’s problems, of course, is that the technology is moving too fast to keep up with.  As developers, we generally work with rules of thumb — which we then turn around and call best practices — the reasons for which are unclear to us or have simply been forgotten.  Rocky is remarkable in being able to recall that history — and perhaps even for thinking of it as something worth remembering — and so is able to provide an interesting perspective on our tiger ride.  But of course it is only going to get worse.

This is the premise of Vernor Vinge’s concept of The Singularity.  Drawing loosely on Moore’s Law, Vinge (pronounced Vin-Jee) proposed that our ability to predict future technologies will collapse over time, so that if, fifty years ago, we could predict technological innovation ten years into the future, today our prescience extends only some five years ahead.  We are working, then, toward some moment at which the window within which we can predict technological progress will be so short as to be practically nothing.  We won’t be able to tell what comes next.  This will turn out to be a sort of secular chiliasm in which AIs happen, nanotechnology becomes commonplace, and many of the other plot devices of science fiction (excluding the ones that turn out to be physically impossible) become realities.  The Singularity is essentially progress on speed.

There was some good chatting after the user group, and I got a chance to ask Jim Wooley his opinion of LINQ to SQL vs Entity Framework vs Astoria, and he gave some eyebrow-raising answers which I can’t actually blog about because of various NDAs with Microsoft and my word to Jim that I wouldn’t breathe a word of what I’d heard (actually, I’m just trying to sound important — I actually can’t remember exactly what he told me, except that it was very interesting and came down to that old saw, ‘all politics is local’).

Tuesday was the Microsoft 2008 Launch Event, subtitled "Heroes Happen Here" (who comes up with this stuff?).  I worked the Magenic kiosk, which turned out (from everything I heard) to be much more interesting than the talks.  I got a chance to meet with lots of developers and found out what people are building ‘out there’.  Turner Broadcasting just released a major internal app called Traffic, and is in the midst of implementing standards for WCF across the company.  Matria Healthcare is looking at putting in an SOA infrastructure for their healthcare line of products.  CCP – White Wolf, soon to be simply World of Darkness, apparently has the world’s largest SQL Server cluster with 120 blades servicing their Eve Online customers, and is preparing to release a new web site sometime next year for the World of Darkness line, with the possibility of using Silverlight in their storefront application.  In short, lots of people are doing lots of cool things.  I also finally got the chance to meet Bill Ryan at the launch, and he was as cool and as technically competent as I had imagined.

Tuesday night Rocky presented on WPF and Silverlight at the monthly Magenic tech night.  As far as I know, these are Magenic-only events, which is a shame because lots of interesting and blunt questions get asked — due in some part to the free-flowing beer.  Afterwards we stayed up playing Rock Band on the XBOX 360.  Realizing that we didn’t particularly want to do the first set of songs, various people fired up their browsers to find the cheat codes so we could unlock the advanced songs, and we finished the night with Rocky singing Rush and Metallica.  In all fairness, no one can really do Tom Sawyer justice.  Rocky’s rendition of Enter Sandman, on the other hand, was uncanny.

Wednesday was a sales call with Rocky in the morning, though I pretty much felt like a third wheel and spent my time drinking the client’s coffee and listening to Rocky explain things slightly over my head, followed by technical interviews at the Magenic offices.  Basically a full week, and I’m only now getting back to my Silverlight in Seven Days experiment — will I ever finish it?

Before he left, I did get a chance to ask Rocky the question I’ve always wanted to ask him.  By chance I’ve run into lots of smart people over the years — in philosophy, in government, and in technology — and I always work my way up to asking them this one thing.  Typically they ignore me or they change the subject.  Rocky was kind enough to let me complete my question at least, so I did.  I asked him if he thought I should make my retirement plans around the prospect of The Singularity.  Sadly he laughed, but at least he laughed heartily, and we moved on to talking about the works of Louis L’Amour.

Christmas Tree Blues


Every family has its peculiar Christmas traditions.  My family’s holiday traditions are strongly influenced by a linguistic dispute back in A.D. 1054, one consequence of which is that we celebrate Christmas on January 7th, 13 days after almost everyone else we know does so.  This has its virtues and its vices.  One of the vices is that we clean up on holiday shopping, since we are afforded an extra 13 days to pick up last-minute presents, which gets us well into the time zone of post-holiday sales.  Another is that we always wait until a period somewhere between December 23rd and December 26th to buy our Christmas tree.  We typically are able to pick up our trees for a song, and last year we were able to get a tall frosty spruce without even singing.

This history of vice has finally caught up with us, for this year, as we stalked forlornly through the suburbs of Atlanta, no Christmas trees were to be found.  Lacking foresight or preparation, we have found ourselves in the midst of a cut-tree shortage.  And what is a belated Christmas without a cut-tree shedding in the living room?

We are now in the position of pondering the unthinkable.  Should we purchase an artificial tree this year (currently fifty percent off at Target)?  The thought fills us with a certain degree of inexplicable horror.  Perhaps this is owing to an uncanny wariness about the prospect of surrendering to technology, in some way.  While not tree-huggers, as such, we have a fondness for natural beauty, and there are few things so beautiful as a tree pruned over a year to produce the correct aesthetic form, then cut down, transported, and eventually deposited in one’s living room where it is affectionately adorned with trinkets and lighting.

Another potential source for the uneasiness my wife and I are experiencing is an association of these ersatz arboreals with memories of our childhoods in the late 70’s and early 80’s, which are festooned with cigarette smoke, various kinds of loaf for dinner, checkered suits, polyester shirts and, of course, artificial trees.  Is this the kind of life we want for our own children?

In the end, we have opted to get a three-foot, bright pink, pre-lit artificial tree.  Our thinking is that this tree will not offend so greatly if it knows its place and does not put forward pretensions of being real.

The linguistic ambiguity alluded to above has led to other traditions.  For instance, in Appalachia there are still people who cleave to the custom that on the midnight before January 6th, animals participate in a miracle in which they all hold concourse.  Briefly granted the opportunity to speak, all creatures great and small can be heard praying quietly and, one would imagine, discussing the events of the past year.  The significance of January 6th comes from the fact that the Church of England was late to adopt the Gregorian calendar, which changed the way leap years are calculated, and consequently America was late as well.  Thus at the time that the Appalachian Mountains were first settled by English emigrants, the discrepancy between the Gregorian calendar and the Julian calendar was about 12 days (the gap, as mentioned above, has grown to 13 days in recent years).  The mystery of the talking animals revolves around a holiday that was once celebrated on January 6th, but is celebrated no more — that is, Christmas.
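For the arithmetically inclined, the drift is easy to compute.  The Julian calendar adds a leap day every four years, while the Gregorian calendar skips century years not divisible by 400, so the gap grows by three days every four centuries.  The little sketch below uses the standard formula (valid from March of each century year onward) and reproduces the 12- and 13-day gaps mentioned above.

```csharp
// A back-of-the-envelope sketch of the Julian/Gregorian drift.
using System;

class CalendarDrift
{
    // Days by which the Julian calendar trails the Gregorian in a given year
    // (valid from March of the century year onward).
    static int JulianGregorianGap(int year)
    {
        int century = year / 100;
        return century - century / 4 - 2;
    }

    static void Main()
    {
        foreach (int year in new[] { 1582, 1752, 1800, 1900, 2000 })
        {
            Console.WriteLine(year + ": " + JulianGregorianGap(year) + " days");
        }
        // Prints 10, 11, 12, 13, 13 -- which is why Old Christmas has slid
        // from January 5th and 6th toward January 7th over the centuries.
    }
}
```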

According to this site, there is a similar tradition of talking animals in Italy itself.  On the day of the Epiphany, which commemorates the day the three magi brought gifts to the baby Jesus, the animals speak.

Italians believe that animals can talk on the night of Epiphany so owners feed them well. Fountains and rivers in Calabria run with olive oil and wine and everything turns briefly into something to eat: the walls into ricotta, the bedposts into sausages, and the sheets into lasagna.

The Epiphany is celebrated in Rome on January 6th of the Gregorian calendar.  It is possible, however, that even in Italy, older traditions have persisted under a different guise, and that the traditions of Old Christmas (as it is called in Appalachia) have simply refused to migrate 12 days back into December, and are now celebrated under a new name.  Such is the way that linguistic ambiguities give rise to ambiguities in custom, and ambiguities in custom give rise to anxiety over what to display in one’s living room, and when.

Taliesin’s Riddle

Frank Lloyd Wright's home in Wisconsin: Taliesin

 

As translated by Lady Charlotte Guest, excerpted from Robert Graves’s The White Goddess: A Historical Grammar of Poetic Myth:

 

Discover what it is:

The strong creature from before the Flood

Without flesh, without bone,

Without vein, without blood,

Without head, without feet …

In field, in forest…

Without hand, without foot.

It is also as wide

As the surface of the earth,

And it was not born,

Nor was it seen …

 

[Answer: Ventus]