Coding is not a Spectator Sport

This past Tuesday I attended a Microsoft technology event at a local movie theater.  Ever since The Matrix, movie theaters have apparently become the convenient place to go for technology updates.  If you’ve never been to one of these events, they involve a presenter or two with a laptop connected to the largest movie screen in the building.  The presenters then promote new Microsoft offerings by writing code on the big screen while software programmers who have somehow gotten the afternoon off from their bosses look on.

Jim Wooley presented on LINQ, while a Microsoft employee presented on WCF.  The technologies looked pretty cool, but the presentations were rather dull.  I don’t think this was really the fault of the presenters, though.  The truth is, watching other people code is a bit like watching paint dry, and seems to take longer.  Perhaps this is why pair programming, one of the pillars of extreme programming, has never caught on (failing to document your code, however, another pillar of extreme programming, has been widely adopted and, like Monsieur Jourdain, many developers have found that they’d been doing XP for years without even realizing it). 

Within these constraints — that is, that you are basically doing the equivalent of demonstrating how to hammer nails into a board for four hours — the presenters did pretty well, although Mr. Wooley appeared somewhat nervous and kept insisting he was doing “Extreme Presenting” whenever he made a coding mistake and the greediest members of the audience competed with one another to point out his failings.  The Microsoft presenter didn’t encounter any compile errors like Mr. Wooley did, but on the other hand he was following a script, and kept referring to it as he typed the code we were all watching.  Why you should need a script to write uninteresting demo code that ultimately just emits “Hello, world” messages is beyond me, but that’s what he did, and he demonstrated that there is something even less able to hold an audience’s attention than watching someone write code — watching someone write code by rote.

But it is easy to criticize, and in truth I never got to see the presentation on Silverlight given by Shawn Wildermuth (aka “adoguy”), which for all I know may have been much more entertaining and might have undermined my mantra that coding is not a spectator sport — but I’ll never know, because I had to skip out on it to attend a company dinner.  How I got invited to this dinner is a mystery, since I wasn’t really very involved in the project the dinner was intended to celebrate.

I arrived fashionably late by an hour, and as I entered I realized the only seat left was squeezed in between my manager, the CFO of the company and the Senior VP of IT.  This is a dreadful spot to be in, and into that spot I deposited myself.  The problem with being situated next to one’s uppers at a social event is that one spends an inordinate amount of time trying to think of something to say that will impress one’s uppers, while simultaneously trying to avoid saying anything to demonstrate one’s utter unfitness for one’s position.  And here I was next to my boss, who was sitting across from his boss, who was sitting across from his boss.  And as I sat, watching what appeared to be scintillating conversation at the opposite end of the table, my end was completely silent with an air of tension about it.

So I picked up a menu and tried to order.  This was a steak and seafood restaurant, and judging by the prices, approximately twice as good as Longhorn or Outback.  I took the highest-priced item, divided its price in half, and ordered the crawfish pasta with a glass of wine.  Then I sat back to listen to the silence.  Finally someone struck up a conversation about insurance (my industry).  If you want to know how dreadfully dull insurance talk is, it’s a bit like — actually, there is nothing as boring as insurance talk, because it is the standard against which all other boredom must be judged.  Listening to insurance talk is the sort of thing that makes you want to start cutting yourself for distraction (it’s an old POW trick), and just as I was reaching for the butter knife I found myself telling the jazz story.

The jazz story went over well and seemed to break the ice, so I followed it up with the Berlin mussels story, which was also a hit.  I drank more wine and felt like I was really on a roll.  I’d demonstrated my ability to talk entertainingly around my bosses, and as the food arrived I was able to maintain the mood with a jaunty disquisition on men’s fashion and how to select a good hunting dog.  But I grew overconfident.  Over dessert, I decided to play the teacup game, a conversation game my friend Conrad at The Varieties had taught me, and it was a disaster.  Apparently I set it up wrong, because a look of disgust formed on the CFO’s face.  My manager tried to save the situation with a distracting story about hygiene, but rather than leave well enough alone, I decided to continue with the asparagus story, and pretty well ruined the evening.  Oh well.  Bye-bye annual bonus.

Which all goes to show, entertainment is a damnably difficult business.

I can probably improve my dinner conversation by reading a bit more P.G. Wodehouse and a bit less of The New Yorker (which is where I got the fateful asparagus story), but how to improve a Microsoft presentation is a much trickier nut to crack.  How much can you realistically do to dress up watching other people code?

Then again, it is amazing what passes for a spectator sport these days, from Lumberjack Olympics to Dancing with the Stars.  Perhaps one of the strangest cultural trends is the popularity of poker as a spectator sport — something that would have seemed unimaginable back in the day.  The whole thing revolves around a handful of people, dressed up in odd combinations of wigs, sunglasses and baseball caps to hide their tells, playing a card game that depends largely on luck, partly on a grasp of probabilities, and partly on being able to guess what your opponents are guessing about you.  Is there anything in this jumble of crazy costumes, luck and skill that can be used to improve a typical Microsoft presentation?

The truth is, even skill isn’t so important in creating a successful spectator sport.  Take quiz shows, which once were devoted to very tough questions that left the audience wondering how the contestants could know so much (it turned out, of course, that often they were cheating).  Over time, these shows became simpler and simpler, until we ended up with shows like Are You Smarter Than a 5th Grader (which makes you wonder how they find contestants so dumb) and the very successful Wheel of Fortune (in which you are challenged to list all the letters of the alphabet until a hidden message becomes legible).  Demonstrating skill is not the essence of these games.

If you have ever seen National Lampoon’s Vegas Vacation (fourth in the series, but my personal favorite), you will recall the scene where, after losing a large portion of his life savings at a casino, Chevy Chase is taken by his cousin Eddie to a special place with some non-traditional games of luck such as rock-paper-scissors, what-card-am-I-holding, and pick-a-number-between-one-and-ten.  This, it turns out, is actually the premise of one of the most popular American game shows of the year, Deal Or No Deal, hosted by the failed-comedian-actor-turned-game-show-host Howie Mandel.  The point of this game is to pick a number between one and twenty-six, which has a one in twenty-six chance of being worth a million dollars.  The beauty of the game is that the quick and the slow, the clever and the dim, all have an equal chance of winning.  The game is a great leveler, and the apparent pleasure for the audience is in seeing how the contestants squirm.

I had initially thought that Mr. Wooley’s palpable nervousness detracted from his presentation, but the more I think about it, the more I am convinced that his error was in not being nervous enough.  The problem with the format of Microsoft presentations is that there is not enough at stake.   A presenter may suffer the indignity of having people point out his coding errors on stage or of having bloggers ask why he needs a script to write a simple demo app — but at the end of the day there are no clear stakes, no clear winners, no clear losers.

The secret of the modern spectator sport — and what makes it fascinating to watch — is that it is primarily about moving money around.  Televised poker, Survivor-style Reality shows, and TV game shows are all successful because they deal with large sums of money and give us an opportunity to see what people will do for it.  Perhaps at some low level, it even succeeds at distracting us from what we are obliged to do for money.

And money is the secret ingredient that would liven up these perfunctory Microsoft events.  One could set a timer for each code demonstration and oblige the presenter to finish his code — making sure it both compiles and passes automated unit tests — within the prescribed period in order to win a set sum of money.  Even better, audience members could be allowed to compete against the official Microsoft presenters for the prize money.  Imagine the excitement this would generate: the unhelpful hints from audience members to the competitors, the jeering, the side-bets, the tension, the drama, the spectacle.  Imagine how much more enjoyable these events would be.

Microsoft events are not the only places where money could liven things up, either.  What if winning a televised presidential debate could free up additional dollars for presidential candidates?  What if, along with answering policy questions, we threw in geography and world-event questions with prize money attached?  Ratings for our presidential debates might even surpass the ratings for Deal Or No Deal.

Academia would also be a wonderful place to use money as a motivator.  Henry Kissinger is reported to have said that academic battles are so vicious because the stakes are so low.  Imagine how much more vicious we could make them if we suddenly raised the stakes, offering cash incentives for crushing intellectual blows against one’s enemies in the pages of the Journal of the History of Philosophy, or a thousand dollars for each undergraduate ego one destroys with a comment on a term paper.  Up till now, of course, academics have always been willing to do this sort of thing gratis, but consider how much more civilized, and how much clearer the motives would be, if we simply injected money into these common occurrences.

Et in Alesia Ego

A city seems like an awfully big thing to lose, and yet this occurs from time to time.  Some people search for cities that simply do not exist, like Shangri-la and El Dorado.  Some lost cities are transformed through legend and art into something else, so that the historical location is something different from the place we long to see.  Such is the case with places like Xanadu and Arcadia.  Camelot and Atlantis, on the other hand, fall somewhere in between, due to their tenuous connection to any sort of physical reality.  Our main evidence that a place called Atlantis ever existed, and later fell into the sea, comes from Plato’s account in the Timaeus, yet even at the time Plato wrote this, it already had a legendary quality about it.

….[A]nd there was an island situated in front of the straits which are by you called the Pillars of Heracles; the island was larger than Libya and Asia put together, and was the way to other islands, and from these you might pass to the whole of the opposite continent which surrounded the true ocean; for this sea which is within the Straits of Heracles is only a harbour, having a narrow entrance, but that other is a real sea, and the surrounding land may be most truly called a boundless continent. Now in this island of Atlantis there was a great and wonderful empire which had rule over the whole island and several others, and over parts of the continent, and, furthermore, the men of Atlantis had subjected the parts of Libya within the columns of Heracles as far as Egypt, and of Europe as far as Tyrrhenia … But afterwards there occurred violent earthquakes and floods; and in a single day and night of misfortune … the island of Atlantis … disappeared in the depths of the sea.

— Plato, Timaeus

Camelot and Avalon, for reasons I don’t particularly understand, are alternately identified with Glastonbury, though there are also naysayers, of course.  Then there are cities like Troy, Carthage and Petra, which may have been legend but which we now know to have been real, if only because we have rediscovered them.  The locations of these cities were forgotten over time because of wars and mass migrations, sandstorms and decay.  They were lost, as it were, through carelessness.

But is it possible to lose a city on purpose?  The lost city of Alesia was the site of Vercingetorix’s defeat at the hands of Julius Caesar, which marked the end of the Gallic Wars.  The failure of the Roman Senate to grant Caesar a triumph to honor his victory led to his decision to initiate the Roman Civil War, leading in turn to the end of the Republic and his own reign of power, which only later came to an abrupt end when he was assassinated by, among others, Brutus, his friend and one of his lieutenants at the Battle of Alesia.  Having achieved mastery of Rome in 46 BCE, Caesar finally was able to throw himself the triumph he wanted, which culminated with Vercingetorix — the legendary folk hero of the French nation, the symbol of defiance against one’s oppressors and an inspiration to freedom fighters everywhere — being strangled.

Meanwhile, back in Gaul, Alesia was forgotten, and eventually became a lost city.  It is as if the trauma of such a defeat, in which all the major Gallic tribes were crushed at one blow and brought to their knees, incited the Gauls to erase their past and make the site of their humiliation as if it had never been.  Ironically, when I went to the Internet Classics Archive to find Caesar’s description of this lost city, I found the chapters covering Caesar’s siege of Alesia to be completely missing.  The online text ends Book Seven of Julius Caesar’s Commentaries on the Gallic and Civil Wars just before the action commences, and begins Book Eight just after the Gauls are subdued, while the intervening thirty chapters appear to have simply disappeared into the virtual ether.

In the 19th century, the French government commissioned archaeologists to rediscover Alesia, and they eventually selected a site near Dijon, today called Alise-Sainte-Reine, as the likely location.  The only problem with the site is that it does not match Caesar’s description of Alesia, and Caesar’s writings about Alesia are the main source for everything we know about the city and the battle.

I have rooted around in my basement in order to dig up an unredacted copy of Caesar’s Commentaries, containing everything we remember about the lost city of Alesia:

The town itself was situated on the top of a hill, in a very lofty position, so that it did not appear likely to be taken, except by a regular siege. Two rivers, on two different sides, washed the foot of the hill. Before the town lay a plain of about three miles in length; on every other side hills at a moderate distance, and of an equal degree of height, surrounded the town. The army of the Gauls had filled all the space under the wall, comprising a part of the hill which looked to the rising sun, and had drawn in front a trench and a stone wall six feet high. The circuit of that fortification, which was commenced by the Romans, comprised eleven miles. The camp was pitched in a strong position, and twenty-three redoubts were raised in it, in which sentinels were placed by day, lest any sally should be made suddenly; and by night the same were occupied by watches and strong guards.

— Commentaries, Book VII, Chapter 69

Pluralitas non est ponenda sine necessitate

“Plurality should not be posited without necessity.”  — William of Occam

I added an extra hour to my commute this morning by taking a shortcut.  The pursuit of shortcuts is a common pastime in Atlanta — and no doubt in most metropolitan areas.  My regular path to work involves taking the Ronald Reagan Highway to Interstate 85, and then the 85 to the 285 until I arrive at the northern perimeter.  Nothing could be simpler, and if it weren’t for all the other cars, it would be a really wonderful drive.  But I can’t help feeling that there is a better way to get to my destination, so I head off on my own through the royal road known as “surface streets”.

Cautionary tales like Little Red Riding Hood should tell us all we need to know about taking the path less traveled, yet that has made no difference to me.   Secret routes along surface streets (shortcuts are always a secret of some kind) generally begin with finding a road that more or less turns the nose of one’s car in the direction of one’s job.  This doesn’t last for long, however.  Instead one begins making various turns, right, left, right, left, in an asymptotic route toward one’s destination. 

There are various rules regarding which turns to make, most of them rules of avoidance: avoiding well-known bottlenecks such as schools and roadwork, avoiding lights, and of course avoiding left turns.  Colleagues and friends are always anxious to tell me about the secret routes they have discovered.  One drew me complex maps laying out the route she had been refining over the past year, with landmarks where she couldn’t remember the street names, and general impressions about the neighborhoods she drove through where she couldn’t recall any landmarks.  This happened to be the route that made me tardy this morning.

When I told her what had happened to me, the colleague who had drawn me the map apologized for not telling me about a new side-route off of the main route she had recently found (yes, secret routes beget more secret routes) that would have shaved an additional three minutes off of my drive.  Surface streets are the Ptolemaic epicycles of the modern world.

A friend with whom I sometimes commute has a GPS navigation system on his dashboard, which changes the secret route depending on current road conditions.  This often leads us down narrow residential roads that no one else would dream of taking since they wouldn’t know if the road leads to a dead-end or not — but the GPS system knows, of course.  We are a bit slavish about following the advice of the GPS, even when it tells us to turn the trunk of the car toward work and the nose toward the horizon.  One time we drove along a goat path to Macon, Georgia on the advice of the GPS system in order to avoid an accident on S. North Peachtree Road Blvd. 

All this is made much more difficult, of course, due to the strange space-time characteristics of Atlanta which cause two right turns to always take you back to your starting point and two left turns to always dump you into the parking lot of either a Baptist church or a mall.

Various reasons are offered to explain why the Copernican model of the solar system came to replace the Ptolemaic model, including a growing resentment of the Aristotelian system championed by the Roman Catholic Church, resentment against the Roman Catholic Church itself, and a growing revolutionary humanism that wanted to see the Earth and its inhabitants in motion rather than static.  My favorite, however, is the notion that the principle of parsimony was the deciding factor, and that at a certain point people came to realize that simplicity, rather than complexity, is the true hallmark of scientific explanation.

The Ptolemaic system, which places the Earth at the center of the universe, with the Sun, planets and heavenly sphere revolving around it, was not able to explain the observed motions of the planets satisfactorily.  We know today that this was due both to having the wrong body placed at the center of the model and to insisting on the primacy of circular motion rather than the elliptical routes the planets actually take.

In particular, Ptolemy was unable to explain the occasionally observed retrogression of the planets — during which these travelers appear to slow down and then go into reverse in their progression through the sky — without resorting to the artifice of epicycles, or mini circles, which the planets would follow even as they also followed their main circular routes through the sky.  Imagine a Ferris wheel on which the chairs do more than hang from their pivots; they also do loop-de-loops as they move along with the main wheel.  In Ptolemy’s system, not only would the planets travel along epicycles that traveled on the main planetary paths, but sometimes the epicycles would have their own epicycles, and these would beget additional epicycles.  In the end, Ptolemy required 40 epicycles to explain all the observed motions of the planets.
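
In modern notation the geometry is easy to state.  As a minimal sketch, assuming uniform coplanar motion (a simplification of Ptolemy’s full machinery), a planet riding an epicycle of radius r around a point that itself circles the Earth on a deferent of radius R traces out:

    x(t) = R cos(ωt) + r cos(Ωt)
    y(t) = R sin(ωt) + r sin(Ωt)

When the epicyclic rate Ω is fast enough relative to the deferent rate ω, the combined motion doubles back on itself, and the planet appears to slow, halt, and reverse against the fixed stars.  Retrogression is thus recovered, at the price of two extra parameters per planet.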

Copernicus sought to show that, even if his model did not necessarily exceed the accuracy of Ptolemy’s system, he was nevertheless able to get rid of many of these epicycles simply by positing the Sun at the center of the universe rather than the Earth.  At that moment in history simplicity, rather than accuracy per se, became the guiding principle in science.  It is a principle with far-reaching ramifications.  Rather than the complex systems of Aristotelian philosophy, with their various qualifications and commentaries, the goal of science (in my simplified tale) became the pursuit of simple formulas that would capture the mysteries of the universe.  When Galileo wrote that the book of the universe is written in mathematics, what he really meant was that it is written in a very condensed mathematics and is a very short book, brought down to a level that mere humans can at last contain in their finite minds.

The notion of simplicity is germane not only to astronomy, but also to design.  The success of Apple’s iPod is due not to the many features it offers, but rather to the realization that what people want from their devices is a small set of features done extremely well.  Simplicity is the manner in which we make notions acceptable and conformable to the human mind.  Why is it that one of the key techniques in legal argumentation is to take a complex notion and reframe it in a metaphor or catchphrase that will resonate with the jurors?  The phrase “If the glove doesn’t fit, you must acquit” was, though bad poetry, rather excellent strategy.  “Separate but not equal” has resonated with and guided the American conscience for fifty years.  Joe Camel, for whatever inexplicable reasons, has been shown to be an effective instrument of death.  The paths of the mind are myriad and dark.

Taking a fresh look at the surface streets I have been traveling along, I am beginning to suspect that they do not really save me all that much time.  And even if they do shave a few minutes off my drive, I’m not sure the intricacies of my Byzantine journey are worth all the trouble.  So tomorrow I will return to the safe path, the well known path — along the Ronald Reagan, down the 85, across the top of the 285.  It is simplicity itself.

Yet I know that the draw of surface streets will continue to tug at me on a daily basis, and I fear that in a moment of weakness, while caught behind an SUV or big rig, I may veer off my intended path.  In order to increase the accuracy of his system, even Copernicus was led in the end to add epicycles, and epicycles upon epicycles, to his model of the universe, and by the last pages of On the Revolutions of the Heavenly Spheres found himself juggling a total of 48 epicycles — 8 more than Ptolemy had.

Drinking with the Immortals

There are various legends about drinking with the Immortals.  They typically involve a wanderer lost in the wilderness who is offered shelter by strange people.  He is brought close to the fire and given beer, or wine, or mead, depending on the provenance of the folktale.  As his clothes dry out, he is regaled with tales of ancient times and slowly comes to realize that his companions are not typical folk, but rather denizens from behind the veil.  He has fallen, through no merit of his own, into the midst of an enchanted world, and his deepest fear is not of the danger that is all around him, but rather that once the enchantment is dispelled, he will never be able to recover it again.


It occurred to me recently that I had such an experience about a year ago.  I was sent by my company to the Microsoft campus in Redmond to spend several days with the ASP.NET Team and other luminaries of the .NET world.


The names will mean nothing to most readers, but I had the opportunity to meet Bertrand Le Roy, Scott Guthrie, Eilon Lipton, and others to discuss the (then new) ASP.NET AJAX.  I had been painfully working through the technology for several months, and so found myself almost able to hold a conversation with these designers and developers.


On the final night of the event all the seminar attendees were taken to a local wine bar for dinner.  As is my wont, I drank as much free wine as was poured into my glass, and began spinning computer yarns that became more and more disassociated from reality as the night wore on.  I’m sure I became rather boorish at some point, but the Microsoft developers listened politely, and in my own mind, of course, I was making brilliant conversation.


Even to those who know something of the people I was talking to, this might seem like no big deal.  I went drinking with colleagues in the same industry I am in — so what.  But for me, it was as if I were suddenly introduced to the people who make the rain that nourishes my fields and the sunlight that warms my days.  Microsoft software simply appears as if by magic out of Redmond, and like millions of others, day in and day out, I dutifully learn and use the new technologies that come out of the software giant.  To find out that there are actually people who design the various tools I use, and build them, and debug them — this is a bit difficult to conceive.


In A Room of One’s Own, Virginia Woolf reflects on Charles Lamb’s encounter with a dog-eared manuscript of one of Milton’s poems, filled with lines scratched out and re-written, words selected and words discarded:



“Lamb then came to Oxbridge perhaps a hundred years ago. Certainly he wrote an essay — the name escapes me — about the manuscript of one of Milton’s poems which he saw here. It was LYCIDAS perhaps, and Lamb wrote how it shocked him to think it possible that any word in LYCIDAS could have been different from what it is. To think of Milton changing the words in that poem seemed to him a sort of sacrilege.”


My own discovery that the things of this world which I consider most solid and most real — because they are so essential to my daily life — could have been otherwise than they are, was a similar moment of shock, tinged with fear. 


In a moment of anxiety during this sweet symposium, I leaned over to the person immediately to my right and confided in him my strange reflections.  He laughed gently, and dismissed my drunken observations about the contingent nature of reality.  I later found out he was the twenty-three-year-old developer of the ASP.NET login control, used daily in web applications around the world, when he inquired of me whether I had ever used his control, and what I thought of it.

The Price of Progress

dasBlog is the engine I have been using over the past year on my web site.  Besides its low cost (free as in beer) and its tendency to be a reliable blog engine, I also like it because it uses XML files rather than a database to persist information.  The release version has been running on the .NET 1.1 Framework for quite a long while, and despite a teaser tag on its home site insisting that the new 2.0 version would be released in a matter of weeks, dasBlog 2.0 has actually taken much, much longer to come out.
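
Part of the appeal of flat-file persistence is how little ceremony it takes to get your data back out.  As a minimal sketch, borrowing the LINQ technology from earlier in this post, reading a day’s entries might look something like the following.  Be warned that the file name and element names here are hypothetical illustrations, not dasBlog’s actual schema:

    using System;
    using System.Linq;
    using System.Xml.Linq;

    class BlogReader
    {
        static void Main()
        {
            // Load a single day's entry file (hypothetical name and schema).
            XDocument doc = XDocument.Load("2007-08-27.dayentry.xml");

            // Project each entry element into its title and creation date.
            var entries = from entry in doc.Descendants("Entry")
                          select new
                          {
                              Title = (string)entry.Element("Title"),
                              Created = (DateTime)entry.Element("Created")
                          };

            foreach (var e in entries)
                Console.WriteLine("{0:d}  {1}", e.Created, e.Title);
        }
    }

No connection strings and no schema migrations, just files that can be backed up by copying a folder.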

But now the wait is over, and I plan to upgrade to the newest version sometime later tonight.  As sometimes happens, this may entail the complete collapse of the site and the loss of all prior blog posts — but I’m keeping my fingers crossed and maintaining a positive attitude about it, for such is the price of progress.

Supposing that I am successful in migrating to the newest version, I don’t plan to post a review of the qualities of the new platform since, in this case, the medium is very much the message.

Meme Manqué

Pronunciation: mäⁿ-ˈkā
Function: adjective
Etymology: French, from past participle of manquer to lack, fail, from Italian mancare, from manco lacking, left-handed, from Latin, having a crippled hand, probably from manus
: short of or frustrated in the fulfillment of one’s aspirations or talents — used postpositively <a poet manqué>

Merriam-Webster Online


 


During my perusal of the August 27th New Yorker, I came across the word manqué in two different articles, which struck me as noteworthy, as I don’t think I had come across this word in several years.  A quick search of the New Yorker archives indicates that, besides these two recent uses, one in a snarky article about Nicolas Sarkozy by Adam Gopnik:



“People close to Sarkozy like to say that he is an American manqué….”


 and the other in a fawning review of Michelangelo Antonioni’s film opus by Anthony Lane:



“This is not to say that the Italian was a novelist manqué.”


the word had been used in an April review of a Richard Gere film, and prior to that had not appeared in the pages of The New Yorker since September of last year, in a short story by Henry Roth.


Occasionally an unusual word achieves a brief period of fashionability due to its rarity, as was the case with the term disestablishmentarianism, and its doppelgänger anti-disestablishmentarianism, a few years back.  Once it is recognized that such a word has become le mot juste in just too many instances, however, it quickly recedes back into obscurity, like boy bands and one-hit wonders.


Playing with The New Yorker archives reveals similarly suggestive, if not definitive, phenomenological gold about the way rare words become popular for a brief time, and then go underground for a year or more.  Try, for instance, a search on sartorial, zeitgeist, or pusillanimous.  A more interesting project, of course, would involve sifting through the archives of several high-brow publications and graphing the frequency of rare words.  What a memetic field day that would be.
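
For anyone tempted by that project, the counting itself would be the easy part.  Here is a minimal sketch in C#, assuming, purely for illustration, that each article has already been saved as a plain-text file in a local folder:

    using System;
    using System.IO;
    using System.Linq;
    using System.Text.RegularExpressions;

    class RareWordCounter
    {
        static void Main()
        {
            // Words whose fortunes we want to track (an arbitrary sample).
            string[] rareWords = { "manqué", "sartorial", "zeitgeist", "pusillanimous" };

            // Hypothetical folder of article texts, one file per article.
            string[] files = Directory.GetFiles(@"C:\archives", "*.txt");

            foreach (string word in rareWords)
            {
                // Count whole-word, case-insensitive matches across the archive.
                int count = files.Sum(file =>
                    Regex.Matches(File.ReadAllText(file),
                                  @"\b" + Regex.Escape(word) + @"\b",
                                  RegexOptions.IgnoreCase).Count);

                Console.WriteLine("{0}: {1}", word, count);
            }
        }
    }

The hard part, as with most memetic field work, would be acquiring the archives in the first place.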


Perhaps this is peculiar to me, but I feel sometimes that using a given word more than once in a blue moon is already an overuse.  Such is my feeling about swearing, which should be used judiciously in order to achieve maximum impact, as well as my feeling about obscure words.  Obscure words, used judiciously, demonstrate erudition and good taste.  Rare words, when abused, simply demonstrate boorishness, false eloquence, and a supercilious character, as well as a proclivity toward intellectual bullying.  That’s fucked up.


My sense that the obscure should be kept obscure does not pertain to words alone.  In the early ’90s, while watching Star Trek: The Next Generation, I came across an anecdote called the frog and the scorpion, which was ascribed to Aesop.  In the version I heard, a scorpion asks a frog to take him across a river, and after much deliberation and rationalization, the frog finally agrees.  Unfortunately, the scorpion decides to sting the frog midstream after all, and when the frog asks why, the scorpion replies, “It is my nature.”  The punchline is that they both drown.


Oddly, I came across the same anecdote again a few weeks later, while watching Senator Robert Byrd of West Virginia deliver a speech on the Senate floor.  It was probably about some important international event, but I only remember the anecdote and no longer recall what it was meant to illustrate.  What was interesting about Senator Byrd’s version is that he ascribed the story to Chaucer, rather than Aesop.


A while after that (was it months or years?) the anecdote came before me once again in another Star Trek franchise, Voyager, except this time it was described as a Native American myth and was told by the space-faring Indian Commander Chakotay, and the protagonists were now a coyote and a scorpion rather than a frog and a scorpion.


A little research indicates that this particular anecdote may have originally been revived from its antique sleep in the movie The Crying Game, before it made its way through public policy papers, senate speeches, and finally into televised science fiction, where I came across it.


The first time I heard it, I found it charming.  The second time, I thought it platitudinous.  The third time, I thought it idiotic and vowed to boycott the next show, politician, or policy paper that attempted to leverage it in order to make a point.  Such is my nature.


Then again, I recall Benjamin Franklin’s prescription that once one has found a word that works, it is unnecessary to go out of one’s way to find synonyms simply to avoid overusing the word in a given piece of journalism or essay.  One should just reuse the word as often as one requires it — which is commonsensical advice, I must admit.