Interop Forms Toolkit 2.0 Tutorial

I originally wrote the prize-winning Interop Forms article below for CodeProject.  The prize, an Xbox 360 Elite system, was pretty sweet.  Even sweeter, however, was the nod I received from the Microsoft VB Team.

The feedback from the VB Team, along with some help from Mike Dooney, led me along the right path to rework my C# templates a bit (the toolkit comes with only VB.NET templates).  The differences between the VB and C# templates highlight the fact that although the two languages are often believed to do the same thing once code is compiled to IL, this is not really the case, and in fact, when it comes to COM, VB does a much better job.

To be specific, the VB compiler creates additional code in IL to make a VB class’s events and properties visible in VB6, while the C# compiler does not. I had written the original templates with the assumption that the two compilers would write similar IL — because of this faulty assumption, the events thrown in a C# UserControl were never received in VB6. This has been corrected in the updated C# templates by including in C# the extra attributes and interfaces required to make the events visible — interfaces which the VB compiler automatically creates for you in IL.

The C# templates linked below also automatically create these interfaces for you. When you add a new event to your UserControl, you will just need to be sure to add it to the appropriate interface, as well.  The naming convention for the interfaces (one for exposed events, one for exposed properties) is simply the name of the UserControl class preceded by two underscores and one underscore, respectively. It’s a little bit more manual labor than the VB templates require — but not too much more.
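To make the convention concrete, here is a minimal sketch of what the two interfaces might look like in C#. The control name, event, and property are hypothetical (they are not generated by the actual templates); the essential points are that the events interface is attached to the class with ComSourceInterfaces and that the class implements the properties interface directly.

```csharp
using System.Runtime.InteropServices;

// Events interface: class name preceded by two underscores.
[ComVisible(true)]
[InterfaceType(ComInterfaceType.InterfaceIsIDispatch)]
public interface __DilbertService
{
    [DispId(1)]
    void ImageRetrieved();
}

// Properties interface: class name preceded by one underscore.
[ComVisible(true)]
public interface _DilbertService
{
    [DispId(1)]
    string Caption { get; set; }
}

[ComVisible(true)]
[ClassInterface(ClassInterfaceType.None)]
[ComSourceInterfaces(typeof(__DilbertService))]
public partial class DilbertService : System.Windows.Forms.UserControl, _DilbertService
{
    public delegate void ImageRetrievedEventHandler();

    // The event name must match the method name in the events interface.
    public event ImageRetrievedEventHandler ImageRetrieved;

    // Implements the properties interface.
    public string Caption { get; set; }
}
```

When you add a new event, the name of the event member on the class and the method name on the events interface must match exactly, or VB6 will never see it.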

From the feedback I received at CodeProject, it is apparent that while the toolkit is a big help to developers still maintaining VB6 applications, the big gain is for developers working in FoxPro (really a very solid development framework), who can now squeeze a little more out of their interfaces when they need to, thanks to the Toolkit.



Why use the Interop Toolkit?

A few years ago, the enterprise architects at the company I worked for came up with a central login mechanism for all the company’s applications using web services. They even provided code samples in Java, C# and VB.NET for using their new component. It was intended as a language agnostic solution. When we asked the architects what we should do with the VB6 applications that we were still supporting, the architects were nonplussed. They first provided some esoteric white papers on using SOAP with VB6, then they suggested that we upgrade all of our VB6 apps to .NET, and finally they conceded that VB6 apps simply didn’t have a place in their solution.

Had Interop Forms Toolkit 2.0 been available back then, we could have come up with an integration in under an hour. We would have simply copied the sample code into a new .NET User Control, used the Interop Toolkit to wrap it up as an ActiveX control, and then consumed the control in all of our VB6 apps.

Interop Toolkit 2.0 was just released at the beginning of May. The original Interop Toolkit already allowed VB developers to use .NET Forms in their applications. This is still in Toolkit 2.0, and appears not to have changed much.

What makes Toolkit 2.0 stand out is its support for using .NET User Controls as ActiveX controls in VB6.


According to Microsoft, the toolkit is intended as part of a migration strategy for upgrading VB6 applications to .NET piece by piece. I am not sure this is how it is likely to be used, however, or even if it necessarily ought to be used in this way.

Toolkit 2.0 makes most sense as a tool that allows VB6 developers to take advantage of .NET features without being forced onto an upgrade path. Most VB6 applications that are still around obviously meet certain needs very well. Why fix something that isn’t broken?

There are times, however, when you may want to leverage .NET features in your VB6 application. For a long time your only two choices were to upgrade the whole application to .NET, or to forego the nifty new features.

Toolkit 2.0 provides a third option. Simply add the .NET feature you need as a control.

This tutorial will lead you through: 1. building a mock application that implements the sort of technology we would have used to solve the problem outlined above; 2. installing the Interop Toolkit; 3. building a reference app that gives developers the ability to use real multithreading in their VB6 apps; and 4. integrating XAML files into VB6.

Interop for C# developers

Just as with the previous version, Interop Forms Toolkit 2.0 is geared towards VB.NET developers. The wizard, project templates and item templates that come with the Toolkit only come in VB flavors. This makes a certain amount of sense, since it will mostly be VB developers who will implement these .NET/VB6 integrations. Many developers like to work with both languages, however, and there may be integration scenarios where you need to expose pre-existing C# code to VB6.

For those cases, I’ve written the C# item template and C# project template linked above for Interop User Controls. Simply copy the project template zip file into your project templates folder (the default location is ...\My Documents\Visual Studio 2005\Templates\ProjectTemplates\Visual C#) and the item template zip file into your item templates folder (...\My Documents\Visual Studio 2005\Templates\ItemTemplates\Visual C#). I believe that these templates will only work with Visual Studio 2005, but I haven’t yet tested on older versions of Visual Studio to make sure.

For cases where you need to expose a C# Form, you can use the clever wizard and template written by Leon Langleyben, which I was able to get to work with a bit of tweaking — through no fault of Leon’s, since his add-in was written for the previous version of the Interop Toolkit (refer to the CSXamlEmbeddedForm project in the included CSharp Samples to see what the generated wrapper class should look like in C#).

Installing the Toolkit

Installing the toolkit is fairly straightforward. Navigate to the Toolkit Download Site and, of the three downloads available, run the InteropFormsInstaller.msi file. In most cases, this is all you need to do. When you open the Visual Studio.NET IDE, you should find the new templates, VB6 Interop UserControl and VB6 Interop Form Library, available when you create a new VB.NET project. Under your Tools menu, you should also find a new wizard labeled “Generate InteropForm Wrapper Classes”.

If the new wizard does not appear in your Tools menu, there may have been a problem installing it. Check Tools | Add-in Manager to make sure that this wizard is selected. If it is present and selected in the Add-in Manager, but still does not appear in your Tools menu, you can run the following command at the command line to reset it: Devenv /resetaddin Microsoft.InteropFormTools.InteropFormProxyGenerator.Connect.

Installing the Toolkit on Vista

In order to use the Toolkit on Windows Vista, you will need to download both the msi file as well as the setup file to your hard drive. Then run setup. Running the msi file alone will generate an install error.

To use the C# UserControl templates on Vista, you will need to run Visual Studio as an Administrator. Right-click on the link to your Visual Studio IDE and select the Run as administrator popup menu option. This will let Vista’s UAC feature know that it is alright for the UserControl to write to the registry on build events.

Building a User Control

In this first example, the UserControl will take care of all the processing, and will just sit in the VB6 Form, while the example following it will demonstrate how to pass information between VB6 and VB.NET. Any code snippets will be in VB, though the source code samples linked above include both a VB.NET as well as a C# sample of the control.

In this example we use the Daily Dilbert web service to download a cartoon to our control (you will notice that the Daily Dilbert comes up a lot on CodeProject — for good reason; it happens to be one of the few interesting web services that can actually be used in public tutorials, since it does not require a fee or registration).

Open a new project

Begin by creating a new VB6 Interop User Control project using the VB6 Interop UserControl project template. Name the project DailyDilbertControl. By default, a UserControl file is created called InteropUserControl.vb. Since this is the control name that will be displayed in your VB6 control panel, you should rename it to DilbertService.vb, to be more descriptive.

In your new project, you will find the following files: ActiveXControlHelpers.vb, InteropUserControl.bmp, InteropUserControl.manifest. ActiveXControlHelpers.vb, as the name suggests, includes several static helper methods that make the conversion of your UserControl into an ActiveX control possible. There are register and unregister methods, which add details about your control to the registry. There are methods that help convert things like color codes between the .NET scheme and the OLE scheme used in VB6. There is also a method that wires up your events to work with VB6. You can have multiple user controls in your project, but should only have one ActiveXControlHelpers file per project.
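As an illustration of the color-conversion helpers, the conversions boil down to the framework’s ColorTranslator. This is a sketch only; the method names here are assumed for illustration, not copied from the generated ActiveXControlHelpers.vb file.

```vb
Imports System.Drawing

' Illustrative sketch - method names are assumed, not taken from the
' generated helpers file. VB6 represents colors as OLE color Integers.
Public Module ColorHelperSketch

    ' Convert a .NET Color to the OLE Integer VB6 expects.
    Public Function GetOleColorFromColor(ByVal c As Color) As Integer
        Return ColorTranslator.ToOle(c)
    End Function

    ' Convert an OLE Integer coming from VB6 back into a .NET Color.
    Public Function GetColorFromOleColor(ByVal oleColor As Integer) As Color
        Return ColorTranslator.FromOle(oleColor)
    End Function

End Module
```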

The InteropUserControl.bmp is used to display your control in the VB6 Toolbox. I will cover how to customize your ActiveX control image later in this tutorial.

Add a new web reference to the Daily Dilbert web service to your project. To do this, right-click on your project and select Add Web Reference… A new dialog will pop up. Enter the address of the Daily Dilbert web service in the URL field and click on Go. Finally, rename the web reference to “DDService” and select Add Reference at the lower right of the dialog. Your Solution Explorer should now look like this.


Add a 650 by 215 pixel PictureBox called DilbertPictureBox to your UserControl, as well as a button called RetrieveButton. Create a new RetrieveButton_Click event handler by double clicking on RetrieveButton. Paste in the following code, which will call the web service and retrieve today’s Dilbert strip.

    Private Sub RetrieveButton_Click(ByVal sender As System.Object, _
        ByVal e As System.EventArgs) Handles RetrieveButton.Click
        Dim myDilbert As New DDService.DailyDilbert()
        Dim DilbertMemoryStream As _
        New System.IO.MemoryStream(myDilbert.DailyDilbertImage())
        With Me.DilbertPictureBox
            .Image = Image.FromStream(DilbertMemoryStream)
            .BorderStyle = BorderStyle.Fixed3D
        End With
    End Sub

To finish making your control visible from VB6, just press F5 to preview the control, or simply build it. It will look like this in your Visual Studio UserControl Test Container:


And that’s all it takes to build an ActiveX control in Visual Studio.NET.

Adding an ActiveX Control Image

When you build a new control, InteropUserControl.bmp will be used as the default image for your component in the VB6 Toolbox. You can always use a different image, though. For this project, I want to use this image of Dilbert.


To add it, you first have to add the bitmap file you want to use to your project and set its Build Action property to “Content”. Now open the InteropUserControl.rc resource script file with Notepad. DO NOT use Visual Studio to do this, as this will mess up your resource script file. InteropUserControl.rc can be found under the DailyDilbertControl project folder. Beneath the default 101 BITMAP InteropUserControl.bmp entry, add an additional entry specifying the custom bitmap you want to use.
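Assuming the bitmap you added is named DilbertImage.bmp (a hypothetical name; use whatever file name you chose), the resource script would then contain two entries:

```
101   BITMAP   InteropUserControl.bmp
102   BITMAP   DilbertImage.bmp
```

The numeric id on the left (102 here) is what you will reference from the registration code in the next step.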


Save your resource script file. Now open ActiveXControlHelpers.vb and find the RegisterControl method. This is where registry entries are created. In the section where the bitmap file is specified, replace the default entry, “101”, with a reference to your own bitmap.
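The relevant line in RegisterControl ends up looking something like this, with the new resource id (102 in this sketch) in place of the default 101:

```vb
' Point the ToolBoxBitmap32 registry entry at resource 102 instead of 101.
bitmapKey.SetValue("", Assembly.GetExecutingAssembly.Location & ", 102", _
    RegistryValueKind.String)
```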


Now rebuild your control to make sure a new compiled resource file is created. The new image should appear in the VB6 Toolbox rather than the default image.


More about adding ActiveX Control Images

You can use a different image for each UserControl in your project, but you will have to modify the ActiveXControlHelpers.vb file a bit to make this work. Alter the RegisterControl method signature to take a second string parameter, and then pass this parameter to the line where the code specifies the resource id of the image.

    Public Sub RegisterControl(ByVal t As Type, ByVal BitmapId As String)
        ' ...
        Using bitmapKey As RegistryKey = subkey.CreateSubKey("ToolBoxBitmap32")
            bitmapKey.SetValue("", Assembly.GetExecutingAssembly.Location & ", " _
                & BitmapId, RegistryValueKind.String)
        End Using
        ' ...
    End Sub

Then, in each UserControl, add the BitmapId you want to use for your control to the RegisterControl call.

    <EditorBrowsable(EditorBrowsableState.Never)> _
    <ComRegisterFunction()> _
    Private Shared Sub Register(ByVal t As Type)
        ComRegistration.RegisterControl(t, "102")
    End Sub

Rebuild the entire project once more.

Adding the UserControl to a VB6 project


This is actually the easiest part. Create a new VB6 project. Press CTRL+T to add a new component to your form, and check the DailyDilbertControl library. Press OK. (Vista behaves a bit strangely when you try to add your ActiveX control. It will occasionally throw an error the first time you select OK, then will work normally the second time you do so. To be safe, click Apply first to see if there is an error, and then OK.) Any UserControls in your project will now appear in the VB6 Toolbox. Simply select the control you want to use and draw it onto your VB6 form. Press F5 to see your .NET UserControl run in a Visual Basic 6 application.


Adding True Multithreading to VB6

When I was working with VB6 on a regular basis, one of the main impetuses for upgrading to .NET was the ability to implement multithreading. Interop UserControls provide an easy way to add true multithreading to a VB6 application. In a common scenario, you may want the users of your VB app to be able to kick off a process and then continue with their work while the processing occurs in the background. To simulate this scenario, the reference code we are about to build will use a BackgroundWorker control that will perform a time-consuming process in the background while updating a progress bar. In the meantime, the users of the VB6 application that consumes the control can continue with their work.

Create a new VB6 Interop UserControl project called MultithreadedControl. Add a BackgroundWorker control named BackgroundWorker1, a label called LabelWarningMessage, and a ProgressBar called ProgressBar1. Paste in the following code.

    Public Delegate Sub StartEventHandler(ByVal simpleEventText As String)
    Public Delegate Sub FinishAsyncEventHandler(ByVal asyncEventText As String)

    Public Event StartEvent As StartEventHandler
    Public Event FinishAsyncEvent As FinishAsyncEventHandler

    Public Sub StartProcessing()
        RaiseEvent StartEvent(".NET process starting")
        ' Kick off the background work on a separate thread.
        BackgroundWorker1.RunWorkerAsync()
    End Sub

    Private Sub BackgroundWorker1_DoWork(ByVal sender As System.Object, _
    ByVal e As System.ComponentModel.DoWorkEventArgs) _
    Handles BackgroundWorker1.DoWork
        Static prog As Integer = 0
        While (prog < 100)
            prog = prog + 2
            ' Report progress to the UI and simulate a time-consuming task.
            BackgroundWorker1.ReportProgress(prog)
            System.Threading.Thread.Sleep(100)
        End While
        prog = 0
    End Sub

    Private Sub BackgroundWorker1_ProgressChanged(ByVal sender As System.Object, _
    ByVal e As System.ComponentModel.ProgressChangedEventArgs) _
    Handles BackgroundWorker1.ProgressChanged
        Me.LabelWarningMessage.ForeColor = Color.Red
        Me.LabelWarningMessage.Text = "Working in background..."
        Me.LabelWarningMessage.Visible = True
        Me.ProgressBar1.Value = e.ProgressPercentage
    End Sub

    Private Sub BackgroundWorker1_RunWorkerCompleted(ByVal sender As System.Object, _
    ByVal e As System.ComponentModel.RunWorkerCompletedEventArgs) _
    Handles BackgroundWorker1.RunWorkerCompleted
        Me.LabelWarningMessage.Visible = False
        RaiseEvent FinishAsyncEvent("Interop User Control process finished.")
    End Sub
    Private Sub BackgroundWorker_Load(ByVal sender As System.Object, _
    ByVal e As System.EventArgs) Handles MyBase.Load
        Me.ProgressBar1.Value = 0
        Me.LabelWarningMessage.Visible = False
    End Sub

Finally, make sure that the BackgroundWorker events are hooked up to the handlers we’ve written.


Also make certain that the BackgroundWorker’s WorkerReportsProgress property is set to true. Build the project.

Open a new VB6 project and add the MultithreadedControl component to your VB6 Form, as you did in the previous example. Also add a multiline TextBox control (Text1), a ListBox (List1), and a CommandButton labeled “Process” (Command1).

In order to receive events from the BackgroundWorker control, you will need to add an additional reference to the control. Click on the menu item Project | References… A reference to MultithreadedControlCtrl will already be checked off from previously adding it as a component. You will now need to also include a reference to the library MultithreadedControl in order to capture events thrown from the .NET Control.


Finally, paste in the following VB6 code. In this code, you declare a new reference to the control, this time decorated with the keyword “WithEvents” in order to expose the control’s events. Wiring up handlers for the events is based only on the names of the procedures, so you have to be careful when typing out the Sub routines that will be used in this way.

Before the underscore, always use the same name you used when you created the second reference (in this case “BackgroundEvents”). Then after the underscore, use the actual event name as it appears in your original .NET control. I’ve seen lots of problems posted to various message boards concerning problems with VB Interop event handling that basically came down to misspelling a handler’s signature — so be careful.

Dim WithEvents BackgroundEvents As MultithreadedControl.BackgroundWorker

Private Sub Command1_Click()
    Me.Text1.Text = ""
    Me.List1.AddItem ("Start processing from VB6: " & DateTime.Now)
    ' Kick off the .NET control's background processing.
    Me.BackgroundWorker1.StartProcessing
End Sub

Private Sub Form_Load()
    Set BackgroundEvents = Me.BackgroundWorker1
End Sub

Private Sub BackgroundEvents_StartEvent(ByVal EventText As String)
    Me.List1.AddItem (EventText)
End Sub

Private Sub BackgroundEvents_FinishAsyncEvent(ByVal EventText As String)
    Me.List1.AddItem (EventText)
    Me.List1.AddItem ("Finish processing from VB6:" & DateTime.Now)
End Sub

This reference app basically demonstrates how a .NET BackgroundWorker can be used inside a VB6 application. Even while the processing is occurring, and updating the progress bar to let us know how far it has gotten, the end-user can continue typing into the textbox. If you have never programmed in VB6, then this probably seems like a trivial accomplishment.

For those of us who have worked on Visual Basic 6 apps for large portions of our careers, it is a breakthrough.


Using XAML in VB6

You cannot build a XAML UserControl or a XAML Form and then consume it directly in VB6, unfortunately. You also cannot simply add a XAML Form to a Windows Application project and expose it that way. With a bit of finesse, what you can do is embed a XAML UserControl in a Windows Form and consume that in your VB6 apps. The following walkthrough will show you how.

Create a new VB6 InteropForm Library Project in Visual Studio, and call it XamlEmbeddedForm. Rename the default Windows Form to XamlForm. Now add a second project based on the .NET Framework 3.0 Custom Control Library (WPF) project template and call it XamlUserControl. You can add whatever XAML code you like, at this point. For the reference project, I’ve used the Cube Animation code found in the WPF SDK. Build the XamlUserControl project. In XamlEmbeddedForm, add a reference to the UserControl project. Also add the following four library references: PresentationCore, PresentationFramework, WindowsBase and WindowsFormsIntegration.


You now will need to add some code to the FormLoad event in order to host the XAML UserControl in your .NET Form. The complete code behind should look like this:

Imports Microsoft.InteropFormTools
Imports System.Windows.Forms.Integration
Imports System.ComponentModel
Imports System.Windows.Forms

<InteropForm()> _
Public Class XamlForm

    Private Sub XamlForm_Load(ByVal sender As System.Object, _
        ByVal e As System.EventArgs) _
        Handles MyBase.Load
        ' Create the ElementHost control for hosting the
        ' WPF UserControl.
        Dim host As New ElementHost()
        host.Dock = DockStyle.Fill

        ' Create the WPF UserControl.
        Dim uc As New XamlUserControl.UserControl1()

        ' Assign the WPF UserControl to the ElementHost
        '  control's Child property.
        host.Child = uc

        ' Add the ElementHost control to the form's
        ' collection of child controls.
        Me.Controls.Add(host)
    End Sub
End Class

Rebuild your solution one more time for good measure. Then go to the Tools menu and select Generate InteropForm Wrapper Classes (if this menu option is missing, refer to the installation instructions above). This will add a new wrapper class to your project that can be exposed to VB6. Rebuild one last time to register your wrapper class in the registry. At this point, your .NET code is complete.

Open a new project in VB6. Open Project | References and check two items to add them to your VB6 project: the Microsoft Interop Forms Toolkit Library, and your .NET project, XamlEmbeddedForm, which adds the COM wrapper for your .NET assembly to your VB6 project.

.NET Interop Forms have difficulty knowing when the host VB6 application starts and stops, so some extra code must be added to your VB6 application to handle this. Add the following code snippet to your VB6 Main Form so the .NET code is informed when these events occur.

Public g_InteropToolbox As InteropToolbox

Private Sub Form_Load()
    Set g_InteropToolbox = New InteropToolbox
    g_InteropToolbox.Initialize
    ' Tell the .NET side that the host application has started.
    g_InteropToolbox.EventMessenger.RaiseApplicationStartedupEvent
End Sub

Private Sub Form_QueryUnload(Cancel As Integer, UnloadMode As Integer)
    ' Tell the .NET side that the host application is shutting down.
    g_InteropToolbox.EventMessenger.RaiseApplicationShutdownEvent
End Sub

We are nearly done. To open your .NET Form from VB6, just add a command button to your Main Form and handle its click event with the following code.

Private Sub Command1_Click()
    Dim xaml As New XamlEmbeddedForm.XamlForm
    xaml.Show vbModal
End Sub

If everything goes right, when you click the button, you should see an animated cube, written all in XAML.



This tutorial is intended to walk you through the steps needed to create a useful .NET/VB6 integration. It is also intended to give you a sense of the almost limitless possibilities that are open to VB6 developers now that this technology has been made widely available. Half a decade after people were foretelling the doom of VB6 as a development tool, we should come to terms with the idea that VB6 will still be with us for quite a while longer. Interop Toolkit 2.0 ensures that the many years left to VB6 development will be both graceful and productive.

Fall Fashions for .NET Programmers


This fall programmers are going to be a little more sassy.  Whereas in the past, trendy branding has involved concepts such as paradigms, patterns and rails, principles such as object-oriented programming, data-driven programming, test-driven programming and model-driven architecture, or tags like web 2.0, web 3.0, e-, i-, xtreme and agile, the new fall line features “alternative” and the prefix of choice: alt-.  The point of this is that programmers who work with Microsoft technologies no longer have to do things the Microsoft way.  Instead, they can do things the “Alternative” way, rather than the “Mainstream” way.  Concretely, this seems to involve using a lot of open source frameworks like NHibernate that have been ported over from Java … but why quibble when we are on the cusp of a new age.

Personally I think a different name would have been more descriptive, but the ALT.NET moniker has been cemented by the October 5th conference.  David Laribee is credited with coining the term earlier this year in a blog post, as well as explicating it in the following way:

What does it mean to be ALT.NET? In short it signifies:

  1. You’re the type of developer who uses what works while keeping an eye out for a better way.
  2. You reach outside the mainstream to adopt the best of any community: Open Source, Agile, Java, Ruby, etc.
  3. You’re not content with the status quo. Things can always be better expressed, more elegant and simple, more mutable, higher quality, etc.
  4. You know tools are great, but they only take you so far. It’s the principles and knowledge that really matter. The best tools are those that embed the knowledge and encourage the principles (e.g. Resharper.)

This is almost identical to my own manifesto, except that I included a fifth item about carbon neutrality and a sixth one about loving puppies.  To Dave’s credit, his manifesto is a bit more succinct.

There are several historical influences on this new fall line.  One is the suspicion that new Microsoft technologies have been driven by a desire to sell their programming frameworks rather than to create good tools.  An analogy can be drawn with the development of the QWERTY standard for the English-language keyboard.  Why are the keys laid out the way they are?  One likely possibility is that all the keys required to spell out “t-y-p-e-w-r-i-t-e-r” can be found on the top row, which is very convenient for your typical typewriter salesman.  Several of the RAD (Rapid Application Development — an older fall line that is treated with a level of contempt some people reserve for Capri pants) tools that have come out of Microsoft over the past few years have tended to have a similar quality.  They are good for sales presentations but are not particularly useful for real world development.  Examples that come to mind are the call-back event model for .NET Remoting (the official Microsoft code samples didn’t actually work) and the MSDataSetGenerator, which is great for quickly building a data layer for an existing database, and is almost impossible to tweak or customize for even mildly complex business scenarios.

A second influence is java-envy.  Whereas the java development tools have always emphasized complex architectures and a low-level knowledge of the language, Microsoft development tools have always emphasized fast results and abstracting the low-level details of their language so the developer can get on with his job.  This has meant that while Java projects can take up to two years, after which you are lucky if you have a working code base, Microsoft-based projects are typically up and running in under six months.  You would think that this would make the Microsoft solution the one people want to work with, but in fact, among developers, it has created Java-envy.  The Java developers are doing a lot of denken work, making them a sort of aristocracy in the coding world, whereas the Microsoft programmers are more or less laborers for whom Microsoft has done much of the thinking.

Within the Microsoft world itself, this class distinction has created a sort of mass-migration from VB to C#; these are for the most part equivalent languages, yet VB still has the lingering scent of earth and toil about it.  There are in fact even developers who refuse to use C#, which they see as still a bit prole, and instead prefer to use managed C++.  Whatever, right?

In 2005, this class distinction became codified with the coining of the term Mort, used by Java developers to describe Microsoft developers, by C# developers to describe VB developers, and by VB.NET developers to describe their atavistic VB6 cousins.  You can think of the Morts as Eloi, happily pumping out applications for their businesses, while the much more clever Morlocks plan out coding architectures and frameworks for the next hundred years.  The ALT.NET movement grows out of the Morlocks, rather than the Morts, and can in turn be sub-divided between those who simply want to distinguish themselves from the mid-level developers, and those who want to work on betterment projects using coding standards and code reviews to bring the Morts up to their own level. (To be fair, most of the ALT.NET crowd are of the latter variety, rather than the former.)  The movement sees following Microsoft standards as a sort of serfdom, and would prefer to come up with its own best-practices, and in some cases tools, for building Microsoft-based software.

The third influence on the formation of the ALT.NET movement is the trend toward off-shoring software development.  Off-shoring is based on the philosophy that one piece of software development work is equivalent to another, and implicitly that for a given software requirement, one developer is equivalent to another, given that they know the same technology.  The only difference worth considering, then, is how much money one must spend in order to realize that software requirement.

This has generated a certain amount of soul-searching among developers.  Previously, they had subscribed to the same philosophy, since their usefulness was based on the notion that a piece of software, and implicitly a developer, could do the same work that a roomful of filers (or any other white-collar employee) could do more quickly, more efficiently and hence more cheaply.

Off-shoring challenged this self-justification for software developers, and created in its place a new identity politics for developers.  A good developer, now, is not to be judged on what he knows at a given moment in time — that is, he should not be judged on his current productivity — but rather on his potential productivity — his ability to generate better architectures, more elegant solutions, and other improvements over the long run that cannot be easily measured.  In other words, third-world developers will always be Morts.  If you want high-end software, you need first-world solutions architects and senior developers.

To solidify this distinction, however, it is necessary to have some sort of certifying mechanism that will clearly distinguish elite developers from mere Mort wannabes.  At this point, the distinction is only self-selecting, and depends on true developers being able to talk the talk (as well as determining what the talk is going to be).  Who knows, however, what the future may hold.

Some mention should also be made concerning the new fall fashions.  Fifties skirts are back in, and the Grace Kelly look will be prevalent.  Whereas last year saw narrow bottom jeans displacing bell bottoms, for this fall anything goes.  This fall we can once again start mixing colors and patterns, rather than stick to a uniform color for an outfit.  This will make accessorizing much more interesting, though you may find yourself spending more time picking out clothes in the morning, since there are now so many more options.  Finally, V-necks are back.  Scoop-necks are out.

In men’s fashion, golf shirts and khakis are in, making this the fifteenth year in a row.

Coding is not a Spectator Sport



This past Tuesday I attended a Microsoft technology event at a local movie theater.  Ever since the Matrix, movie theaters have apparently become the convenient place to go for technology updates.  If you’ve never been to one of these events, they involve a presenter or two with a laptop connected to the largest movie screen in the building.  The presenters then promote new Microsoft offerings by writing code on the big screen while software programmers who have somehow gotten the afternoon off from their bosses watch on.

Jim Wooley presented on LINQ, while a Microsoft employee presented on WCF.  The technologies looked pretty cool, but the presentations were rather dull.  I don’t think this was really the fault of the presenters, though.  The truth is, watching other people code is a bit like watching paint dry, and seems to take longer.  Perhaps this is why pair programming, one of the pillars of extreme programming, has never caught on (failing to document your code, however, another pillar of extreme programming, has been widely adopted and, like Monsieur Jourdain, many developers have found that they’d been doing XP for years without even realizing it). 

Within these constraints — that is, that you are basically doing the equivalent of demonstrating how to hammer nails into a board for four hours — the presenters did pretty well, although Mr. Wooley appeared to be somewhat nervous and kept insisting he was doing “Extreme Presenting” whenever he made a coding mistake and the greediest members of the audience would compete with one another to point out his failings.  The Microsoft presenter didn’t encounter any compile errors like Mr. Wooley did, but on the other hand he was following a script and kept referring to it as he typed the code that we were all watching.  Why you should need a script to write uninteresting demo code that ultimately just emits “Hello, world” messages is beyond me, but that’s what he did, and he demonstrated that there could be something even less able to hold the attention than watching someone write code — watching someone write code by rote.

But it is easy to criticize, and in truth I never got to see the presentation on Silverlight given by Shawn Wildermuth (aka “adoguy”), which for all I know may have been much more entertaining and might have undermined my mantra that coding is not a spectator sport, but I’ll never know because I had to skip out on it in order to attend a company dinner.  How I got invited to this dinner remains a mystery, because I wasn’t really very involved in the project that the dinner was intended to celebrate.

I arrived fashionably late by an hour, and as I entered I realized the only seat left was squeezed in between my manager, the CFO of the company and the Senior VP of IT.  This is a dreadful spot to be in, and into that spot I deposited myself.  The problem with being situated next to one’s uppers at a social event is that one spends an inordinate amount of time trying to think of something to say that will impress one’s uppers, while simultaneously trying to avoid saying anything to demonstrate one’s utter unfitness for one’s position.  And here I was next to my boss, who was sitting across from his boss, who was sitting across from his boss.  And as I sat, watching what appeared to be scintillating conversation at the opposite end of the table, my end was completely silent with an air of tension about it.

So I picked up a menu and tried to order.  This was a steak and seafood restaurant, and judging by the prices, approximately twice as good as Longhorn or Outback.  I took the highest priced item, divided the cost by half, and ordered the crawfish pasta with a glass of wine.  Then I sat back to listen to the silence.  Finally someone struck up a conversation about insurance (my industry).  If you want to know how dreadfully dull insurance talk is, it’s a bit like — actually, there is nothing as boring as insurance talk because it is the sine qua non against which all boredom should be judged.  Listening to insurance talk is the sort of thing that makes you want to start cutting yourself for distraction (it’s an old POW trick), and just as I was reaching for the butter knife I found myself telling the jazz story.

The jazz story went over well and seemed to break the ice, so I followed it up with the Berlin mussels story, which was also a hit.  I drank more wine and felt like I was really on a roll.  I’d demonstrated my ability to talk entertainingly around my bosses, and as the food arrived I was able to maintain the mood with a jaunty disquisition on men’s fashion and how to select a good hunting dog.  But I grew overconfident.  Over dessert, I decided to play the teacup game, which is a conversation game my friend Conrad at The Varieties had taught me, and it was a disaster.  Apparently I set it up wrong, because a look of disgust formed on the CFO’s face.  My manager tried to save the moment with a distracting story about hygiene, but rather than leave things well enough alone, I decided to continue with the asparagus story, and pretty well ruined the evening.  Oh well.  Bye-bye annual bonus.

Which all goes to show, entertainment is a damnably difficult business.


I can probably improve my dinner conversation by reading a bit more P.G. Wodehouse and a bit less of The New Yorker (which is where I got the fateful asparagus story), but how to improve a Microsoft presentation is a much trickier nut to crack.  How much can you realistically do to dress up watching other people code?

Then again, it is amazing what passes for a spectator sport these days, from Lumberjack Olympics to Dancing with the Stars.  Perhaps one of the strangest cultural trends is the popularity of poker as a spectator sport — something that would have seemed unimaginable back in the day.  The whole thing revolves around a handful of people dressed up in odd combinations of wigs, sunglasses and baseball caps to hide their tells playing a card game that depends largely on luck, partly on a grasp of probabilities, and partly on being able to guess what your opponents are guessing about you.  Is there anything in this jumble of crazy costumes, luck and skill that can be used to improve a typical Microsoft presentation?

The truth is, even skill isn’t so important in creating a successful spectator sport.  Take quiz shows, which once were devoted to very tough questions that left the audience wondering how the contestants could know so much (it turned out, of course, that often they were cheating).  Over time, these shows became simpler and simpler, until we ended up with shows like Are You Smarter Than a 5th Grader (which makes you wonder how they find contestants so dumb) and the very successful Wheel of Fortune (in which you are challenged to list all the letters of the alphabet until a hidden message becomes legible).  Demonstrating skill is not the essence of these games.

If you have ever seen National Lampoon’s Vegas Vacation (fourth in the series, but my personal favorite), you will recall the scene where, after losing a large portion of his life savings at a casino, Chevy Chase is taken by his cousin Eddie to a special place with some non-traditional games of luck such as rock-paper-scissors, what-card-am-I-holding, and pick-a-number-between-one-and-ten.  This, it turns out, is actually the premise of one of the most popular American game shows of the year, Deal Or No Deal, hosted by the failed-comedian-actor-turned-gameshow-host Howie Mandel.  The point of this game is to pick a number between one and twenty-six, which has a one in twenty-six chance of being worth a million dollars.  The beauty of the game is that the quick and the slow, the clever and the dim, all have an equal chance of winning.  The game is a great leveler, and the apparent pleasure for the audience is in seeing how the contestants squirm.

I had initially thought that Mr. Wooley’s palpable nervousness detracted from his presentation, but the more I think about it, the more I am convinced that his error was in not being nervous enough.  The problem with the format of Microsoft presentations is that there is not enough at stake.   A presenter may suffer the indignity of having people point out his coding errors on stage or of having bloggers ask why he needs a script to write a simple demo app — but at the end of the day there are no clear stakes, no clear winners, no clear losers.

The secret of the modern spectator sport — and what makes it fascinating to watch — is that it is primarily about moving money around.  Televised poker, Survivor-style Reality shows, and TV game shows are all successful because they deal with large sums of money and give us an opportunity to see what people will do for it.  Perhaps at some low level, it even succeeds at distracting us from what we are obliged to do for money.

And money is the secret ingredient that would liven up these perfunctory Microsoft events.  One could set a timer for each code demonstration, and oblige the presenter to finish his code — making sure it both compiles and passes automated unit tests — in the prescribed period in order to win a set sum of money.  Even better, audience members can be allowed to compete against the official Microsoft presenters for the prize money.  Imagine the excitement this would generate, the unhelpful hints from the audience members to the competitors, the jeering, the side-bets, the tension, the drama, the spectacle.  Imagine how much more enjoyable these events would be.

Microsoft events are not the only places where money could liven things up, either.  What if winning a televised presidential debate could free up additional dollars to presidential candidates?  What if, along with answering policy questions, we threw in geography and world event questions with prize money attached?  Ratings for our presidential debates might even surpass the ratings for Deal Or No Deal.

Academia would also be a wonderful place to use money as a motivator.  Henry Kissinger is reported to have said that academic battles are so vicious because the stakes are so low.  Imagine how much more vicious we could make them if we suddenly raised the stakes, offering cash incentives for crushing intellectual blows against one’s enemies in the pages of the Journal of the History of Philosophy, or a thousand dollars for each undergraduate ego one destroys with a comment on a term paper.  Up till now, of course, academics have always been willing to do this sort of thing gratis, but consider how much more civilized, and how clearer the motives would be, if we simply injected money into these common occurrences.

Et in Alisiia Ego



A city seems like an awfully big thing to lose, and yet this occurs from time to time.  Some people search for cities that simply do not exist, like Shangri-la and El Dorado.  Some lost cities are transformed through legend and art into something else, so that the historical location is something different from the place we long to see.  Such is the case with places like Xanadu and Arcadia.  Camelot and Atlantis, on the other hand, fall somewhere in between, due to their tenuous connection to any sort of physical reality.  Our main evidence that a place called Atlantis ever existed, and later fell into the sea, comes from Plato’s account in the Timaeus, yet even at the time Plato wrote this, it already had a legendary quality about it.

….[A]nd there was an island situated in front of the straits which are by you called the Pillars of Heracles; the island was larger than Libya and Asia put together, and was the way to other islands, and from these you might pass to the whole of the opposite continent which surrounded the true ocean; for this sea which is within the Straits of Heracles is only a harbour, having a narrow entrance, but that other is a real sea, and the surrounding land may be most truly called a boundless continent. Now in this island of Atlantis there was a great and wonderful empire which had rule over the whole island and several others, and over parts of the continent, and, furthermore, the men of Atlantis had subjected the parts of Libya within the columns of Heracles as far as Egypt, and of Europe as far as Tyrrhenia … But afterwards there occurred violent earthquakes and floods; and in a single day and night of misfortune … the island of Atlantis … disappeared in the depths of the sea.


Camelot and Avalon, for reasons I don’t particularly understand, are alternately identified with Glastonbury, though there are also nay-sayers, of course.  Then there are cities like Troy, Carthage and Petra, which may have been legend but which we now know to have been real, if only because we have rediscovered them.  The locations of these cities became forgotten over time because of wars and mass migrations, sand storms and decay.  They were lost, as it were, through carelessness.

But is it possible to lose a city on purpose?  The lost city of Alesia was the site of Vercingetorix’s defeat at the hands of Julius Caesar, which marked the end of the Gallic Wars.  The failure of the Roman Senate to grant Caesar a triumph to honor his victory led to his decision to initiate the Roman Civil War, leading in turn to the end of the Republic and his own reign of power, which only later came to an abrupt end when he was assassinated by, among others, Brutus, his friend and one of his lieutenants at the Battle of Alesia.  Having achieved mastery of Rome in 46 BCE, Caesar finally was able to throw himself the triumph he wanted, which culminated with Vercingetorix — the legendary folk hero of the French nation, the symbol of defiance against one’s oppressors and an inspiration to freedom fighters everywhere — being strangled.

Meanwhile, back in Gaul, Alesia was forgotten, and eventually became a lost city.  It is as if the trauma of such a defeat, in which all the major Gallic tribes were defeated at one blow and brought to their knees, incited the Gauls to erase their past and make the site of their humiliation as if it had never been.  Ironically, when I went to the Internet Classics Archive to find Caesar’s description of this lost city, I found the chapters which cover Caesar’s siege of Alesia to be completely missing.  The online text ends Book Seven of Julius Caesar’s Commentaries on the Gallic and Civil Wars just before the action commences, and begins Book Eight just after the Gauls are subdued, while the intervening thirty chapters appear to have simply disappeared into the virtual ether.

In the 19th century, the French government commissioned archaeologists to rediscover Alesia, and they eventually selected a site near Dijon, today called Alise-Sainte-Reine, as the likely location.  The only problem with the site is that it does not match Caesar’s description of Alesia, and Caesar’s writings about Alesia are the main source for everything we know about the city and the battle.

I have rooted around in my basement in order to dig up an unredacted copy of Caesar’s Commentaries, containing everything we remember about the lost city of Alesia:

The town itself was situated on the top of a hill, in a very lofty position, so that it did not appear likely to be taken, except by a regular siege. Two rivers, on two different sides, washed the foot of the hill. Before the town lay a plain of about three miles in length; on every other side hills at a moderate distance, and of an equal degree of height, surrounded the town. The army of the Gauls had filled all the space under the wall, comprising a part of the hill which looked to the rising sun, and had drawn in front a trench and a stone wall six feet high. The circuit of that fortification, which was commenced by the Romans, comprised eleven miles. The camp was pitched in a strong position, and twenty-three redoubts were raised in it, in which sentinels were placed by day, lest any sally should be made suddenly; and by night the same were occupied by watches and strong guards.

— Commentaries, Book VII, Chapter 69

Pluralitas non est ponenda sine necessitate

“Plurality should not be posited without necessity.”  — William of Occam


I added an extra hour to my commute this morning by taking a shortcut.  The pursuit of shortcuts is a common pastime in Atlanta — and no doubt in most metropolitan areas.  My regular path to work involves taking the Ronald Reagan Highway to Interstate 85, and then the 85 to the 285 until I arrive at the northern perimeter.  Nothing could be simpler, and if it weren’t for all the other cars, it would be a really wonderful drive.  But I can’t help feeling that there is a better way to get to my destination, so I head off on my own through the royal road known as “surface streets”.

Cautionary tales like Little Red Riding Hood should tell us all we need to know about taking the path less traveled, yet that has made no difference to me.   Secret routes along surface streets (shortcuts are always a secret of some kind) generally begin with finding a road that more or less turns the nose of one’s car in the direction of one’s job.  This doesn’t last for long, however.  Instead one begins making various turns, right, left, right, left, in an asymptotic route toward one’s destination. 

There are various rules regarding which turns to make, typically involving avoiding various well-known bottlenecks, such as schools and roadwork, avoiding lights, and of course avoiding left turns.  Colleagues and friends are always anxious to tell me about the secret routes they have discovered.  One drew me complex maps laying out the route she has been refining over the past year, with landmarks where she couldn’t remember the street names, and general impressions about the neighborhoods she drove through when she couldn’t recall any landmarks.  This happened to be the route that made me tardy this morning. 

When I told her what had happened to me, the colleague who had drawn me the map apologized for not telling me about a new side-route off of the main route she had recently found (yes, secret routes beget more secret routes) that would have shaved an additional three minutes off of my drive.  Surface streets are the Ptolemaic epicycles of the modern world.

A friend with whom I sometimes commute has a GPS navigation system on his dashboard, which changes the secret route depending on current road conditions.  This often leads us down narrow residential roads that no one else would dream of taking since they wouldn’t know if the road leads to a dead-end or not — but the GPS system knows, of course.  We are a bit slavish about following the advice of the GPS, even when it tells us to turn the trunk of the car toward work and the nose toward the horizon.  One time we drove along a goat path to Macon, Georgia on the advice of the GPS system in order to avoid an accident on S. North Peachtree Road Blvd. 

All this is made much more difficult, of course, due to the strange space-time characteristics of Atlanta which cause two right turns to always take you back to your starting point and two left turns to always dump you into the parking lot of either a Baptist church or a mall.

Various reasons are offered to explain why the Copernican model of the solar system came to replace the Ptolemaic model, including a growing resentment of the Aristotelian system championed by the Roman Catholic Church, resentment against the Roman Catholic Church itself, and a growing revolutionary humanism that wanted to see the Earth and its inhabitants in motion rather than static.  My favorite, however, is the notion that the principle of parsimony was the deciding factor, and that at a certain point people came to realize that simplicity, rather than complexity, is the true hallmark of scientific explanation.

The Ptolemaic system, which places the earth at the center of the universe, with the Sun, planets and heavenly sphere revolving around it, was not able to explain the observed motions of the planets satisfactorily.  We know today that this was due both to placing the wrong body at the center of the model and to insisting on the primacy of circular motion rather than the elliptical routes the planets actually take.

In particular, Ptolemy was unable to explain the occasionally observed retrogression of the planets, during which these travelers appear to slow down and then go into reverse during their progression through the sky, without resorting to the artifice of epicycles, or mini circles, which the planets would follow even as they were also following their main circular routes through the sky.  Imagine a Ferris wheel on which the chairs do more than hang on their fulcrums; they also do loop-de-loops as they move along with the main wheel.  In Ptolemy’s system, not only would the planets travel along epicycles that traveled on the main planetary paths, but sometimes the epicycles would have their own epicycles, and these would beget additional epicycles.  In the end, Ptolemy required 40 epicycles to explain all the observed motions of the planets.

Copernicus sought to show that, even if his model did not necessarily exceed the accuracy of Ptolemy’s system, he was nevertheless able to get rid of many of these epicycles simply by positing the Sun at the center of the universe rather than the Earth.  At that moment in history simplicity, rather than accuracy per se, became the guiding principle in science.  It is a principle with far reaching ramifications.  Rather than the complex systems of Aristotelian philosophy, with various qualifications and commentaries, the goal of science (in my simplified tale) became the pursuit of simple formulas that would capture the mysteries of the universe.  When Galileo wrote that the book of the universe is written in mathematics, what he really meant is that it is written in a very condensed mathematics and is a very short book, brought down to a level that mere humans can at last contain in their finite minds.

The notion of simplicity is germane not only to astronomy, but also to design.  The success of Apple’s iPod is due not to the many features it offers, but rather to the realization that what people want from their devices is only a small set of features done extremely well.  Simplicity is the manner in which we make notions acceptable and conformable to the human mind.  Why is it that one of the key techniques in legal argumentation is to take a complex notion and reframe it in a metaphor or catchphrase that will resonate with the jurors?  The phrase “If the glove doesn’t fit, you must acquit” was, though bad poetry, rather excellent strategy.  “Separate but not equal” has resonated and guided the American conscience for fifty years.  Joe Camel, for whatever inexplicable reasons, has been shown to be an effective instrument of death.  The paths of the mind are myriad and dark.

Taking a fresh look at the surface streets I have been traveling along, I am beginning to suspect that they do not really save me all that much time.  And even if they do shave a few minutes off my drive, I’m not sure the intricacies of my Byzantine journey are worth all the trouble.  So tomorrow I will return to the safe path, the well known path — along the Ronald Reagan, down the 85, across the top of the 285.  It is simplicity itself.

Yet I know that the draw of surface streets will continue to tug at me on a daily basis, and I fear that in a moment of weakness, while caught behind an SUV or big rig, I may veer off my intended path.  In order to increase the accuracy of his system, even Copernicus was led in the end to add epicycles, and epicycles upon epicycles, to his model of the universe, and by the last pages of On the Revolutions of the Heavenly Spheres found himself juggling a total of 48 epicycles — 8 more than Ptolemy had.

Drinking with the Immortals


There are various legends about drinking with the Immortals.  They typically involve a wanderer lost in the wilderness who is offered shelter by strange people.  He is brought close to the fire and given beer, or wine, or mead, depending on the provenance of the folktale.  As his clothes dry out, he is regaled by tales of ancient times and slowly comes to realize that his companions are not typical folk, but rather denizens from behind the veil.  He has fallen, through no merit of his own, into the midst of an enchanted world, and his deepest fear is not of the danger that is all around him, but rather that once the enchantment is dispelled, he will never be able to recover it again.

It occurred to me recently that I had such an experience about a year ago.  I was sent by my company to the Microsoft campus in Redmond to spend several days with the ASP.NET Team and other luminaries of the .NET world.

The names will mean nothing to most readers, but I had the opportunity to meet Bertrand LeRoy, Scott Guthrie, Eilon Lipton, and others to discuss the (then new) ASP.NET Ajax.  I had been painfully working through the technology for several months, and so found myself able to almost hold a conversation with these designers and developers.

On the final night of the event all the seminar attendees were taken to a local wine bar and had dinner.  As is my wont, I drank as much free wine as was poured into my glass, and began spinning computer yarns that became more and more disassociated from reality as the night wore on.  I’m sure I became rather boorish at some point, but the Microsoft developers listened politely, and in my own mind, of course, I was making brilliant conversation.

Even to those who know something of the people I was talking to, this might seem like no big deal.  I went drinking with colleagues in the same industry I am in — so what.  But for me, it was as if I were suddenly introduced to the people who make the rain that nourishes my fields and the sunlight that warms my days.  Microsoft software simply appears as if by magic out of Redmond, and like millions of others, day in and day out, I dutifully learn and use the new technologies that come out of the software giant.  To find out that there are actually people who design the various tools I use, and build them, and debug them — this is a bit difficult to conceive.

In A Room of One’s Own, Virginia Woolf reflects on Charles Lamb’s encounter with a dog-eared manuscript of one of Milton’s poems, filled with lines scratched out and re-written, words selected and words discarded:

“Lamb then came to Oxbridge perhaps a hundred years ago. Certainly he wrote an essay — the name escapes me — about the manuscript of one of Milton’s poems which he saw here. It was LYCIDAS perhaps, and Lamb wrote how it shocked him to think it possible that any word in LYCIDAS could have been different from what it is. To think of Milton changing the words in that poem seemed to him a sort of sacrilege.”

My own discovery that the things of this world which I consider most solid and most real — because they are so essential to my daily life — could have been otherwise than they are, was a similar moment of shock, tinged with fear. 

In a moment of anxiety during this sweet symposium, I leaned over to the person immediately to my right and confided in him my strange reflections.  He laughed gently, and dismissed my drunken observations about the contingent nature of reality.  I later found out he was the twenty-three year old developer of the ASP.NET login control, used daily in web applications around the world, when he inquired of me whether I had ever used his control, and what I thought of it.

The Price of Progress


dasBlog is the engine I have been using over the past year on my web site.  Besides its low cost (free as in beer) and its reliability, I also like it because it uses XML files rather than a database to persist information.  The release version has been running on the .NET 1.1 Framework for quite a long while, and despite a teaser tag on their home site insisting that the new 2.0 version would be released in a matter of weeks, dasBlog 2.0 has actually taken a much, much longer time to come out.

But now the wait is over, and I plan to upgrade to the newest version sometime later tonight.  As sometimes happens, this may entail the complete collapse of the site and the loss of all prior blog posts — but I’m keeping my fingers crossed and maintaining a positive attitude about it, for such is the price of progress.

Supposing that I am successful in migrating to the newest version, I don’t plan to post a review of the qualities of the new platform since, in this case, the medium is very much the message.

Meme Manqué


Pronunciation: mäⁿ-ˈkā
Function: adjective
Etymology: French, from past participle of manquer to lack, fail, from Italian mancare, from manco lacking, left-handed, from Latin, having a crippled hand, probably from manus
: short of or frustrated in the fulfillment of one’s aspirations or talents — used postpositively <a poet manqué>

Merriam-Webster Online


During my perusal of the August 27th New Yorker, I came across the word manqué in two different articles, which struck me as noteworthy as I don’t think I have come across this word in several years.  A quick search of the New Yorker archives indicates that besides these two recent uses,  one in a snarky article about Nicolas Sarkozy by Adam Gopnik:

“People close to Sarkozy like to say that he is an American manqué….”

 and the other in a fawning review of Michelangelo Antonioni’s film opus by Anthony Lane:

“This is not to say that the Italian was a novelist manqué.”

the word had been used in an April review of a Richard Gere film, and prior to that had not appeared in the pages of The New Yorker since September of last year, in a short story by Henry Roth.

Occasionally an unusual word achieves a brief period of fashionability due to its rarity, such as was the case with the term disestablishmentarianism, and its doppelgänger anti-disestablishmentarianism, a few years back.  Once it is recognized that such a word has become le mot juste in just too many instances, however, it quickly recedes back into obscurity, like boy bands and one hit wonders.

Playing with The New Yorker archives reveals similarly suggestive, if not definitive, phenomenological gold about the way rare words become popular for a brief time, and then go underground for a year or more.  Try, for instance, a search on sartorial, zeitgeist, or pusillanimous.  A more interesting project, of course, would involve sifting through the archives of several high-brow publications and graphing the frequency of rare words.  What a memetic field day that would be.

Perhaps this is peculiar to me, but I feel sometimes that using a given word more than once in a blue moon is already an overuse.  Such is my feeling about swearing, which should be used judiciously in order to achieve maximum impact, as well as my feeling about obscure words.  Obscure words, used judiciously, demonstrate erudition and good taste.  Rare words, when abused, simply demonstrate boorishness, false eloquence, and a supercilious character, as well as a proclivity toward intellectual bullying.  That’s fucked up.

My sense that the obscure should be kept obscure does not pertain to words alone.  In the early 90’s I came across an anecdote, while watching Star Trek: The Next Generation, called the frog and the scorpion, which was ascribed to Aesop.  In the version I heard, a scorpion asks a frog to take him across a river and after much deliberation and rationalization, the frog finally agrees.  Unfortunately, the scorpion does decide to sting the frog midstream, after all, and when the frog asks why, the scorpion replies, “It is my nature.”  The punchline is that they both drown.

Oddly, I came across the same anecdote again a few weeks later, while watching Senator Robert Byrd of West Virginia deliver a speech on the Senate floor.  It probably was over an important international event, but I only remember the anecdote and no longer recall what the anecdote was meant to illustrate.  What was interesting about Senator Byrd’s version is that he ascribed the story to Chaucer, rather than Aesop.

A while after that (was it months or years?) the anecdote came before me once again in another Star Trek franchise, Voyager, except this time it was described as a Native American myth and was told by the space-faring Indian Commander Chakotay, and the protagonists were now a coyote and a scorpion rather than a frog and a scorpion.

A little research indicates that this particular anecdote may have originally been revived from its antique sleep in the movie The Crying Game, before it made its way through public policy papers, senate speeches, and finally into televised science fiction, where I came across it.

The first time I heard it, I found it charming. The second time, I thought it platitudinous.  The third time, I thought it was idiotic and vowed to boycott the next show, politician or foreign policy that attempted to leverage it in order to make a point.  Such is my nature.

Then again, I recall Benjamin Franklin’s prescription that once one has found a word that works, it is unnecessary to go out of one’s way to find synonyms simply to avoid overusing the word in a given essay or piece of journalism.  One should just reuse the word as often as one requires it — which is common-sensical advice, I must admit.

Coming Soon…


I’ve been stuck in a dilemma that many bloggers find themselves in.  I have been busy at work and can’t find the time to write anything.  And I’m not the only one.  Look at Steve Yegge’s blog.  He hasn’t written anything in about a year.  Of course he has a huge readership and I have almost none — which tempts me to just leave the blog fallow for a while. 

At the same time, what’s the point of paying ten dollars a month if I don’t say something?  As this thought occurs to me every few days, I start on the five or six ideas I have for a blog entry, but typically these ideas grow out of my control, and I find that I can’t start talking about a movie I like without at least discussing Aristotle’s Prime Mover, and I don’t want to do that without mentioning Heidegger’s analysis of final and efficient causes in The Question Concerning Technology, and so on and so forth…  Clearly, pretension is my Achilles heel.

Nevertheless, I need to write something, if only to get those fornicating monkeys off the top of my main page.

I considered posting an observational post, as many people do.  Just a few words about how I have been listening to such and such a song so what do you think about it please comment? — but this seemed a bit too pathetic.

Next, I thought of resorting to what many bloggers do when they run out of ideas.  They post about how they aren’t going to write anything for a while, which both informs readers of the situation and furtively counts as an actual post.

And then I came across this surfing blog, of which I am very fond for sundry reasons.  At this blog, the authors occasionally post about something they plan to write about but haven’t yet found the gumption to actually pen.  Perhaps the convention has been around for a while, but I have not come across it before.  It’s a brilliant notion.  So here goes … my first “trailers” post.

Aristotle In Love — in which the author contrasts the notion of efficient causes in ancient and modern times, as well as the way in which the ancient notion still exists in the attempt to find the cause of public works in private inspiration, and how this reveals an on-going concern with teleology and the metaphysics of essences — with a side-discussion of contemporary cinema.

Zombies III — in which the author attempts to extend his exploration of this cultural phenomenon from the perspective of privacy, with a further discussion of different notions of privacy over the centuries, revolving primarily around Kant’s treatment of the subject in his political essay What Is Enlightenment?

Hillary’s Knee — in which the author discusses the films of Eric Rohmer and his own fascination with the inner life of one of the most public figures in American culture.

Catch Twenty-Two — in which the author interweaves a discussion of war novels with the problem of threading deadlocks in software programming.  Hilarity ensues.

Why the Phantom of the Opera Is So Cool and The Cure Is Overrated — in which the author writes about some of the music he has recently been listening to.
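As an aside on the Catch Twenty-Two trailer above, for readers who have never met the software side of that particular catch: the classic two-thread deadlock happens when each thread grabs one lock and waits forever on the other.  The standard escape is to make every thread acquire locks in the same global order.  A minimal sketch (the lock and thread names are mine, purely for illustration):

```python
import threading

# Two shared resources -- the classic ingredients of a deadlock.
# If thread A took lock_a then lock_b while thread B took lock_b
# then lock_a, each could end up waiting on the other: a Catch-22.
lock_a = threading.Lock()
lock_b = threading.Lock()
results = []

def worker(name, first, second):
    # Both workers acquire the locks in the SAME global order,
    # which breaks the circular wait and prevents the deadlock.
    with first:
        with second:
            results.append(name)

t1 = threading.Thread(target=worker, args=("A", lock_a, lock_b))
t2 = threading.Thread(target=worker, args=("B", lock_a, lock_b))
t1.start(); t2.start()
t1.join(); t2.join()
print(sorted(results))  # ['A', 'B'] -- both threads finished
```

The joke, of course, is that the inverted version (each thread taking the locks in opposite order) compiles just as happily and hangs forever — you can’t get out of the mission until you’re done, and you’re not done until you’re out of the mission.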

The Bonobo, the Potato, and the Giant


Beth at Cup-Of-Coffey has a new entry about why she loves the Internet, involving a video of hundreds of inmates at a Filipino prison performing Michael Jackson’s Thriller.  It’s a testament to the human spirit, sort of, but more importantly it is a testament to the peculiar character of our modern world, in which wonder can be inspired simply by clicking a link.

The New Yorker has an article about Bonobo apes — also known as hippie apes due to their gentle natures, compared to humans and chimps, as well as their sexual promiscuity — in which one of the leading researchers in the field comments, regarding field work:

“You always think there’s going to be something round the next bend, but there never is.”

My experience this week on the web has been quite the opposite.  The Internet is much better than I have been led to believe, and here are a few reasons why.

Conrad H. Roth, over at Varieties of Unreligious Experience, has a film review of the 1966 documentary Africa Addio unlike any film review I have ever read.  The film itself is a disturbing and violent portrayal of the chaos of post-colonial Africa, but Conrad’s explanation and recommendation of the film raises it to the level of a dark portrayal of the human condition.  Conrad brings up the petit tyrant Roger Ebert’s review, summed up in the words ‘brutal, dishonest, racist’, only to convince us not only of Ebert’s smallness of character but also of how this basically accurate description of Africa Addio is part of what makes the movie great.  It is all this and more.

The Polyglot Vegetarian, who hadn’t posted anything since April, has finally blogged about the Potato.  PV has picked out a special niche in the blogosphere — he blogs eruditely about veggies, giving their linguistic and social history.  He makes the lowly noble.

If you liked The Da Vinci Code, or if you happened to prefer the original version by Baigent and Leigh, then you will certainly enjoy Raminagrobis’s explanation of “the much and justly maligned” Claude-Sosthène Grasset d’Orcet’s theories about how to decode Rabelais’s Gargantua and Pantagruel through the discovery of the proper uses of punning.

Finally, Beta 2 of Visual Studio 2008 has just been released for download, as explained on Scott Guthrie’s blog.  In certain corners of the world this is a fairly momentous event, but falling in such an interesting week, it is a bit underwhelming for me against the backdrop of dancing prisoners, darkest Africa, the bonobo, the potato, and the giant.

Authentically Virtual