Philosophical Classics for Nerds


It is 60 days after the day I thought the U.S. presidential election would have been settled … and yet. Intellectually, I recognize the outrageousness of the situation, based on the Constitution, based on my high school civics lessons, and based on my memories of the 2000 presidential election between Bush and Gore when everyone felt that any wrong move or overreach back then would have threatened the stability of the republic and the rule of law.

At the same time I have become inured to the cray-cray, and as I listen today to recordings of President Trump’s corrupt, self-serving call to Georgia Secretary of State Raffensperger, I find that my intellectual recognition that norms are being broken (the norm that we trust the democratic system, the norm that we should all abide by the rules as they are written, the norm that we should assiduously avoid tampering with ‘the process’ in any way) is not accompanied by the familiar gut uneasiness that signals to humans that norms have been disturbed. That thing that makes up “common sense”, a unanimity between thought and feeling, is missing for me due to four and more years of gaslighting.


When common sense breaks down in this way, there are generally two possible causes. Either you have gone crazy or everyone else has. Like Ingrid Bergman in that George Cukor film, our first instinct is to look for a Joseph Cotten to reassure us that we are right and Charles Boyer is wrong. What always causes me dread about that movie, though, is the notion that things wouldn’t have gone so well had Ingrid Bergman not been gorgeous and drawn Cotten’s gaze and concern.

In another film from a parallel universe, Cotton might have ignored Bergman, and she would have withdrawn from the world, into herself, and pursued a hobby she had full control over, like crochet, or woodworking, or cosplaying. Many do.

Over the past two decades, nerdiness has shifted from being a character flaw into a virtue, from something tacitly acknowledged into a lifestyle to be pursued. The key characteristic of “nerdiness” is the willingness to allow a passion to bloom into an obsession to the point of wanting to know every trivial and quadrivial  aspect of a subject. True nerdiness is achieved when we take a matter just that bit too far, when friendships are broken over opinions concerning the Star Wars prequels, or when marriages are split over the classification of a print font.

The loss of the sensus communis can also mark the point where mere thought becomes philosophical. The hallmark of philosophical reflection is that moment when the familiar suddenly becomes unfamiliar and then demands our gaze with new fascination, like Ingrid Bergman suddenly drawing Joseph Cotten’s attention. For Heidegger this was the uncanniness of the world. For Husserl it was the epoché, in which we bring into question the givenness of the world. And for Plato it was the desire for one’s lover, which one transfers to beauty in general, and finally to Truth itself.

http://www.metmuseum.org/art/collection/search/436105

Philosophers as a rule take things too far. They say forbidden things. They draw unexpected conclusions. They examine all the nooks and crannies of thought, exhaustively, to reach the conclusions they reach, often to the boredom of their audience. They were nerds before we knew what nerds were.

Even in the world of philosophy, however, there are books and ideas that used to be considered too important to overlook but too nerdy to be made central to the discipline. Instead, they have existed on the margins of philosophy waiting for a moment when the Zeitgeist was ready to receive them.

Here are five works of speculative philosophy whose time, I believe, has come.


Simulacra and Simulations by Jean Baudrillard – This book describes virtual reality, a bit like William Gibson did with Neuromancer, before it was really a thing. The Wachowskis cite it as an inspiration for The Matrix and even put phrases from this work in one of Morpheus’s monologues. It is blessedly a short work that captures the essence of our virtual world today from a distance of almost four decades (it was written in 1983). No one should be working in tech today without understanding what Baudrillard meant by “the desert of the real.”


Reasons and Persons by Derek Parfit – Parfit took apart the notion of identity using thought experiments drawn from science fiction. In one of his most striking arguments, introduced in Part Three of his work, in a section called Simple Teletransportation and the Branch-Line Case, Parfit posits a machine that allows speed-of-light travel by scanning a person into data, sending that data to another planet, and then reconstituting that data as matter to recreate the original person. Of course, we have to destroy the original copy during this process of teletransportation. Parfit toys with our intuitions of what it means to be a person in order to arrive at philosophical gold. If the reader is troubled by this scenario of murder and cloning cum teleportation, Parfit is able to point out that this is what we go through in our lives. How much of the matter we were born with is still a part of our physical bodies? Little to none?

For the coup de grâce, one can apply the lessons of teletransportation to address our pointless fear of death. What is death, after all, but a journey through the teleporter without a known terminus?


The Conscious Mind by David J. Chalmers – Just as the 4th century BCE saw a flourishing of philosophy and science in Greece, and the 16th century saw an explosion of literary invention in England, in the 1990s Australia became the home of the most innovative work in the philosophy of mind in the world. Out of that period of wild genius, David J. Chalmers came out against the general trend, driven by Daniel Dennett and Paul Churchland, of denying the reality of consciousness. Chalmers, on the other hand, made the case through exacting arguments that consciousness is not only real but a fundamental property of the universe, alongside spatiality and temporality.


Hegel and the Metaphysics of Absolute Negativity by Brady Bowman – Since Dale Carnegie’s important work reforming the habits of white-collar labor, positive thinking has been the ethos of professional life. The Marxian threat of alienated labor is eliminated by refusing to acknowledge the possibility of alienation in the corporate managerial class. Just as Galadriel tells us in the movie that “history became legend, legend became myth”, the power of positive thinking became a tenet of faith, then a method of prosperous Biblical exegesis, and finally a secret.

Do you ever get tired of mindless positivity? What if the underlying engine of the universe turns out not to be positive thinking but absolute negativity? And what if this can be proven through Hegel’s advanced dialectical logic? How much would you pay for a secret like that?


The Emperor’s New Mind by Roger Penrose – Penrose is a brilliant mathematical physicist who brought his formidable learning to bear on the problem of human consciousness. Does physics, and quantum physics in particular, confirm or refute our theories about the human soul? I’ve always loved this book because Penrose arrives at a solution to the problem of human consciousness in a somewhat unphilosophical way – which made many philosophers nervous. The crux of his argument for the place of mind in a quantum universe is the size horizon of some features of the human brain. Ultimately, I think, Penrose provides a way to reconcile Kantian metaphysics with modern, cutting-edge physics and biology in a way that works – or at least one that is consistent and provides the ground for the possibility of Kantianism.


Honorable mention: Darkside by Tom Stoppard – If you have been watching The Good Place then you should be familiar with The Trolley Problem, a thought experiment used to tease out our ethical intuitions and commitments. What could make The Trolley Problem even better? What if it were incorporated into a radio play by one of our greatest living English dramatists, performed to the tracks of Pink Floyd’s Dark Side of the Moon, and acted out by Bill Nighy (The Hitchhiker’s Guide to the Galaxy), Rufus Sewell (Dark City) and Iwan Rheon (Game of Thrones)?

upgraded blog engine

I just finished upgrading my BlogEngine.NET installation from version 1.6 to version 3.1.1. Long overdue and about half a day’s work. BlogEngine.NET is definitely a tool for developers rather than consumers. Now that it’s up, I’m pretty happy, though. Thanks also to the OrcsWeb support folks (OrcsWeb hosts my blog) who helped me when I kept locking myself out of the system because I can’t remember my FTP password.

I started this out with dasBlog back in 2006. I’m glad that I’ve only had to do a few upgrades in the years between.

Promiscuity and Software: The Thirty-one Percent Solution


There is one big player in the software development world, and her name is Microsoft. Over the years many vendors and startups have attempted to compete against the lumbering giant, and Microsoft has typically resorted to one of two methods for dealing with her rivals. Either she pours near-unlimited money into beating the competition, as Microsoft did with Netscape in the 90’s, or she buys her rival right out. It is the typical build or buy scenario. But with the ALT.NET world, she seems to be taking a third approach.

The premise of the ALT.NET philosophy is that developers who work within the Microsoft .NET domain should still be free to use non-Microsoft-sanctioned technologies, and there is even a certain Rome versus the barbarians approach to this, with Microsoft naturally taking the part of the Arian invaders. The best solution to a technical problem, it is claimed (and rightly so), need not be one provided by Microsoft, which ultimately is a very small sub-set of the aggregate body of developers in the world. Instead, solutions should be driven by the developer community, who know much more about the daily problems encountered by businesses than Microsoft does. Microsoft in turn may or may not provide the best tools to implement these solutions; when she doesn’t, the developer community may come up with their own, such as NHibernate, NUnit, Ajax, Windsor and RhinoMocks (all free, by the way).

What is interesting about each of these tools is that, when they came out, Microsoft didn’t actually have a competing tool for any of these technologies. Instead of competing with Microsoft on her own field, the ALT.NET community began by competing with Microsoft in the places where she had no foothold. Slowly, however, Microsoft came out with competing products for each of these but the last. MSTest was released about three years ago to compete with NUnit. ASP.NET AJAX (formerly Atlas, a much cooler name) competes with the various Ajax scripting libraries. ASP.NET MVC competes with the PHP development world. Entity Framework and the Unity Framework were recently released to compete with NHibernate and Windsor, respectively.
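
To make the parity concrete, here is roughly what a unit test looks like under NUnit; the Calculator class is invented purely for illustration. Under Microsoft’s MSTest the shape is nearly identical, with [TestClass] and [TestMethod] attributes standing in for NUnit’s [TestFixture] and [Test].

```csharp
using NUnit.Framework;

// A made-up class under test, for illustration only.
public class Calculator
{
    public int Add(int a, int b)
    {
        return a + b;
    }
}

[TestFixture]
public class CalculatorTests
{
    [Test]
    public void Add_ReturnsSumOfTwoIntegers()
    {
        var calculator = new Calculator();

        // NUnit's classic assertion model.
        Assert.AreEqual(5, calculator.Add(2, 3));
    }
}
```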

Unlike the case with the browser wars of the 90’s, Microsoft’s offerings are not overwhelmingly better. The reception of the Entity Framework (mostly orchestrated by the ALT.NET community itself, it should be admitted) was an extreme case in point, for scores of developers, including a few MVPs (Microsoft’s designation for recognized software community leaders), publicly pilloried the technology in an open letter and petition decrying its shortcomings.

Microsoft, in these cases, is not trying to overwhelm the competition. She does not throw unlimited resources at the problem. Instead, she has been throwing limited resources at each of these domains and, in a sense, has accomplished what the ALT.NET world originally claimed was their goal: to introduce a bit of competition into the process and allow developers to select the most fitting solution.

Not too long ago I came across an article that suggested to me a less benign strategy on Microsoft’s part, one that involves ideological purity and software promiscuity. The ALT.NET world, one might be tempted to say, has a bit of a religious aspect to it, and the various discussion board flames concerning ALT.NET that pop up every so often have a distinct religious patina to them.

The relationship of ALT.NET-ers to Microsoft is a bit like the relationship of Evangelicals and Fundamentalists to the world. We do, after all, have to live in this world, and we don’t have the ability or the influence at all times to shape it the way we want. Consequently, compromises must be made, and the only question worth asking is to what extent we must compromise. The distinction between Evangelicals and Fundamentalists rests squarely on this matter, with Evangelicals believing that some sort of co-existence can be accomplished, while Fundamentalists believe that the cognitive dissonance between their view of the world and the world’s view of itself is too great to be bridged. For Fundamentalists, the Evangelicals are simply fooling themselves and, worse, opening themselves up to temptation without realizing it.

All this is background to Margaret Talbot’s article in the November New Yorker, “Red Sex, Blue Sex: Why do so many evangelical teen-agers become pregnant?” Ms. Talbot raises the question of abstinence-only programs, which are widely ridiculed for being unsuccessful.

“Nationwide, according to a 2001 estimate, some two and a half million people have taken a pledge to remain celibate until marriage. Usually, they do so under the auspices of movements such as True Love Waits or the Silver Ring Thing. Sometimes, they make their vows at big rallies featuring Christian pop stars and laser light shows, or at purity balls, where girls in frothy dresses exchange rings with their fathers, who vow to help them remain virgins until the day they marry. More than half of those who take such pledges—which, unlike abstinence-only classes in public schools, are explicitly Christian—end up having sex before marriage, and not usually with their future spouse.”

The programs are not totally unsuccessful. In general, pledgers delay sex eighteen months longer than non-pledgers. The real indicator of the success of an abstinence-only program, however, is how popular it becomes: ironically, its success is inversely proportional to its popularity and ubiquity.

“Bearman and Brückner have also identified a peculiar dilemma: in some schools, if too many teens pledge, the effort basically collapses. Pledgers apparently gather strength from the sense that they are an embattled minority; once their numbers exceed thirty per cent, and proclaimed chastity becomes the norm, that special identity is lost. With such a fragile formula, it’s hard to imagine how educators can ever get it right: once the self-proclaimed virgin clique hits the thirty-one-per-cent mark, suddenly it’s Sodom and Gomorrah.”

The ALT.NET chest of development tools is not widely used, although its proponents are very vocal about the need to use them.  Unit testing, which is a very good practice, has limited actual adherence, though many developers will publicly avow its usefulness.  NHibernate, Windsor and related technologies have an even weaker hold on the mind share of the developer community — much less than thirty percent, I would say — an actuality which belies the volume and vehemence, as well as the exposure, of their proponents.
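
For anyone who has never touched these tools, “IoC” here means something like the following Castle Windsor registration, a minimal sketch with invented types (the message-sender and order-service classes are hypothetical); Microsoft’s Unity container exposes an analogous API:

```csharp
using Castle.MicroKernel.Registration;
using Castle.Windsor;

// Hypothetical abstraction and implementation, for illustration only.
public interface IMessageSender
{
    void Send(string message);
}

public class EmailSender : IMessageSender
{
    public void Send(string message) { /* send an email here */ }
}

public class OrderService
{
    private readonly IMessageSender sender;

    // The dependency is supplied by the container, not created here.
    public OrderService(IMessageSender sender)
    {
        this.sender = sender;
    }

    public void PlaceOrder()
    {
        sender.Send("Order placed.");
    }
}

public class Program
{
    public static void Main()
    {
        var container = new WindsorContainer();

        // Map the interface to a concrete implementation and register the service.
        container.Register(
            Component.For<IMessageSender>().ImplementedBy<EmailSender>(),
            Component.For<OrderService>());

        // Windsor builds OrderService and injects EmailSender automatically.
        var service = container.Resolve<OrderService>();
        service.PlaceOrder();
    }
}
```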

With the thirty-one percent solution, Microsoft does not have to improve on the ALT.NET technologies and methodologies in order to win.  All she has to do is help the proponents of IoC, mocking and ORMs get to that thirty-one percent adoption level.  She can do this by releasing interesting variations of the ALT.NET community tools, thus gentrifying these tools for the wider Microsoft development community.  Even within the ALT.NET world, as in our world, there are more Evangelicals than Fundamentalists, people who are always willing to try something once.

Microsoft’s post-90’s strategy need no longer be build or buy.  She can take this third approach of simply introducing a bit of software promiscuity, a little temptation here, a little skin there, and pretty soon it’s a technical Sodom and Gomorrah.

Technical Interview Questions


Interviewing has been on my mind of late, as my company is in the middle of doing quite a bit of hiring.  Technical interviews for software developers are typically an odd affair, performed by technicians who aren’t quite sure of what they are doing upon unsuspecting job candidates who aren’t quite sure of what they are in for.

Part of the difficulty is the gap between hiring managers, who are cognizant of the fact that they are not in a position to evaluate the skills of a given candidate, and the in-house developers, who are unsure of what they are supposed to be looking for.  Is the goal of a technical interview to verify that the interviewee has the skills she claims to possess on her resume?  Is it to rate the candidate against some ideal notion of what a software developer ought to be?  Is it to connect with a developer on a personal level, thus assuring through a brief encounter that the candidate is someone one will want to work with for the next several years?  Or is it merely to pass the time, in the middle of more pressing work, in order to have a little sport and give job candidates a hard time?

It would, of course, help if the hiring manager were able to give detailed information about the kind of job that is being filled, the job level, perhaps the pay range — but more often than not, all he has to work with is an authorization to hire “a developer”, and he has been tasked with finding the best that can be got within limiting financial constraints.  So again, the onus is upon the developer-cum-interviewer to determine his own goals for this hiring adventure.

Imagine yourself as the technician who has suddenly been handed a copy of a resume and told that there is a candidate waiting in the meeting room.  As you approach the door of the meeting room, hanging slightly ajar, you consider what you will ask of him.  You gain a few more minutes to think this over as you shake hands with the candidate, exchange pleasantries, apologize for not having had time to review his resume and look blankly down at the sheet of buzzwords and dates on the table before you.

Had you more time to prepare in advance, you might have gone to sites such as Ayende’s blog, or techinterviews.com, and picked up some good questions to ask.  On the other hand, the value of these questions is debatable, as it is not clear that they are a good indicator that the interviewee has actually been doing anything at his last job.  He may have been spending his time browsing these very same sites and preparing his answers by rote.  It is also not clear that understanding these high-level concepts will necessarily make the interviewee good in the role he will eventually be placed in, if hired.

Is understanding how to compile a .NET application with a command line tool necessarily useful in every (or any) real world business development task?  Does knowing how to talk about the observer pattern make him a good candidate for work that does not really involve developing monumental code libraries?  On the other hand, such questions are perhaps a good gauge of the candidate’s level of preparation for the interview, and can be as useful as checking the candidate’s shoes for a good shine to determine how serious he is about the job and what level of commitment he has put into getting ready for it.  And someone who prepares well for an interview will, arguably, also prepare well for his daily job.
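
For reference, the observer pattern those questions have in mind can be sketched in a few lines of C#. The stock-ticker subject below is invented purely for illustration, and in idiomatic .NET one would usually reach for events or IObservable<T> rather than a hand-rolled list of callbacks:

```csharp
using System;
using System.Collections.Generic;

// A made-up subject that notifies registered observers when its value changes.
public class StockTicker
{
    private readonly List<Action<decimal>> observers = new List<Action<decimal>>();

    public void Subscribe(Action<decimal> observer)
    {
        observers.Add(observer);
    }

    public void UpdatePrice(decimal newPrice)
    {
        // Push the new value to every registered observer.
        foreach (var observer in observers)
        {
            observer(newPrice);
        }
    }
}

public class Program
{
    public static void Main()
    {
        var ticker = new StockTicker();

        // The observer registers interest and reacts to each update.
        ticker.Subscribe(price => Console.WriteLine("Price is now " + price));

        ticker.UpdatePrice(42.50m);
    }
}
```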

You might also have gone to Joel Spolsky’s blog and read The Guerrilla Guide To Interviewing in order to discover that what you are looking for is someone who is smart and gets things done.  Which, come to think of it, is especially helpful if you are looking for superstar developers and have the money to pay them whatever they want.  With such a standard, you can easily distinguish between the people who make the cut and all the other maybe candidates.  On the other hand, in the real world, this may not be an option, and your objective may simply be to distinguish between the better maybe candidates and the less-good maybe candidates.  This task is made all the harder since you are interviewing someone who is already a bit nervous and, maybe, has not even been told, yet, what he will be doing in the job (look through computerjobs.com sometime to see how remarkably vague most job descriptions are) for which he is interviewing.

There are many guidelines available online giving advice on how to identify brilliant developers (but is this really such a difficult task?).  What there is a dearth of is information on how to identify merely good developers — the kind that the rest of us work with on a daily basis and may even be ourselves.  Since this is the real purpose of 99.9% of all technical interviews, to find a merely good candidate, following online advice about how to find great candidates may not be particularly useful, and in fact may even be counter-productive, inspiring a sense of inferiority and persecution in a job candidate that is really undeserved and probably unfair.

Perhaps a better guideline for finding candidates can be found not in how we ought to conduct interviews in an ideal world (with unlimited budgets and unlimited expectations), but in how technical interviews are actually conducted in the real world.  Having done my share of interviewing, watching others interview, and occasionally being interviewed myself, it seems to me that in the wild, technical interviews can be broken down into three distinct categories.

Let me, then, impart my experience, so that you may find the interview technique most appropriate to your needs, if you are on that particular side of the table, or, conversely, so that you may better envision what you are in for, should you happen to be on the other side of the table.  There are three typical styles of technical interviewing which I like to call: 1) Jump Through My Hoops, 2) Guess What I’m Thinking, and 3) Knock This Chip Off My Shoulder.

 

Jump Through My Hoops


Jump Through My Hoops is, of course, a technique popularized by Microsoft and later adopted by companies such as Google.  In its classical form, it requires an interviewer to throw his Birkenstock-shod feet over the interview table and fire away with questions that have nothing remotely to do with programming.  Here are a few examples from the archives.  The questions often involve such mundane objects as manhole covers, toothbrushes and car transmissions, but you should feel free to add to this bestiary more philosophical archetypes such as married bachelors, morning stars and evening stars, Cicero and Tully, the author of Waverley, and other priceless gems of the analytic school.  The objective, of course, is not to hire a good car mechanic or sanitation worker, but rather to hire someone with the innate skills to be a good car mechanic or sanitation worker should his IT role ever require it.

Over the years, technical interviewers have expanded on the JTMH with tasks such as writing out classes with pencil and paper, answering technical trivia, designing relational databases on a whiteboard, and plotting out a UML diagram with crayons.  In general, the more accessories required to complete this type of interview, the better.

Some variations of JTMH rise to the level of Jump Through My Fiery Hoops.  One version I was involved with required calling the candidate the day before the job interview and telling him to write a complete software application to specification, which would then be picked apart by a team of architects at the interview itself.  It was a bit of overkill for an entry-level position, but we learned what we needed to out of it.  The most famous JTMFH is what Joel Spolsky calls The Impossible Question, which entails asking a question with no correct answer, and requires the interviewer to frown and shake his head whenever the candidate makes any attempt to answer the question.  This particular test is also sometimes called the Kobayashi Maru, and is purportedly a good indicator of how a candidate will perform under pressure.

 

Guess What I’m Thinking


Guess What I’m Thinking, or GWIT, is a more open ended interview technique.  It is often adopted by interviewers who find JTMH a bit too constricting.  The goal in GWIT is to get through an interview with the minimum amount of preparation possible.  It often takes the form, “I’m working on such-and-such a project and have run into such-and-such a problem.  How would you solve it?”  The technique is most effective when the job candidate is given very little information about either the purpose of the project or the nature of the problem.  This establishes for the interviewer a clear standard for a successful interview: if the candidate can solve in a few minutes a problem that the interviewer has been working on for weeks, then she obviously deserves the job.

A variation of GWIT which I have participated in requires showing a candidate a long printout and asking her, “What’s wrong with this code?”  The trick is to give the candidate the impression that there are many right answers to this question, when in fact there is only one, the one the interviewer is thinking of.  As the candidate attempts to triangulate on the problem with hopeful answers such as “This code won’t compile,” “There is a bracket missing here,” “There are no code comments,” and “Is there a page missing?” the interviewer can sagely reply “No, that’s not what I’m looking for,” “That’s not what I’m thinking of,” “That’s not what I’m thinking of, either,” “Now you’re really cold” and so on.

This particular test is purportedly a good indicator of how a candidate will perform under pressure.

 

Knock This Chip Off My Shoulder


KTCOMS is an interviewing style often adopted by interviewers who not only lack the time and desire to prepare for the interview, but do not in fact have any time for the interview itself.  As the job candidate, you start off in a position of wasting the interviewer’s time, and must improve his opinion of you from there.

The interviewer is usually under a lot of pressure when he enters the interview room.  He has been working 80 hours a week to meet an impossible deadline his manager has set for him.  He is emotionally in a state of both intense technical competence over a narrow area, due to his lifeless existence for the past few months, and great insecurity, as he has not been able to satisfy his management’s demands.

While this interview technique superficially resembles JTMFH, it is actually quite distinct in that, while JTMFH seeks to match the candidate to abstract notions about what a developer ought to know, KTCOMS is grounded in what the interviewer already knows.  His interview style is, consequently, nothing less than a Nietzschean struggle for self-affirmation.  The interviewee is put in the position of having to prove herself superior to the interviewer or else suffer the consequences.

Should you, as the interviewer, want to prepare for KTCOMS, the best thing to do is to start looking up answers to obscure problems that you have encountered in your recent project, and which no normal developer would ever encounter.  These types of questions, along with an attitude that the job candidate should obviously already know the answers, are sure to fluster the interviewee.

As the interviewee, your only goal is to submit to the superiority of the interviewer.  “Lie down” as soon as possible.  Should you feel any umbrage, or desire to actually compete with the interviewer on his own turf, you must crush this instinct.  Once you have submitted to the interviewer (in the wild, dogs generally accomplish this by lying down on the floor with their necks exposed, and the alpha male accepts the submissive gesture by laying its paw upon the submissive animal) he will do one of two things: either he will accept your acquiescence, or he will continue to savage you mercilessly until someone comes in to pull him away.

This particular test is purportedly a good indicator of how a candidate will perform under pressure.

 

Conclusion


I hope you have found this survey of common interviewing techniques helpful.  While I have presented them as distinct styles of interviewing, this should certainly not discourage you from mixing-and-matching them as needed for your particular interview scenario.  The schematism I presented is not intended as prescriptive advice, but merely as a taxonomy of what is already to be found in most IT environments, from which you may draw as you require.  You may, in fact, already be practicing some of these techniques without even realizing it.

sassy.net: Fall Fashions for .NET Programmers


This fall programmers are going to be a little more sassy.  Whereas in the past, trendy branding has involved concepts such as paradigms, patterns and rails, principles such as object-oriented programming, data-driven programming, test-driven programming and model-driven architecture, or tags like web 2.0, web 3.0, e-, i-, xtreme and agile, the new fall line features “alternative” and the prefix of choice: alt-.  The point of this is that programmers who work with Microsoft technologies no longer have to do things the Microsoft way.  Instead, they can do things the “Alternative” way, rather than the “Mainstream” way.  In the concrete, this seems to involve using a lot of open source frameworks like NHibernate that have been ported over from Java … but why quibble when we are on the cusp of a new age.

Personally I think sassy.net is more descriptive, but the alt.net moniker has been cemented by the October 5th alt.net conference.  David Laribee is credited with coining the term earlier this year in this blog post, as well as explicating it in the following way:

What does it mean to be ALT.NET? In short it signifies:

  1. You’re the type of developer who uses what works while keeping an eye out for a better way.
  2. You reach outside the mainstream to adopt the best of any community: Open Source, Agile, Java, Ruby, etc.
  3. You’re not content with the status quo. Things can always be better expressed, more elegant and simple, more mutable, higher quality, etc.
  4. You know tools are great, but they only take you so far. It’s the principles and knowledge that really matter. The best tools are those that embed the knowledge and encourage the principles (e.g. Resharper.)

This is almost identical to my manifesto for sassy.net, except that I included a fifth item about carbon neutrality and a sixth one about loving puppies.  To Dave’s credit, his manifesto is a bit more succinct.

There are several historical influences on this new fall line.  One is the suspicion that new Microsoft technologies have been driven by a desire to sell their programming frameworks rather than to create good tools.  An analogy can be drawn with the development of the QWERTY standard for the English-language keyboard.  Why are the keys laid out the way they are?  One likely possibility is that all the keys required to spell out “t-y-p-e-w-r-i-t-e-r” can be found on the top row, which is very convenient for your typical typewriter salesman.  Several of the RAD (Rapid Application Development — an older fall line that is treated with a level of contempt some people reserve for Capri pants) tools that have come out of Microsoft over the past few years have tended to have a similar quality.  They are good for sales presentations but are not particularly useful for real world development.  Examples that come to mind are the call-back event model for .NET Remoting (the official Microsoft code samples didn’t actually work) and the MSDataSetGenerator, which is great for quickly building a data layer for an existing database, and is almost impossible to tweak or customize for even mildly complex business scenarios.

A second influence is java-envy.  Whereas the java development tools have always emphasized complex architectures and a low-level knowledge of the language, Microsoft development tools have always emphasized fast results and abstracting the low-level details of their language so the developer can get on with his job.  This has meant that while Java projects can take up to two years, after which you are lucky if you have a working code base, Microsoft-based projects are typically up and running in under six months.  You would think that this would make the Microsoft solution the one people want to work with, but in fact, among developers, it has created Java-envy.  The Java developers are doing a lot of denken work, making them a sort of aristocracy in the coding world, whereas the Microsoft programmers are more or less laborers for whom Microsoft has done much of the thinking.

Within the Microsoft world itself, this class distinction has created a sort of mass-migration from VB to C#; these are for the most part equivalent languages, yet VB still has the lingering scent of earth and toil about it.  There are in fact even developers who refuse to use C#, which they see as still a bit prole, and instead prefer to use managed C++.  Whatever, right?

In 2005, this class distinction became codified with the coining of the term Mort, used by Java developers to describe Microsoft developers, by C# developers to describe VB developers, and by VB.NET developers to describe their atavistic VB6 cousins.  You can think of the Morts as Eloi, happily pumping out applications for their businesses, while the much more clever Morlocks plan out coding architectures and frameworks for the next hundred years.  The alt.net movement grows out of the Morlocks, rather than the Morts, and can in turn be sub-divided between those who simply want to distinguish themselves from the mid-level developers, and those who want to work on betterment projects using coding standards and code reviews to bring the Morts up to their own level. (To be fair, most of the alt.net crowd are of the latter variety, rather than the former.)  The alt.net movement sees following Microsoft standards as a sort of serfdom, and would prefer to come up with their own best-practices, and in some cases tools, for building Microsoft-based software.

The third influence on the formation of the alt.net movement is the trend in off-shoring software development.  Off-shoring is based on the philosophy that one piece of software development work is equivalent to another, and implicitly that for a given software requirement, one developer is equivalent to another, given that they know the same technology.  The only difference worth considering, then, is how much money one must spend in order to realize that software requirement.

This has generated a certain amount of soul-searching among developers.  Previously, they had subscribed to the same philosophy, since their usefulness was based on the notion that a piece of software, and implicitly a developer, could do the same work that a roomful of filers (or any other white-collar employees) could do, only more quickly, more efficiently and hence more cheaply.

Off-shoring challenged this self-justification for software developers, and created in its place a new identity politics for developers.  A good developer, now, is not to be judged on what he knows at a given moment in time — that is, he should not be judged on his current productivity — but rather on his potential productivity: his ability to generate better architectures, more elegant solutions, and other better things over the long run that cannot be easily measured.  In other words, third-world developers will always be Morts.  If you want high-end software, you need first-world solutions architects and senior developers.

To solidify this distinction, however, it is necessary to have some sort of certifying mechanism that will clearly distinguish elite developers from mere Mort wannabes.  At this point, the distinction is only self-selecting, and depends on true alt.net developers being able to talk the talk (as well as determining what the talk is going to be).  Who knows, however, what the future may hold.

Some mention should also be made concerning the new fall fashions.  Fifties skirts are back in, and the Grace Kelly look will be prevalent.  Whereas last year saw narrow bottom jeans displacing bell bottoms, for this fall anything goes.  This fall we can once again start mixing colors and patterns, rather than stick to a uniform color for an outfit.  This will make accessorizing much more interesting, though you may find yourself spending more time picking out clothes in the morning, since there are now so many more options.  Finally, V-necks are back.  Scoop-necks are out.

In men’s fashion, making this the fifteenth year in a row, golf shirts and khakis are in.