The Lees and Scum of Bygone Men

 

[Image: chinese_book]

The following is a parable about the difference between theory and practice, which Michael Oakeshott frames as the difference between technical and practical knowledge; it appears as a footnote in Oakeshott's essay Rationalism In Politics.  I find that it has some bearing, which I will discuss in the near future, on certain Internet debates about pedagogy and software programming:

Duke Huan of Ch’i was reading a book at the upper end of the hall; the wheelwright was making a wheel at the lower end.  Putting aside his mallet and chisel, he called to the Duke and asked him what book he was reading.  ‘One that records the words of the Sages,’ answered the Duke.  ‘Are those Sages alive?’ asked the wheelwright.  ‘Oh, no,’ said the Duke, ‘they are dead.’  ‘In that case,’ said the wheelwright, ‘what you are reading can be nothing but the lees and scum of bygone men.’  ‘How dare you, a wheelwright, find fault with the book I am reading.  If you can explain your statement, I will let it pass.  If not, you shall die.’  ‘Speaking as a wheelwright,’ he replied, ‘I look at the matter in this way; when I am making a wheel, if my stroke is too slow, then it bites deep but is not steady; if my stroke is too fast, then it is steady, but it does not go deep.  The right pace, neither slow nor fast, cannot get into the hand unless it comes from the heart.  It is a thing that cannot be put into rules; there is an art in it that I cannot explain to my son.  That is why it is impossible for me to let him take over my work, and here I am at the age of seventy still making wheels.  In my opinion it must have been the same with the men of old.  All that was worth handing on, died with them; the rest, they put in their books.  That is why I said that what you were reading was the lees and scum of bygone men.'”

Chuang Tzu

The Self-Correcting Process

[Image: carnival]

Science is all about making proposals that can be tested (especially after Karl Popper’s formulation of the Falsifiability Criterion), and then undergoing the experience of having that proposal rejected.  This is the essence of any successful process — not that it eliminates errors altogether, but rather that it is able to make corrections despite these errors so that the target need never shift.

Professor Alain Connes recently gave his opinion of Xian-Jin Li's proposed proof of the Riemann Hypothesis — a proof which relies in part on Professor Connes' own work — in a comment on his own blog (by way of Slashdot):

I don't like to be too negative in my comments. Li's paper is an attempt to prove a variant of the global trace formula of my paper in Selecta. The "proof" is that of Theorem 7.3 page 29 in Li's paper, but I stopped reading it when I saw that he is extending the test function h from ideles to adeles by 0 outside ideles and then using Fourier transform (see page 31). This cannot work, since ideles form a set of measure 0 inside adeles (unlike what happens when one only deals with finitely many places).

 

Self-correction extends to other professions, as well.  Scott Hanselman recently posted to correct an opinion he discovered here which he felt required some testing.  Through his own tests, he discovered that nesting a using directive inside a namespace declaration provides no apparent performance benefit over placing it outside the namespace.
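For readers who have not seen the two placements side by side, here is a minimal sketch of what was being compared (the namespace, class, and method names below are invented for illustration; per Hanselman's tests, neither placement performs measurably better than the other):

// Placement 1: the using directive sits at file scope, outside any namespace (the common default).
using System;

namespace OutsideStyle
{
    class Program
    {
        static void Main()
        {
            // Console resolves through the file-scoped using System above.
            Console.WriteLine("Hello from OutsideStyle");
            Console.WriteLine(InsideStyle.Greeter.BuildGreeting());
        }
    }
}

// Placement 2: the same kind of directive nested inside the namespace block.
namespace InsideStyle
{
    using System.Text;

    static class Greeter
    {
        public static string BuildGreeting()
        {
            // StringBuilder resolves through the using System.Text nested inside this namespace.
            StringBuilder sb = new StringBuilder();
            sb.Append("Hello from InsideStyle");
            return sb.ToString();
        }
    }
}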

This leads him to draw these important lessons:

  • Don’t believe everything you read, even on a Microsoft Blog.
  • Don’t believe this blog, either!
  • Decide for yourself with experiments if you need a tiebreaker!

 

The sentiment recalls Ralph Waldo Emerson’s memorable words:

 

There is a time in every man’s education when he arrives at the conviction that envy is ignorance; that imitation is suicide; that he must take himself for better, for worse, as his portion; that though the wide universe is full of good, no kernel of nourishing corn can come to him but through his toil bestowed on that plot of ground which is given to him to till. The power which resides in him is new in nature, and none but he knows what that is which he can do, nor does he know until he has tried.

Trust thyself: every heart vibrates to that iron string. Accept the place the divine providence has found for you, the society of your contemporaries, the connection of events.

 

A similar sentiment is expressed in Hobbes’ Leviathan, though with a wicked edge:

 

And as to the faculties of the mind, setting aside the arts grounded upon words, and especially that skill of proceeding upon general and infallible rules, called science, which very few have and but in few things, as being not a native faculty born with us, nor attained, as prudence, while we look after somewhat else, I find yet a greater equality amongst men than that of strength. For prudence is but experience, which equal time equally bestows on all men in those things they equally apply themselves unto. That which may perhaps make such equality incredible is but a vain conceit of one’s own wisdom, which almost all men think they have in a greater degree than the vulgar; that is, than all men but themselves, and a few others, whom by fame, or for concurring with themselves, they approve. For such is the nature of men that howsoever they may acknowledge many others to be more witty, or more eloquent or more learned, yet they will hardly believe there be many so wise as themselves; for they see their own wit at hand, and other men’s at a distance. But this proveth rather that men are in that point equal, than unequal. For there is not ordinarily a greater sign of the equal distribution of anything than that every man is contented with his share. [emphasis mine]

 

We find it again expressed in Descartes’ Discours de la méthode. Descartes, it might be remembered, occasionally exchanged letters with Hobbes:

 

Le bon sens est la chose du monde la mieux partagée; car chacun pense en être si bien pourvu, que ceux même qui sont les plus difficiles à contenter en toute autre chose n’ont point coutume d’en désirer plus qu’ils en ont.

("Good sense is, of all things in the world, the most equally distributed; for everyone thinks himself so well supplied with it that even those who are the hardest to please in everything else do not usually desire more of it than they already have.")

 

Both Hobbes and Descartes formulate their defense of common sense somewhat ironically.  In a recent post, Steve Yegge takes out the irony (or perhaps takes out the kernel of truth and leaves nothing but the irony) in his argument against Joel Spolsky’s widely acknowledged criteria for a desirable employee: "smart, and gets things done."

According to Yegge, the crux of the problem is this:

 

Unfortunately, smart is a generic enough concept that pretty much everyone in the world thinks [he’s] smart.

So looking for Smart is a bit problematic, since we aren’t smart enough to distinguish it from B.S. The best we can do is find people who we think are smart because they’re a bit like us.

So, like, what kind of people is this Smart, and Gets Things Done adage actually hiring?

 

And yet the self-correcting process continues, on the principle that we are all smart enough, collectively, to solve our problems in the aggregate, even if we can’t solve them as individuals.

Presidential candidate Barack Obama recently held a news conference to correct a misunderstanding created by remarks he had made a few hours earlier about his stance on the Iraq War.  According to CNN:

 

Obama on Thursday denied that he’s shying away from his proposed 16-month phased withdrawal of all combat troops from Iraq, calling it "pure speculation" and adding that his "position has not changed."

However, he told reporters questioning his stance that he will "continue to refine" his policies as warranted.

His comments prompted the Republican National Committee to put out an e-mail saying the presumed Democratic nominee was backing away from his position on withdrawal.

Obama called a second news conference later Thursday to reiterate that he is not changing his position.

 

This is, of course, merely a blip in the history of self-correction.  A more significant one can be found in Bakhtin’s attempt to interpret the works of Rabelais, and to demonstrate (convincingly) that everyone before him misunderstood the father of Gargantua. 

Bakhtin’s analysis of Rabelais in turn brought to light one of the great discoveries of his career: The Carnival — though a colleague once found an earlier reference to the concept in one of Ernst Cassirer’s works.  Against the notion of a careful and steady self-correcting mechanism in history, Bakhtin introduced the metaphor of the Medieval Carnival:

 

The essential principle of grotesque realism is degradation, that is, the lowering of all that is high, spiritual, ideal, abstract; it is a transfer to the material level, to the sphere of earth and body in their indissoluble unity.

Degradation and debasement of the higher do not have a formal and relative character in grotesque realism. "Upward" and "downward" have here an absolute and strictly topographical meaning….Earth is an element that devours, swallows up (the grave, the womb) and at the same time an element of birth, of renascence (the maternal breasts)….Degradation digs a bodily grave for a new birth….To degrade an object does not imply merely hurling it into the void of nonexistence, into absolute destruction, but to hurl it down to the reproductive lower stratum, the zone in which conception and a new birth take place.

 

The Carnival serves to correct inequalities and resentments in society and its subcultures not by setting it upon a surer footing, but rather by affording us an opportunity to air our grievances publicly in a controlled ceremony which allows society and its hierarchical institutions to continue as they are.  It is a release, rather than an adjustment.  A pot party at a rock festival rather than a general strike.

As for the Internet, it is sometimes hard to say what is actually occurring in the back-and-forth that occurs between various blogs.  Have we actually harnessed the wisdom of crowds and created a self-correcting process that responds more rapidly to intellectual propositions, nudging them over a very short time to the correct solution, or have we in fact recreated the Medieval Carnival, a massive gathering of people in one location which breaks down the normal distinctions between wisdom and folly, knowledge and error, competence and foolhardiness? 

Agile Methodology and Promiscuity

[Image: lacan]

The company I am currently consulting with uses Scrum, a kind of Agile methodology.  I like it.  Its main features are index cards taped to a wall and quick "sprints", or development cycles.  Scrum’s most peculiar feature is the notion of a "Scrum Master", which makes me feel dirty whenever I think of it.  It’s so much a part of the methodology, however, that you can even become certified as a "Scrum Master", and people will put it on their business cards.  Besides Scrum, other Agile methodologies include Extreme Programming (XP) and the Rational Unified Process (RUP) which is actually more of a marketing campaign than an actual methodology — but of course you should never ever say that to a RUP practitioner.

The main thing that seems to unify these Agile methodologies is the fact that they are not Waterfall.  And because Waterfall is notoriously unsuccessful, except when it is successful, Agile projects are generally considered to be successful, except when they aren’t.  And when they aren’t, there are generally two explanations that can be given for the lack of success.  First, the flavor of Agile being practiced wasn’t practiced correctly.  Second, the agile methodology was followed too slavishly, when at the heart of agile is the notion that it must be adapted to the particular qualities of a particular project.

In a recent morning stand-up (yet another Scrum feature), the question was raised whether we were following Scrum properly, since it appeared to some that we were introducing XP elements into our project management.  Even before I had a chance to think about it, I found myself appealing to the second explanation of Agile and arguing that it was a danger to apply Scrum slavishly.  Instead, we needed to mix and match to find the right methodology for us.

A sense of shame washed over me even as I said it, as if I were committing some fundamental category mistake.  However, my remarks were accepted as sensible and we moved on.

For days afterward, I obsessed about the cause of my sense of shame.  I finally worked it up into a fairly thorough theory.  I decided that it was rooted in my undergraduate education and the study of Descartes, who claimed that just as a city designed by one man is eminently more rational than one built through aggregation over ages, so the following of a single method, whether right or wrong, will lead to more valid results than philosophizing willy-nilly ever will.  I also thought of how Kant always filled me with a sense of contentment, whereas Hegel, who famously said against Kant that whenever we attempt to draw lines we always find ourselves crossing over them, always left me feeling uneasy and disoriented.  Along with this was the inappropriate (philosophically speaking) recollection that Kant died a virgin, whereas Hegel’s personal life was marked by drunkenness and carousing.  Finally I thought of Nietzsche, whom Habermas characterized as one of the "dark" philosophers for, among other things, insisting that one set of values was as good as another and, even worse, arguing in The Genealogy of Morals that what we consider to be noble in ourselves is in fact base, and what we consider moral weakness is in fact spiritual strength — a transvaluation of all values.  Nietzsche not only crossed the lines, but so thoroughly blurred them that we are still trying to recover them after almost a century and a half.

But lines are important to software developers — we who obsess about interfaces and abhor namespace collisions the way Aristotle claimed nature abhors a vacuum — as if there were nothing worse than the same word meaning two different things.  We are also obsessed with avoiding duplication of code — as if the only thing worse than the same word meaning two different things is the same thing being represented by two different words.  What a reactionary, prescriptivist, neurotic bunch we all are.

This seemed to explain it for me.  I’ve been trained to revere the definition, and to form fine demarcations in my mind.  What could be more horrible, then, than to casually introduce the notion that not only can one methodology be exchanged for another, but that they can be mixed and matched as one sees fit?  Like wearing a brown belt with black shoes, this fundamentally goes against everything I’ve been taught to believe not only about software, but also about the world.  If we allow this one thing, it’s a slippery slope to Armageddon and the complete dissolution of civil society.

Then I recalled Slavoj Zizek’s introduction to one of his books about Jacques Lacan (pictured above), and a slightly different sense of discomfort overcame me.  I quote it in part:

I have always found extremely repulsive the common practice of sharing the main dishes in a Chinese restaurant.  So when, recently, I gave expression to this repulsion and insisted on finishing my plate alone, I became the victim of an ironic "wild psychoanalysis" on the part of my table neighbor: is not this repulsion of mine, this resistance to sharing a meal, a symbolic form of the fear of sharing a partner, i.e., of sexual promiscuity?  The first answer that came to my mind, of course, was a variation on de Quincey’s caution against the "art of murder" — the true horror is not sexual promiscuity but sharing a Chinese dish: "How many people have entered the way of perdition with some innocent gangbang, which at the time was of no great importance to them, and ended by sharing the main dishes in a Chinese restaurant!"

Waterfall and Polygamy

[Image: polygamy]

Methodology is one of those IT topics that generally make my eyes glaze over.  There is currently a hefty thread over on the altdotnet community rehashing the old debates about waterfall vs. agile vs. particular flavors of agile. The topic follows this well-worn pattern: waterfall, which dominated the application development life cycle for so many years, simply didn’t work, so someone had to invent a lightweight methodology like XP to make up for its deficiencies.  But XP also didn’t always work, so it was necessary to come up with other alternatives, like Scrum, Rational, etc., all encapsulated under the rubric "Agile". (Which agile methodology came first is a sub-genre of the which agile methodology should I use super-genre, by the way.)  Both Waterfall and the various flavors of Agile are contrasted against the most common software development methodology, "Cowboy Coding" or "Seat-of-the-pants" programming, which is essentially a lack of structure.  Due to the current common wisdom regarding agile, that one should mix-and-match various agile methodologies until one finds a religion one can love, there is some concern that this is not actually all that distinguishable from cowboy coding. 

For an interesting take on the matter, you should consult Steve Yegge’s classic post, Good Agile, Bad Agile.

I have a friend who, in all other ways, is a well-grounded, rational human being in the engineering field, but when the topic of Druids comes up, almost always at his instigation, I feel compelled to find a reason to leave the room.  The elm, the yew, the mistletoe, the silver sickle: these subjects in close constellation instill in me a sudden case of restless legs syndrome.

Not surprisingly, discussions concerning Methodology give me a similar tingly feeling in my toes.  This post in the altdotnet discussion caught my eye, however:

I also don’t believe it is possible to do the kind of planning waterfall requires on any sufficiently large project. So usually what happens is that changes are made to the plan along the way as assumptions and understandings are changed along the way.

Rather than belittle waterfall methodology as inherently misguided, the author expresses the novel notion that it is simply too difficult to implement.  The fault, in other words, dear Brutus, lies not in our stars, but in ourselves.

Rabbi Gershom Ben Judah, also known as the Light of the Exile, besides being a formidable scholar, is notable for his prohibition of polygamy in the 10th century, a prohibition that applied to all Ashkenazi Jews and was later adopted by Sephardim as well. The prohibition required particular care, since tradition establishes that David, Solomon, and Abraham all had multiple wives.  So why should it be that what was good for the goose is not so for the gander?

Rabbi Gershom’s exegesis in large part rests on this observation: we are not what our forefathers were.  David, Solomon, and Abraham were all great men, with the virtue required to maintain and manage  polygamous households.  However, as everyone knows, virtue tends to become diluted when it flows downhill.  The modern (even the 10th century modern) lacks the requisite wisdom to prevent natural jealousies between rival wives, the necessary stamina to care for all of his wives as they deserve, and the practical means to provide for them.  For the modern to attempt to live as did David, Solomon, or Abraham, would be disastrous personally, and inimical to good order generally.

What giants of virtue must have once walked the earth.  There was a time, it seems, when the various agile methodologies were non-existent and yet large software development projects were completed, all the same.  It is perhaps difficult for the modern software developer to even imagine such a thing, for in our benighted state, stories about waterfall methodology sending men to the moon seem fanciful and somewhat dodgy — something accomplished perhaps during the mythical man month, but not in real time.

Yet it is so.  Much of modern software is built on the accomplishments of people who had nothing more than the waterfall method to work with, and where we are successful, with XP or Scrum or whatever our particular religion happens to be, it is because we stand on the shoulders of giants.

I find that I am not tempted, all the same.  I know my personal shortcomings, and I would no more try to implement a waterfall project than I would petition for a second wife.  I am not the man my forefathers were.

Technical Interview Questions

[Image: flogging]

Interviewing has been on my mind, of late, as my company is in the middle of doing quite a bit of hiring.  Technical interviews for software developers are typically an odd affair, performed by technicians who aren’t quite sure of what they are doing upon unsuspecting job candidates who aren’t quite sure of what they are in for.

Part of the difficulty is the gap between hiring managers, who are cognizant of the fact that they are not in a position to evaluate the skills of a given candidate, and the in-house developers, who are unsure of what they are supposed to be looking for.  Is the goal of a technical interview to verify that the interviewee has the skills she claims to possess on her resume?  Is it to rate the candidate against some ideal notion of what a software developer ought to be?  Is it to connect with a developer on a personal level, thus assuring through a brief encounter that the candidate is someone one will want to work with for the next several years?  Or is it merely to pass the time, in the middle of more pressing work, in order to have a little sport and give job candidates a hard time?

It would, of course, help if the hiring manager were able to give detailed information about the kind of job that is being filled, the job level, perhaps the pay range — but more often than not, all he has to work with is an authorization to hire “a developer”, and he has been tasked with finding the best that can be got within limiting financial constraints.  So again, the onus is upon the developer-cum-interviewer to determine his own goals for this hiring adventure.

Imagine yourself as the technician who has suddenly been handed a copy of a resume and told that there is a candidate waiting in the meeting room.  As you approach the door of the meeting room, hanging slightly ajar, you consider what you will ask of him.  You gain a few more minutes to think this over as you shake hands with the candidate, exchange pleasantries, apologize for not having had time to review his resume and look blankly down at the sheet of buzzwords and dates on the table before you.

Had you more time to prepare in advance, you might have gone to sites such as Ayende’s blog, or techinterviews.com, and picked up some good questions to ask.  On the other hand, the value of these questions is debatable, as it may not be clear that these questions are necessarily a good indicator that the interviewee had actually been doing anything at his last job.  He may have been spending his time browsing these very same sites and preparing his answers by rote.  It is also not clear that understanding these high-level concepts will necessarily make the interviewee good in the role he will eventually be placed in, if hired.

Is understanding how to compile a .NET application with a command line tool necessarily useful in every (or any) real world business development task?  Does knowing how to talk about the observer pattern make him a good candidate for work that does not really involve developing monumental code libraries?  On the other hand, such questions are perhaps a good gauge of the candidate’s level of preparation for the interview, and can be as useful as checking the candidate’s shoes for a good shine to determine how serious he is about the job and what level of commitment he has put into getting ready for it.  And someone who prepares well for an interview will, arguably, also prepare well for his daily job.

You might also have gone to Joel Spolsky’s blog and read The Guerrilla Guide To Interviewing in order to discover that what you are looking for is someone who is smart and gets things done.  Which, come to think of it, is especially helpful if you are looking for superstar developers and have the money to pay them whatever they want.  With such a standard, you can easily distinguish between the people who make the cut and all the other maybe candidates.  On the other hand, in the real world, this may not be an option, and your objective may simply be to distinguish between the better maybe candidates and the less-good maybe candidates.  This task is made all the harder since you are interviewing someone who is already a bit nervous and, maybe, has not even been told, yet, what he will be doing in the job (look through computerjobs.com sometime to see how remarkably vague most job descriptions are) for which he is interviewing.

There are many guidelines available online giving advice on how to identify brilliant developers (but is this really such a difficult task?)  What there is a dearth of is information on how to identify merely good developers — the kind that the rest of us work with on a daily basis and may even be ourselves.  Since this is the real purpose of 99.9% of all technical interviews, to find a merely good candidate, following online advice about how to find great candidates may not be particularly useful, and in fact may even be counter-productive, inspiring a sense of inferiority and persecution in a job candidate that is really undeserved and probably unfair.

Perhaps a better guideline for finding candidates can be found not in how we ought to conduct interviews in an ideal world (with unlimited budgets and unlimited expectations), but in how technical interviews are actually conducted in the real world.  Having done my share of interviewing, watching others interview, and occasionally being interviewed myself, it seems to me that in the wild, technical interviews can be broken down into three distinct categories.

Let me, then, impart my experience, so that you may find the interview technique most appropriate to your needs, if you are on that particular side of the table, or, conversely, so that you may better envision what you are in for, should you happen to be on the other side of the table.  There are three typical styles of technical interviewing which I like to call: 1) Jump Through My Hoops, 2) Guess What I’m Thinking, and 3) Knock This Chip Off My Shoulder.

 

Jump Through My Hoops

[Image: tricks]

Jump Through My Hoops is, of course, a technique popularized by Microsoft and later adopted by companies such as Google.  In its classical form, it requires an interviewer to throw his Birkenstock-shod feet over the interview table and fire away with questions that have nothing remotely to do with programming.  Here are a few examples from the archives.  The questions often involve such mundane objects as manhole covers, toothbrushes and car transmissions, but you should feel free to add to this bestiary more philosophical archetypes such as married bachelors, morning stars and evening stars, Cicero and Tully, the author of Waverley, and other priceless gems of the analytic school.  The objective, of course, is not to hire a good car mechanic or sanitation worker, but rather to hire someone with the innate skills to be a good car mechanic or sanitation worker should his IT role ever require it.

Over the years, technical interviewers have expanded on the JTMH with tasks such as writing out classes with pencil and paper, answering technical trivia, designing relational databases on a whiteboard, and plotting out a UML diagram with crayons.  In general, the more accessories required to complete this type of interview, the better.

Some variations of JTMH rise to the level of Jump Through My Fiery Hoops.  One version I was involved with required calling the candidate the day before the job interview and telling him to write a complete software application to specification, which would then be picked apart by a team of architects at the interview itself.  It was a bit of overkill for an entry-level position, but we learned what we needed to out of it.  The most famous JTMFH is what Joel Spolsky calls The Impossible Question, which entails asking a question with no correct answer, and requires the interviewer to frown and shake his head whenever the candidate makes any attempt to answer the question.  This particular test is also sometimes called the Kobayashi Maru, and is purportedly a good indicator of how a candidate will perform under pressure.

 

Guess What I’m Thinking

[Image: brain]

Guess What I’m Thinking, or GWIT, is a more open ended interview technique.  It is often adopted by interviewers who find JTMH a bit too constricting.  The goal in GWIT is to get through an interview with the minimum amount of preparation possible.  It often takes the form, “I’m working on such-and-such a project and have run into such-and-such a problem.  How would you solve it?”  The technique is most effective when the job candidate is given very little information about either the purpose of the project or the nature of the problem.  This establishes for the interviewer a clear standard for a successful interview: if the candidate can solve in a few minutes a problem that the interviewer has been working on for weeks, then she obviously deserves the job.

A variation of GWIT which I have participated in requires showing a candidate a long printout and asking her, “What’s wrong with this code?”  The trick is to give the candidate the impression that there are many right answers to this question, when in fact there is only one, the one the interviewer is thinking of.  As the candidate attempts to triangulate on the problem with hopeful answers such as “This code won’t compile,” “There is a bracket missing here,” “There are no code comments,” and “Is there a page missing?” the interviewer can sagely reply “No, that’s not what I’m looking for,” “That’s not what I’m thinking of,” “That’s not what I’m thinking of, either,” “Now you’re really cold” and so on.

This particular test is purportedly a good indicator of how a candidate will perform under pressure.

 

Knock This Chip Off My Shoulder

[Image: eveready]

KTCOMS is an interviewing style often adopted by interviewers who not only lack the time and desire to prepare for the interview, but do not in fact have any time for the interview itself.  As the job candidate, you start off in a position of wasting the interviewer’s time, and must improve his opinion of you from there.

The interviewer is usually under a lot of pressure when he enters the interview room.  He has been working 80 hours a week to meet an impossible deadline his manager has set for him.  He is emotionally in a state both of intense technical competence over a narrow area, due to his lifeless existence for the past few months, and of great insecurity, as he has not been able to satisfy his management’s demands.

While this interview technique superficially resembles JTMFH, it is actually quite distinct in that, while JTMFH seeks to match the candidate to abstract notions about what a developer ought to know, KTCOMS is grounded in what the interviewer already knows.  His interview style is, consequently, nothing less than a Nietzschean struggle for self-affirmation.  The interviewee is put in the position of having to prove herself superior to the interviewer or else suffer the consequences.

Should you, as the interviewer, want to prepare for KTCOMS, the best thing to do is to start looking up answers to obscure problems that you have encountered in your recent project, and which no normal developer would ever encounter.  These types of questions, along with an attitude that the job candidate should obviously already know the answers, are sure to fluster the interviewee.

As the interviewee, your only goal is to submit to the superiority of the interviewer.  “Lie down” as soon as possible.  Should you feel any umbrage, or desire to actually compete with the interviewer on his own turf, you must crush this instinct.  Once you have submitted to the interviewer (in the wild, dogs generally accomplish this by lying down on the floor with their necks exposed, and the alpha male accepts the submissive gesture by laying its paw upon the submissive animal) he will do one of two things: either he will accept your acquiescence, or he will continue to savage you mercilessly until someone comes in to pull him away.

This particular test is purportedly a good indicator of how a candidate will perform under pressure.

 

Conclusion

[Image: moderntimes]

I hope you have found this survey of common interviewing techniques helpful.  While I have presented them as distinct styles of interviewing, this should certainly not discourage you from mixing-and-matching them as needed for your particular interview scenario.  The schematism I presented is not intended as prescriptive advice, but merely as a taxonomy of what is already to be found in most IT environments, from which you may draw as you require.  You may, in fact, already be practicing some of these techniques without even realizing it.

Coding is not a Spectator Sport

[Image: matrix]

 

This past Tuesday I attended a Microsoft technology event at a local movie theater.  Ever since The Matrix, movie theaters have apparently become the convenient place to go for technology updates.  If you’ve never been to one of these events, they involve a presenter or two with a laptop connected to the largest movie screen in the building.  The presenters then promote new Microsoft offerings by writing code on the big screen while software programmers who have somehow gotten the afternoon off from their bosses look on.

Jim Wooley presented on LINQ, while a Microsoft employee presented on WCF.  The technologies looked pretty cool, but the presentations were rather dull.  I don’t think this was really the fault of the presenters, though.  The truth is, watching other people code is a bit like watching paint dry, and seems to take longer.  Perhaps this is why pair programming, one of the pillars of extreme programming, has never caught on (failing to document your code, however, another pillar of extreme programming, has been widely adopted and, like Monsieur Jourdain, many developers have found that they’d been doing XP for years without even realizing it). 

Within these constraints — that is that you are basically doing the equivalent of demonstrating how to hammer nails into a board for four hours — the presenters did pretty well, although Mr. Wooley appeared to be somewhat nervous and kept insisting he was doing “Extreme Presenting” whenever he made a coding mistake and the greediest members of the audience would compete with one another to point out his failings.  The Microsoft presenter didn’t encounter any compile errors like Mr. Wooley did, but on the other hand he was following a script and kept referring to it as he typed the code that we were all watching.  Why you should need a script to write uninteresting demo code that ultimately just emits “Hello, world” messages is beyond me, but that’s what he did, and he demonstrated that there could be something even less able to hold the attention than watching someone write code — watching someone write code by rote.

But it is easy to criticize, and in truth I never got to see the presentation on Silverlight given by Shawn Wildermuth (aka “adoguy”), which for all I know may have been much more entertaining and might have undermined my mantra that coding is not a spectator sport, but I’ll never know because I had to skip out on it in order to attend a company dinner.  How I got invited to this dinner I’ll never know, because I wasn’t really very involved in the project that the dinner was intended to celebrate.

I arrived fashionably late by an hour, and as I entered I realized the only seat left was squeezed in between my manager, the CFO of the company and the Senior VP of IT.  This is a dreadful spot to be in, and into that spot I deposited myself.  The problem with being situated next to one’s uppers at a social event is that one spends an inordinate amount of time trying to think of something to say that will impress one’s uppers, while simultaneously trying to avoid saying anything to demonstrate one’s utter unfitness for one’s position.  And here I was next to my boss, who was sitting across from his boss, who was sitting across from his boss.  And as I sat, watching what appeared to be scintillating conversation at the opposite end of the table, my end was completely silent with an air of tension about it.

So I picked up a menu and tried to order.  This was a steak and seafood restaurant, and judging by the prices, approximately twice as good as Longhorn or Outback.  I took the highest priced item, divided the cost by half, and ordered the crawfish pasta with a glass of wine.  Then I sat back to listen to the silence.  Finally someone struck up a conversation about insurance (my industry).  If you want to know how dreadfully dull insurance talk is, it’s a bit like — actually, there is nothing as boring as insurance talk because it is the sine qua non against which all boredom should be judged.  Listening to insurance talk is the sort of thing that makes you want to start cutting yourself for distraction (it’s an old POW trick), and just as I was reaching for the butter knife I found myself telling the jazz story.

The jazz story went over well and seemed to break the ice, so I followed it up with the Berlin mussels story, which was also a hit.  I drank more wine and felt like I was really on a roll.  I’d demonstrated my ability to talk entertainingly around my bosses and as the food arrived I was able to maintain the mood with a jaunty disquisition on men’s fashion and how to select a good hunting dog.  But I grew overconfident.  Over dessert, I decided to play the teacup game, which is a conversation game my friend Conrad at The Varieties had taught me, and it was a disaster.  Apparently I set it up wrong, because a look of disgust formed on the CFO’s face.  My manager tried to save the moment with a distracting story about hygiene, but rather than leave things well enough alone, I decided to continue with the asparagus story, and pretty well ruined the evening.  Oh well.  Bye-bye annual bonus.

Which all goes to show, entertainment is a damnably difficult business.

[Image: eddie]

I can probably improve my dinner conversation by reading a bit more P.G. Wodehouse and a bit less of The New Yorker (which is where I got the fateful asparagus story), but how to improve a Microsoft presentation is a much trickier nut to crack.  How much can you realistically do to dress up watching other people code?

Then again, it is amazing what passes for a spectator sport these days, from Lumberjack Olympics to Dancing with the Stars.  Perhaps one of the strangest cultural trends is the popularity of poker as a spectator sport — something that would have seemed unimaginable back in the day.  The whole thing revolves around a handful of people dressed up in odd combinations of wigs, sunglasses and baseball caps to hide their tells playing a card game that depends largely on luck, partly on a grasp of probabilities, and partly on being able to guess what your opponents are guessing about you.  Is there anything in this jumble of crazy costumes, luck and skill that can be used to improve a typical Microsoft presentation?

The truth is, even skill isn’t so important in creating a successful spectator sport.  Take quiz shows, which once were devoted to very tough questions that left the audience wondering how the contestants could know so much (it turned out, of course, that often they were cheating).  Over time, these shows became simpler and simpler, until we ended up with shows like Are You Smarter Than a 5th Grader (which makes you wonder how they find contestants so dumb) and the very successful Wheel of Fortune (in which you are challenged to list all the letters of the alphabet until a hidden message becomes legible).  Demonstrating skill is not the essence of these games.

If you have ever seen National Lampoon’s Vegas Vacation (fourth in the series, but my personal favorite), you will recall the scene where, after losing a large portion of his life savings at a casino, Chevy Chase is taken by his cousin Eddie to a special place with some non-traditional games of luck such as rock-paper-scissors, what-card-am-I-holding, and pick-a-number-between-one-and-ten.  This, it turns out, is actually the premise of one of the most popular American game shows of the year, Deal Or No Deal, hosted by the failed-comedian-actor-turned-gameshow-host Howie Mandel.  The point of this game is to pick a number between one and twenty-six, which has a one in twenty-six chance of being worth a million dollars.  The beauty of the game is that the quick and the slow, the clever and the dim, all have an equal chance of winning.  The game is a great leveler, and the apparent pleasure for the audience is in seeing how the contestants squirm.

I had initially thought that Mr. Wooley’s palpable nervousness detracted from his presentation, but the more I think about it, the more I am convinced that his error was in not being nervous enough.  The problem with the format of Microsoft presentations is that there is not enough at stake.   A presenter may suffer the indignity of having people point out his coding errors on stage or of having bloggers ask why he needs a script to write a simple demo app — but at the end of the day there are no clear stakes, no clear winners, no clear losers.

The secret of the modern spectator sport — and what makes it fascinating to watch — is that it is primarily about moving money around.  Televised poker, Survivor-style Reality shows, and TV game shows are all successful because they deal with large sums of money and give us an opportunity to see what people will do for it.  Perhaps at some low level, it even succeeds at distracting us from what we are obliged to do for money.

And money is the secret ingredient that would liven up these perfunctory Microsoft events.  One could set a timer for each code demonstration, and oblige the presenter to finish his code — making sure it both compiles and passes automated unit tests — in the prescribed period in order to win a set sum of money.  Even better, audience members can be allowed to compete against the official Microsoft presenters for the prize money.  Imagine the excitement this would generate, the unhelpful hints from the audience members to the competitors, the jeering, the side-bets, the tension, the drama, the spectacle.  Imagine how much more enjoyable these events would be.

Microsoft events are not the only places where money could liven things up, either.  What if winning a televised presidential debate could free up additional dollars to presidential candidates?  What if, along with answering policy questions, we threw in geography and world event questions with prize money attached?  Ratings for our presidential debates might even surpass the ratings for Deal Or No Deal.

Academia would also be a wonderful place to use money as a motivator.  Henry Kissinger is reported to have said that academic battles are so vicious because the stakes are so low.  Imagine how much more vicious we could make them if we suddenly raised the stakes, offering cash incentives for crushing intellectual blows against one’s enemies in the pages of the Journal of the History of Philosophy, or a thousand dollars for each undergraduate ego one destroys with a comment on a term paper.  Up till now, of course, academics have always been willing to do this sort of thing gratis, but consider how much more civilized, and how clearer the motives would be, if we simply injected money into these common occurrences.

Pluralitas non est ponenda sine necessitate

“Plurality should not be posited without necessity.”  — William of Occam

[Image: Ptolemaic]

I added an extra hour to my commute this morning by taking a shortcut.  The pursuit of shortcuts is a common pastime in Atlanta — and no doubt in most metropolitan areas.  My regular path to work involves taking the Ronald Reagan Highway to Interstate 85, and then the 85 to the 285 until I arrive at the northern perimeter.  Nothing could be simpler, and if it weren’t for all the other cars, it would be a really wonderful drive.  But I can’t help feeling that there is a better way to get to my destination, so I head off on my own through the royal road known as “surface streets”.

Cautionary tales like Little Red Riding Hood should tell us all we need to know about taking the path less traveled, yet that has made no difference to me.   Secret routes along surface streets (shortcuts are always a secret of some kind) generally begin with finding a road that more or less turns the nose of one’s car in the direction of one’s job.  This doesn’t last for long, however.  Instead one begins making various turns, right, left, right, left, in an asymptotic route toward one’s destination. 

There are various rules regarding which turns to make, typically involving avoiding various well-known bottlenecks, such as schools and roadwork, avoiding lights, and of course avoiding left turns.  Colleagues and friends are always anxious to tell me about the secret routes they have discovered.  One drew me complex maps laying out the route she has been refining over the past year, with landmarks where she couldn’t remember the street names, and general impressions about the neighborhoods she drove through when she couldn’t recall any landmarks.  This happened to be the route that made me tardy this morning. 

When I told her what had happened to me, the colleague who had drawn me the map apologized for not telling me about a new side-route off of the main route she had recently found (yes, secret routes beget more secret routes) that would have shaved an additional three minutes off of my drive.  Surface streets are the Ptolemaic epicycles of the modern world.

A friend with whom I sometimes commute has a GPS navigation system on his dashboard, which changes the secret route depending on current road conditions.  This often leads us down narrow residential roads that no one else would dream of taking since they wouldn’t know if the road leads to a dead-end or not — but the GPS system knows, of course.  We are a bit slavish about following the advice of the GPS, even when it tells us to turn the trunk of the car toward work and the nose toward the horizon.  One time we drove along a goat path to Macon, Georgia on the advice of the GPS system in order to avoid an accident on S. North Peachtree Road Blvd. 

All this is made much more difficult, of course, due to the strange space-time characteristics of Atlanta which cause two right turns to always take you back to your starting point and two left turns to always dump you into the parking lot of either a Baptist church or a mall.

Various reasons are offered to explain why the Copernican model of the solar system came to replace the Ptolemaic model, including a growing resentment of the Aristotelian system championed by the Roman Catholic Church, resentment against the Roman Catholic Church itself, and a growing revolutionary humanism that wanted to see the Earth and its inhabitants in motion rather than static.  My favorite, however, is the notion that the principle of parsimony was the deciding factor, and that at a certain point people came to realize that simplicity, rather than complexity, is the true hallmark of scientific explanation.

The Ptolemaic system, which places the earth at the center of the universe, with the Sun, planets and heavenly sphere revolving around it, was not able to explain the observed motions of the planets satisfactorily.  We know today that this was due both to having the wrong body placed at the center of the model and to insisting on the primacy of circular motion rather than the elliptical routes the planets actually take.

In particular, Ptolemy was unable to explain the occasionally observed retrogression of the planets, during which these travelers appear to slow down and then go into reverse during their progression through the sky, without resorting to the artifice of epicycles, or mini circles, which the planets would follow even as they were also following their main circular routes through the sky.  Imagine a ferris wheel on which the chairs do more than hang on their fulcrums; they also do loop-de-loops as they move along with the main wheel.  In Ptolemy’s system, not only would the planets travel along epicycles that traveled on the main planetary paths, but sometimes the epicycles would have their own epicycles, and these would beget additional epicycles.  In the end, Ptolemy required 40 epicycles to explain all the observed motions of the planets.
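To make the geometry a little more concrete, here is a minimal parametrization of a single deferent-plus-epicycle pair, with illustrative symbols rather than Ptolemy's actual values: if the deferent has radius R and angular speed \omega_d, and the epicycle riding on it has radius r and angular speed \omega_e, the planet's apparent position relative to a central Earth is

\[
x(t) = R\cos(\omega_d t) + r\cos(\omega_e t), \qquad
y(t) = R\sin(\omega_d t) + r\sin(\omega_e t).
\]

Retrograde motion appears whenever the epicycle's contribution to the velocity momentarily outruns the deferent's, roughly when r\omega_e > R\omega_d; each additional epicycle simply adds another such circular term to the sum.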

Copernicus sought to show that, even if his model did not necessarily exceed the accuracy of Ptolemy’s system, he was nevertheless able to get rid of many of these epicycles simply by positing the Sun at the center of the universe rather than the Earth.  At that moment in history, simplicity, rather than accuracy per se, became the guiding principle in science.  It is a principle with far-reaching ramifications.  Rather than the complex systems of Aristotelian philosophy, with various qualifications and commentaries, the goal of science (in my simplified tale) became the pursuit of simple formulas that would capture the mysteries of the universe.  When Galileo wrote that the book of the universe is written in mathematics, what he really meant is that it is written in a very condensed mathematics and is a very short book, brought down to a level that mere humans can at last contain in their finite minds.

The notion of simplicity is germane not only to astronomy, but also to design.  The success of Apple’s iPod is due not to the many features it offers, but rather to the realization that what people want from their devices is only a small set of features done extremely well.  Simplicity is the manner in which we make notions acceptable and conformable to the human mind.  Why is it that one of the key techniques in legal argumentation is to take a complex notion and reframe it in a metaphor or catchphrase that will resonate with the jurors?  The phrase “If the glove doesn’t fit, you must acquit” was, though bad poetry, rather excellent strategy.  “Separate but not equal” has resonated and guided the American conscience for fifty years.  Joe Camel, for whatever inexplicable reasons, has been shown to be an effective instrument of death.  The paths of the mind are myriad and dark.

Taking a fresh look at the surface streets I have been traveling along, I am beginning to suspect that they do not really save me all that much time.  And even if they do shave a few minutes off my drive, I’m not sure the intricacies of my Byzantine journey are worth all the trouble.  So tomorrow I will return to the safe path, the well known path — along the Ronald Reagan, down the 85, across the top of the 285.  It is simplicity itself.

Yet I know that the draw of surface streets will continue to tug at me on a daily basis, and I fear that in a moment of weakness, while caught behind an SUV or big rig, I may veer off my intended path.  In order to increase the accuracy of his system, even Copernicus was led in the end to add epicycles, and epicycles upon epicycles, to his model of the universe, and by the last pages of On the Revolutions of the Heavenly Spheres found himself juggling a total of 48 epicycles — 8 more than Ptolemy had.

Concerning Facts


In the early chapters of A Study In Scarlet, John H. Watson observes of his friend Sherlock Holmes:



His ignorance was as remarkable as his knowledge.  Of contemporary literature, philosophy and politics he appeared to know next to nothing.  Upon my quoting Thomas Carlyle, he inquired in the naivest way who he might be and what he had done.  My surprise reached a climax, however, when I found incidentally that he was ignorant of the Copernican Theory and of the composition of the Solar System.  That any civilized human being in this nineteenth century should not be aware that the earth travelled round the sun appeared to me to be such an extraordinary fact that I could hardly realize it.



There is a similar strain of incredulity, both within the United States as well as without, when it is observed that a vast number of Americans claim they do not believe in Evolution.  It is a source of such consternation that the beliefs of presidential candidates on this matter are speculated upon and discussed as a sort of key that will reveal the secret heart of these men and women.  Are people who do not believe in Evolution simply of lower intellectual abilities than the rest of us?  Or is it rather that the decision not to believe is an indication of other values, tied together in a web of beliefs, that hinge on certain characteristics which make these people ultimately alien in their thought patterns, radically other in their perception of reality?  Do these people pose a threat to the homogeneity of world view that we take for granted in public discourse?



“You appear to be astonished,” he said, smiling at my expression of surprise.  “Now that I do know it I shall do my best to forget it.”


“To forget it!”


“You see,” he explained, “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things, so that he has a difficulty in laying his hands upon it…. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones.”



If they deny one fact, what else might they deny?  If they hold this set of beliefs, what else might they cleave to?  As a Scientific American article put it,



“Embarrassingly, in the 21st century, in the most scientifically advanced nation the world has ever known, creationists can still persuade politicians, judges and ordinary citizens that evolution is a flawed, poorly supported fantasy. They lobby for creationist ideas such as “intelligent design” to be taught as alternatives to evolution in science classrooms.



“In addition to the theory of evolution, meaning the idea of descent with modification, one may also speak of the fact of evolution.  The NAS defines a fact as “an observation that has been repeatedly confirmed and for all practical purposes is accepted as true …. All sciences frequently rely on indirect evidence. Physicists cannot see subatomic particles directly, for instance, so they verify their existence by watching for telltale tracks that the particles leave in cloud chambers. The absence of direct observation does not make physicists’ conclusions less certain.”


The atheism entry on About.com puts it more baldly:



“Evolutionary theory is the organizing principle for all modern biology – denial of it is like denying relativity in modern physics. The fact of evolution — the fact that allele frequencies change in populations over time — is as undeniable as are the actions of gravity or continental shifts. Despite this, only a third of Americans actually think that evolution is supported by the evidence…. People who don’t “accept” evolution are guilty of very unfortunate ignorance, but it’s probably an understandable ignorance. I wouldn’t be surprised if people were similarly ignorant of other aspects of science. It’s a sign of the great scientific illiteracy of American culture.”



“But the Solar System!” I protested.


“What the deuce is it to me?” he interrupted impatiently: “you say that we go round the sun. If we went round the moon it would not make a pennyworth of difference to me or to my work.”



Alasdair MacIntyre in After Virtue marks this conflation of theory and facts (evolution is a fact, but it is also a theory) as a product of the 17th and 18th centuries.  Empiricism is based on the notion that what we see is what there actually is.  It depends on the belief (or is it fact?) that our experiences are reliable.  Natural science, on the other hand, depends on tools of measurement as the arbiter of what is real.  The telescope tells us more than our unreliable eyes can. 



“…[I]n the measurement of temperature the effect of heat on spirits of alcohol or mercury is given priority over the effect of heat on sunburnt skin or parched throats.”


Just as theory is dependent upon measurement to verify it, so measurement is dependent on theory to justify it.  We require a theory of how heat affects mercury in order to be able to rely on our thermometer.  Yet this is far removed from the notions of common sense and perception which undergird empiricism.



“There is indeed therefore something extraordinary in the coexistence of empiricism and natural science in the same culture, for they represent radically different and incompatible ways of approaching the world.  But in the eighteenth century both could be incorporated and expressed within one and the same world-view.  It follows that that world-view is at its best radically incoherent….”


Out of this notion of the fact, as something both self-evident and obscure at the same time, Max Weber formulated the opposition central to his theorizing, and still central to the modern world view: the fact-value distinction.  Just as a fact has a dual nature, a value also has an inherent ambiguity. It is both a choice and something imposed upon us by society.  In its second form, it is something that can be studied by the social sciences, and consequently can be analyzed to some degree as a fact.  In the first form, it is radically subjective, and as indeterminate as the swerve of Lucretius.


The matter can be framed as something even stranger than that.  In existentialist terms, the choice is something we are always obliged to make, so that the notion of a “factual” value is ultimately false, or worse, inauthentic.  From a scientific viewpoint, on the other hand, choice is illusory, and merely a stand-in for facts we do not yet know.


It is these two terms, as vague as they are, that inform our public discourse.  On the one hand, facts are something we should all agree upon; the substitution of values for facts is considered an act of civil disobedience.  If we can’t agree on the facts, then what can we agree on?  On the other hand, we should not be driven by facts alone.  Is it enough to say that a market economy in the long run is the most efficient way to distribute goods?  What about social justice?  What about idealism?  What about values?



I was on the point of asking him what that work might be, but something in his manner showed me that the question would be an unwelcome one. I pondered over our short conversation, however, and endeavoured to draw my deductions from it. He said that he would acquire no knowledge which did not bear upon his object. Therefore all the knowledge which he possessed was such as would be useful to him.


It is the state of mind of evolution-deniers I find most fascinating.  The more I think about them, the more I long to be one.  They hold the strange position that while they want to leave room in public science education for creationism — hence the insistence on the public avowal that evolution is “only a theory” — they appear to have no desire to actually displace the teaching of evolutionary biology.  Perhaps this is merely strategic, a camel’s nose under the tent.


But what if we take them at their word?  In that case, they want to find a way to make the fact of evolution and the value of creationism exist side-by-side.  They want to take nothing away from evolution to the extent that it is a practical tool that provides technology for them and extends their lives, but they also want to take nothing away from faith to the extent that it provides a reason to live and a way to go about it.  It is a world-view only possible with the construction of the fact-value distinction.  It is a beautiful attempt to reconcile the irreconcilable, and to make possible a plurality of beliefs that should not co-exist.  Still, it is a world-view that is at its best radically incoherent. 



“I don’t like to talk much with people who always agree with me. It is amusing to coquette with an echo for a little while, but one soon tires of it.”


— Thomas Carlyle

48% of Americans Reject Darwinian Evolution

 

A new Newsweek poll reveals frightening data about the curious disjunct between faith and science among Americans.  Pundits have attributed these results to anything from poor science education in pre-K programs to global warming.  According to the poll, while 51% of Americans still subscribe to Darwin’s theory of gradual evolution through adaptation, an amazing 42% continue to cleave to Lamarckism, while only 6% believe in Punctuated Equilibrium. 1% remain uncommitted and are waiting to hear more before they come to a final decision.

This has led me to wonder what else Americans believe:

The 2002 Roper Poll found that 48% of Americans believe in UFOs, while 37% believe that there has been first-hand contact between aliens and humans.  25% of Americans believe in alien abductions, while approximately 33% believe that humans are the only intelligent life in the universe, and that all the UFO stuff is bunk.

The 33% of people who subscribe to the anthropocentric view of the universe corresponds numerically with the 33% of Americans who opposed the recent deadline for troop withdrawal from Iraq (Pew Research Center poll).   According to the Gallup poll, in 1996 33% of Americans thought they would become rich someday.  By 2003, this number had dropped to 31%.  According to a Scripps Howard/Ohio University poll, 33% of the American public suspects that federal officials assisted in the 9/11 terrorist attacks or took no action to stop them so the United States could go to war in the Middle East.  A Harris poll discovered that in 2004, 33% of adult Americans considered themselves Democrats.

Pew says that as of 2004, 33 million American internet users had reviewed or rated something as part of an online rating system.  33 million Americans were living in poverty in 2001, according to the U.S. Census Bureau.  According to Pew, in 2006 33 million Americans had heard of VoIP.  Each year, 33 million Americans use mental health services or services to treat their problems and illnesses resulting from alcohol, inappropriate use of prescription medications, or illegal drugs.  The New York Times says that out of 33 countries, Americans are least likely to believe in evolution.  Researchers estimate that 33% of Americans born in 2000 will develop diabetes.  In the same year, 33 million Americans lost their jobs.

CBS pollsters discovered that 22% of Americans have seen or felt the presence of a ghost.  48% believe in ghosts.  ICR says 48% of Americans oppose embryonic stem-cell research.  CBS finds that 61% support embryonic stem-cell research.  There is no poll data available on whether they believe that embryos used for stem-cell research will one day become ghosts themselves.

82% of Americans believe that global warming is occurring, according to Fox News/Opinion Dynamics.  79% believe people’s behavior has contributed to global warming.  89% do not believe the U.S. government staged or faked the Apollo moon landing, according to Gallup.  Gallup also found that 41% of Americans believe in ESP, 25% believe in Astrology, 20% believe in reincarnation, while only 9% believe in channeling.  A USA TODAY/ABC News/Stanford University Medical Center poll found that 5% of American adults have turned to acupuncture for pain relief.

According to Gallup, 44% of Americans go out of their way to see movies starring Tom Hanks.  34% go out of their way to avoid movies starring Tom Cruise.  Only 18% go out of their way to avoid Angelina Jolie.

philosophia perennis

In the Valentine’s edition of The New Yorker, there was a rather nice portrait by Larissa MacFarquhar of Paul and Pat Churchland, connubial philosophers of the mind-body problem at UC San Diego.  For years they have been voices crying in the wilderness against the way that philosophy of mind was being done without any regard for the experimental data being produced by studies in neurophysiology.  In the article, Pat Churchland says this prevalent approach was the result of Anglo-American common language philosophy, which holds that the object of philosophy is to clarify our ideas by analyzing language. The problem, as she sees it, is that clarifying incorrect notions about the relationship between mind and body does not get us to truth, but rather leads us simply to have sophisticated bad ideas.  The mind-body problem had become a problematic (to borrow from Foucault), when the evidence from neurophysiology was very clear — there is the brain and that’s it.  Everything else is language games.

The article continues on a disappointed note:

These days, many philosophers give Pat credit for admonishing them that a person who wants to think seriously about the mind-body problem has to pay attention to the brain.  But this acknowledgment is not always extended to Pat herself, or to the work she does now.

 

The common language philosophy that Pat Churchland criticizes has its roots in German philosophy and the general post-Kantian diminishing of the relevance of Metaphysics.  The death knell for metaphysics in the 20th century may have arrived with Wittgenstein’s pronouncement in the Tractatus Logico-Philosophicus that “[w]ovon man nicht sprechen kann, darüber muss man schweigen” (“Whereof one cannot speak, thereof one must be silent”).  There are different ways to take this, of course, one of which is to say that, as with the dovetailing of Kant’s first and second critiques, it delimits metaphysics in order to make room for faith (or occultism, or theosophy, or whatever).

The other is that it states what is already well known, that Metaphysics is dead, and there is nothing more to say about her.  But if philosophers can no longer talk about metaphysics, then what shall they talk about?  For years in Anglo-American philosophy, they talked about language.  Instead of the relation between appearance and reality in the world, they talked about appearance and meaning in language.  What the Churchlands found disturbing about this was that it seemed simply to be a way to practice metaphysics underground.  Philosophers could dismiss metaphysics on the one hand, but then reintroduce it in their conversations about language instead — though insisting that all they were doing was discussing how we talk about metaphysical notions, not metaphysics itself.  Like vampire hunters to the rescue (though under-appreciated, as indicated above), the Churchlands moved in and reapplied Wittgenstein’s dictum to this underground metaphysics.  I like to think of them as latter-day versions of Maximus the Confessor, pointing out that the compromise Monothelite Christology was in fact simply the Monophysite heresy under a new guise.  Claiming that Christ has two natures but one will is no better than claiming that he has one nature.  Claiming that mind and body are the same in the world but separated in language is no better than claiming that they are different in the world as well.

The natural endpoint for the Churchlands is, then, to make our language conform to the world, in order to remove these errors of thought.

One afternoon recently, Paul says, he was home making dinner when Pat burst in the door, having come straight from a frustrating faculty meeting. “She said, ‘Paul, don’t speak to me, my serotonin levels have hit bottom, my brain is awash in glucocorticoids, my blood vessels are full of adrenaline, and if it weren’t for my endogenous opiates I’d have driven the car into a tree on the way home.  My dopamine levels need lifting.  Pour me a Chardonnay, and I’ll be down in a minute.'”  Paul and Pat have noticed that it is not just they who talk this way — their students now talk of psychopharmacology as comfortably as of food.

 

But if we cannot do metaphysics, and we should not even talk of it anymore, what should philosophers do with themselves?  Open Court Press may have found an answer with their Popular Culture and Philosophy series.  Not all the books listed below are from their press, but they do emphasize the point that if we cannot speak of metaphysics, that is, if we cannot use philosophy to go beyond what we already know, then we ought to use her instead to explore those things that we are familiar with.  We should practice the perennial philosophy.

  1. The Beatles and Philosophy
  2. Monty Python and Philosophy
  3. U2 and Philosophy
  4. Undead and Philosophy
  5. Bob Dylan and Philosophy
  6. The Simpsons and Philosophy
  7. Harry Potter and Philosophy
  8. Buffy the Vampire Slayer and Philosophy
  9. James Bond and Philosophy
  10. The Sopranos and Philosophy
  11. Star Wars and Philosophy
  12. Baseball and Philosophy
  13. The Matrix and Philosophy
  14. More Matrix and Philosophy
  15. Woody Allen and Philosophy
  16. South Park and Philosophy
  17. The Lord of the Rings and Philosophy
  18. Poker and Philosophy
  19. Hip-Hop and Philosophy
  20. Basketball and Philosophy
  21. Hitchcock and Philosophy
  22. The Atkins Diet and Philosophy
  23. Superheroes and Philosophy
  24. Harley-Davidson and Philosophy
  25. The Grateful Dead and Philosophy
  26. Seinfeld and Philosophy
  27. Mel Gibson’s Passion and Philosophy
  28. The Chronicles of Narnia and Philosophy
  29. Bullshit and Philosophy
  30. Johnny Cash and Philosophy