$5 eBooks from Packt

The technical publisher Packt is offering eBooks for $5 through January 6th, 2015 as a holiday promotion. I encourage you to look very carefully through their selection and see what appeals. If you have time to read on, however, I’d like to explain in greater detail my mixed feelings about Packt (this was probably not the marketing department’s intention when they sent me an email asking me to publicize the promotion but I think it will ultimately be helpful to them).

Packt Publishing has always been hit or miss for me. They are typically much more adventurous in their choice of computer book topics than other publishers like Apress or O’Reilly (Apress is my publisher, by the way, and they are pretty fantastic to work with and very professional). At the same time, I have the impression that Packt’s bar for accepting authors tends to be lower than other publishers’, which allows them to be prolific in their offerings but also means that they produce, quite honestly, some clunkers.

A specific example of one of their clunkers would be the Packt book Unity iOS Game Development Beginner’s Guide by Greg Pierce. The topic sounds great (at least it did to me) but it turns out the book mostly just copies from publicly available documentation.

To quote from one of the Amazon reviews from 2012 by C Toussieng:

“This book is unbelievably bad. What specifically? All of it. It takes information which can be easily garnered from the Unity and/or Apple websites, distills it down to a minimally useful amount, then charges you for it.”

And this one from 2012 by JasonR:

“The book basically covers a few pages of the Unity docs, then goes into 3rd party plugins they recommend, each plugin gets a couple of pages. Frankly, a simple search on Google will give you more insight.”

This is a shame since, even as more and more learning material appears on the Internet, much of it free, displacing technical books from their traditional place in the software ecosystem, there is still an important role for print books (and their digital equivalent, the eBook). Online material can be turned out quickly, and a given piece often covers only a fifth to a tenth of what a chapter of a traditionally published book does, but it tends to lack the cohesiveness that is only possible in a work that has taken months to write and rewrite. A 300-page software book is a distillation of experience which has undergone multiple revisions and fact checking. A really good software book tries to tell a story.

The flip side, of course, is that modern technical books quickly become outdated while technical blog posts simply disappear. All in all, though, I find that sitting down with a book that tries to explain the broader impact of a given technology serves a different and more important purpose than a web tutorial that only shows how to perform streamlined – and often idealized – tasks.

Apropos of the thesis that good software books are distillations of years of experience – we could even say distillations of 10,000 hours of experience – I’d like to point you to some of the gems I’ve discovered through Packt Publishing over the years.

All of the Packt OpenCV books are interesting. I’m particularly fond of Mastering OpenCV with Practical Computer Vision Projects by Daniel Lélis Baggio, but I think all of them – at least the ones I’ve read – are pretty good. Daniel’s bio says that he “…started his works in computer vision through medical image processing at InCor (Instituto do Coração – Heart Institute) in São Paulo, Brazil.”

Another great one is Mastering openFrameworks: Creative Coding Demystified by Denis Perevalov. According to his bio, Denis is a computer vision research scientist at the Ural Branch of the Russian Academy of Sciences and co-author of two Russian patents on robotics.

One I really like simply because the topic is so specific is Kenny Lammers’ Unity Shaders and Effects Cookbook. His bio states that Kenny has been in the game industry for 13 years working for companies like “… Microsoft, Activision, and the late Surreal Software.”

I hope a theme is emerging here. The people who write these books actually have a lot of experience and are trying to pass their knowledge on to you in something more than easily digestible exercises. Best of all – ignoring the example from above – the material is typically highly original. It isn’t copied and pasted from 20 other websites covering the same material. Instead, the reader gets an opinionated and distinct take on the technology covered in each of these books.

What I especially appreciate about the $5 promotions Packt occasionally surfaces is that, for five dollars, you aren’t really obligated to try to read the entire book to get your money’s worth. I’ve taken advantage of similar deals in the past to simply read very specific chapters that are of interest to me such as Basic heads-up-display with custom GUI from Dr. Sebastian Koenig’s Unity for Architectural Visualization or Lighting and Rendering from Jen Rizzo’s Cinema 4D Beginner’s Guide. It’s also a great price when all I want to do is to skim a book on a topic I know pretty well in order to find out if there are any holes in my knowledge. Mastering Leap Motion by Brandon Sanders was extremely helpful for this and, indeed, there were holes in my knowledge.

According to his biography, by the way, Brandon is “… an 18-year-old roboticist who spends much of his time designing, building, and programming new and innovative systems, including simulators, autonomous coffee makers, and robots for competition. At present, he attends Gilbert Finn Polytechnic (which is a homeschool) as he prepares for college. He is the founder and owner of Mechakana Systems, a website and company devoted to robotic systems and solutions.”

I Can Haz the Unconscious?

[image: pet therapy]

The scientific method is one of the great wonders of deliberative thought.  It isn’t just our miraculous modern world that is built upon it, but also our confidence in rationality in general.  It is for this reason that we are offended on a visceral level by all sorts of climate change deniers, creationists, birthers, conspiracy theorists and the constant string of yahoos who seem to pop up using the trappings of rationality to deny the results of the scientific method and basic common sense.

It is so much worse, however, when the challenge to the scientific method comes from within.  Dr. Yoshitaka Fujii has been unmasked as perhaps one of the greatest purveyors of made-up data in scientific experimentation, and while the peer review process seems to have finally caught him out, he still had a nearly 20-year run and some 200 journal articles credited to him.  Diederik Stapel is another prominent scientific fraudster whose activities put run-of-the-mill journalistic fraudsters like Jayson Blair to shame.

Need we even bring up the demotion of Pluto, the proposed removal of narcissistic personality disorder as a diagnosis in the DSM V (narcissists were sure this was an intentional slight against them in particular), or the little-known difficulty of predicting Italian earthquakes (seven members of the National Commission for the Forecast and Prevention of Major Risks in Italy were convicted of manslaughter for not forecasting and preventing a major seismological event)?

It’s the sort of thing that gives critics ammo when they want to discredit scientific findings like the climatological “hockey stick” graph produced by Michael Mann and his colleagues (and so nicknamed by Jerry Mahlman).  And the great tragedy isn’t that we reach a stage where we no longer believe in the scientific method, but that we now believe in any scientific method.  Everyone can choose their own scientific facts to believe in, and the general opinion prevails that incompatible scientific positions need not be resolved through experimentation but rather through politics.

Unconscious Thought Theory is now the object of similar reconsiderations.  A Malcolm Gladwell pet theory based on the experiments of Ap Dijksterhuis, Unconscious Thought Theory posits that we simply perform certain cognitive activities better when we are not actively cognizing.  As a software programmer, I am familiar with this phenomenon in terms of “sleep coding”.  If I am working all day on a difficult problem, I will sometimes have dreams about coding in my sleep and wake up the next morning with a solution.  When I arrive back at work, it effectively takes me only a few minutes to finish typing into my IDE a routine that I’d spent a day or several days trying to crack.

I am a firm believer in this phenomenon and, as they say in late night infomercials, “it really works!”  I even build a certain amount of sleep coding into my programming estimates these days.  A project may take three days of conscious effort, one night of sleep, and then an additional five minutes to code up.  Sometimes the best thing to do when a problem seems insurmountable is simply to fire up the Internets, watch some cat videos and lolcatz the unconscious.

Imagine also how salvific the notion of a powerful unconscious is following the recent series of financial crises.  At the first level, the interpretation of financial debacles blames excessive greed for our current problems (second great depression and all that jazz).  But that’s so 1980s Gordon Gekko.  A deeper interpretation holds that the problem comes down to falsely assuming that in economic matters we are rational actors – an observation that has given birth (or at least a second wind) to the field of behavioral economics.

[image: I can haz Asimo]

Lots of cool counter-factual papers and books about how remarkably irrational the consumer is have come out of this movement.  The coolest claim has got to be not only that we are much more irrational than we think, but that our irrational unconscious selves are much more capable than our conscious selves are.  It’s a bit like the end of Isaac Asimov’s I, Robot (spoilers ahead) where, after working out all the issues with robots, someone discovers that things are just going too smoothly in the world and comes to the realization that humans are not smart enough to end wars and cure diseases like this.  After some investigation, the intrepid hero discovers that our benign computer systems have taken over the running of the world and haven’t told us because they don’t want to freak us out about it.  They want us to go on thinking that we are still in charge and to feel good about ourselves.  It’s a dis-dystopian ending of sorts.

As I mentioned, however, Unconscious Thought Theory is now being discredited.  One of the rules of the scientific method is that with experiments, they gots to be reproducible, and Dijksterhuis’s do not appear to be.  Multiple attempts have failed to replicate Dijksterhuis’s “priming effect” experiments, which used social priming techniques (for instance, having someone think about a professor or a football hooligan before an exam) and then checked whether exam scores correlated with the type of priming that had happened.  There’s a related social priming experiment by someone else, also not reproducible, that seemed to show that exposing people to notions about aging and old people would make them walk more slowly.  The failure to replicate and verify the findings of Dijksterhuis’s social priming experiments leads one inevitably to conclude that Dijksterhuis’s other experiments promoting Unconscious Thought Theory are likewise questionable.

[image: a big friggin' eye full of clouds]

On the other hand, that’s exactly what a benevolent, intelligent, all-powerful, collective supra-unconscious would want us to think.  Consider that if Dijksterhuis is correct about the unconscious being, in many circumstances, basically smarter at complex thinking activities than our conscious minds are, then the last thing this unconscious would want is for us to suddenly start being conscious of it.  It works behind the scenes, after all.

When we find the world too difficult to understand, we are expected to give up and miraculously, after a good night’s sleep, the unconscious provides us with solutions.  How many scientific eureka moments throughout history have come about this way?  How many of our greatest technological discoveries are driven by humanity’s collective unconscious working carefully and tirelessly behind the scenes while we sleep?  Who, after all, made all those cat videos to distract us from psychological experiments on the power of the unconscious while the busy work of running the world was being handled by others?  Who created YouTube to host all of those videos?  Who invented the Internet – and why?

The Lees and Scum of Bygone Men

 

[image: chinese_book]

The following is a parable about the difference between theory and practice, which Michael Oakeshott frames as the difference between technical and practical knowledge.  It appears as a footnote in his essay Rationalism in Politics.  I find that it has some bearing, which I will discuss in the near future, on certain Internet debates about pedagogy and software programming:

“Duke Huan of Ch’i was reading a book at the upper end of the hall; the wheelwright was making a wheel at the lower end.  Putting aside his mallet and chisel, he called to the Duke and asked him what book he was reading.  ‘One that records the words of the Sages,’ answered the Duke.  ‘Are those Sages alive?’ asked the wheelwright.  ‘Oh, no,’ said the Duke, ‘they are dead.’  ‘In that case,’ said the wheelwright, ‘what you are reading can be nothing but the lees and scum of bygone men.’  ‘How dare you, a wheelwright, find fault with the book I am reading.  If you can explain your statement, I will let it pass.  If not, you shall die.’  ‘Speaking as a wheelwright,’ he replied, ‘I look at the matter in this way; when I am making a wheel, if my stroke is too slow, then it bites deep but is not steady; if my stroke is too fast, then it is steady, but it does not go deep.  The right pace, neither slow nor fast, cannot get into the hand unless it comes from the heart.  It is a thing that cannot be put into rules; there is an art in it that I cannot explain to my son.  That is why it is impossible for me to let him take over my work, and here I am at the age of seventy still making wheels.  In my opinion it must have been the same with the men of old.  All that was worth handing on, died with them; the rest, they put in their books.  That is why I said that what you were reading was the lees and scum of bygone men.’”

Chuang Tzu

The Self-Correcting Process

[image: carnival]

Science is all about making proposals that can be tested (especially after Karl Popper’s formulation of the Falsifiability Criterion), and then undergoing the experience of having that proposal rejected.  This is the essence of any successful process — not that it eliminates errors altogether, but rather that it is able to make corrections despite these errors so that the target need never shift.

Professor Alain Connes recently gave his opinion of Xian-Jin Li’s proof of the Riemann Hypothesis — a proof which relies in part on Professor Connes’ work — in a comment on his own blog (by way of Slashdot):

I dont like to be too negative in my comments. Li’s paper is an attempt to prove a variant of the global trace formula of my paper in Selecta. The "proof" is that of Theorem 7.3 page 29 in Li’s paper, but I stopped reading it when I saw that he is extending the test function h from ideles to adeles by 0 outside ideles and then using Fourier transform (see page 31). This cannot work and ideles form a set of measure 0 inside adeles (unlike what happens when one only deals with finitely many places).

 

Self-correction extends to other professions as well.  Scott Hanselman recently posted to correct an opinion he discovered here, one which he felt required some testing.  Through his own tests, he discovered that nesting a using directive inside a namespace declaration provides no apparent performance benefit over placing it outside the namespace.
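For anyone who has not run into the two placements in question, here is a minimal C# sketch of the sort of thing being compared; the namespace, class name, and messages are mine, purely for illustration, and this is not Hanselman’s actual benchmark code:

    using System;                      // placement 1: directive outside the namespace

    namespace UsingPlacementDemo
    {
        using System.Text;             // placement 2: directive nested inside the namespace

        public static class Program
        {
            public static void Main()
            {
                // Both placements are resolved entirely at compile time, which is
                // why no runtime difference shows up in measurements like these.
                StringBuilder sb = new StringBuilder();   // found via the nested using
                sb.Append("Hello from either placement");
                Console.WriteLine(sb.ToString());         // found via the outer using
            }
        }
    }

The only practical difference between the two placements is how the compiler resolves type names inside that namespace, so the absence of a measurable performance gap is just what one would expect.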

His tests led him to draw these important lessons:

  • Don’t believe everything you read, even on a Microsoft Blog.
  • Don’t believe this blog, either!
  • Decide for yourself with experiments if you need a tiebreaker!

 

The sentiment recalls Ralph Waldo Emerson’s memorable words:

 

There is a time in every man’s education when he arrives at the conviction that envy is ignorance; that imitation is suicide; that he must take himself for better, for worse, as his portion; that though the wide universe is full of good, no kernel of nourishing corn can come to him but through his toil bestowed on that plot of ground which is given to him to till. The power which resides in him is new in nature, and none but he knows what that is which he can do, nor does he know until he has tried.

Trust thyself: every heart vibrates to that iron string. Accept the place the divine providence has found for you, the society of your contemporaries, the connection of events.

 

A similar sentiment is expressed in Hobbes’ Leviathan, though with a wicked edge:

 

And as to the faculties of the mind, setting aside the arts grounded upon words, and especially that skill of proceeding upon general and infallible rules, called science, which very few have and but in few things, as being not a native faculty born with us, nor attained, as prudence, while we look after somewhat else, I find yet a greater equality amongst men than that of strength. For prudence is but experience, which equal time equally bestows on all men in those things they equally apply themselves unto. That which may perhaps make such equality incredible is but a vain conceit of one’s own wisdom, which almost all men think they have in a greater degree than the vulgar; that is, than all men but themselves, and a few others, whom by fame, or for concurring with themselves, they approve. For such is the nature of men that howsoever they may acknowledge many others to be more witty, or more eloquent or more learned, yet they will hardly believe there be many so wise as themselves; for they see their own wit at hand, and other men’s at a distance. But this proveth rather that men are in that point equal, than unequal. For there is not ordinarily a greater sign of the equal distribution of anything than that every man is contented with his share. [emphasis mine]

 

We find it again expressed in Descartes’ Discours de la méthode. Descartes, it might be remembered, occasionally exchanged letters with Hobbes:

 

Le bon sens est la chose du monde la mieux partagée; car chacun pense en être si bien pourvu, que ceux même qui sont les plus difficiles à contenter en toute autre chose n’ont point coutume d’en désirer plus qu’ils en ont.  (Good sense is, of all things in the world, the most equally distributed; for everyone thinks himself so well supplied with it that even those who are the hardest to please in everything else do not usually desire more of it than they already have.)

 

Both Hobbes and Descartes formulate their defense of common sense somewhat ironically.  In a recent post, Steve Yegge takes out the irony (or perhaps takes out the kernel of truth and leaves nothing but the irony) in his argument against Joel Spolsky’s widely acknowledged criteria for a desirable employee: "smart, and gets things done."

According to Yegge, the crux of the problem is this:

 

Unfortunately, smart is a generic enough concept that pretty much everyone in the world thinks [he’s] smart.

So looking for Smart is a bit problematic, since we aren’t smart enough to distinguish it from B.S. The best we can do is find people who we think are smart because they’re a bit like us.

So, like, what kind of people is this Smart, and Gets Things Done adage actually hiring?

 

And yet the self-correcting process continues, on the principle that we are all smart enough, collectively, to solve our problems in the aggregate, even if we can’t solve them as individuals.

Presidential candidate Barack Obama recently held a news conference to correct a misunderstanding created by remarks he had made a few hours earlier about his stance on the Iraq War.  According to CNN:

 

Obama on Thursday denied that he’s shying away from his proposed 16-month phased withdrawal of all combat troops from Iraq, calling it "pure speculation" and adding that his "position has not changed."

However, he told reporters questioning his stance that he will "continue to refine" his policies as warranted.

His comments prompted the Republican National Committee to put out an e-mail saying the presumed Democratic nominee was backing away from his position on withdrawal.

Obama called a second news conference later Thursday to reiterate that he is not changing his position.

 

This is, of course, merely a blip in the history of self-correction.  A more significant one can be found in Bakhtin’s attempt to interpret the works of Rabelais, and to demonstrate (convincingly) that everyone before him misunderstood the father of Gargantua. 

Bakhtin’s analysis of Rabelais in turn brought to light one of the great discoveries of his career: The Carnival — though a colleague once found an earlier reference to the concept in one of Ernst Cassirer’s works.  Against the notion of a careful and steady self-correcting mechanism in history, Bakhtin introduced the metaphor of the Medieval Carnival:

 

The essential principle of grotesque realism is degradation, that is, the lowering of all that is high, spiritual, ideal, abstract; it is a transfer to the material level, to the sphere of earth and body in their indissoluble unity.

Degradation and debasement of the higher do not have a formal and relative character in grotesque realism. "Upward" and "downward" have here an absolute and strictly topographical meaning….Earth is an element that devours, swallows up (the grave, the womb) and at the same time an element of birth, of renascence (the maternal breasts)….Degradation digs a bodily grave for a new birth….To degrade an object does not imply merely hurling it into the void of nonexistence, into absolute destruction, but to hurl it down to the reproductive lower stratum, the zone in which conception and a new birth take place.

 

The Carnival serves to correct inequalities and resentments in society and its subcultures not by setting society upon a surer footing, but rather by affording us an opportunity to air our grievances publicly in a controlled ceremony which allows society and its hierarchical institutions to continue as they are.  It is a release, rather than an adjustment.  A pot party at a rock festival rather than a general strike.

As for the Internet, it is sometimes hard to say what is actually occurring in the back-and-forth that occurs between various blogs.  Have we actually harnessed the wisdom of crowds and created a self-correcting process that responds more rapidly to intellectual propositions, nudging them over a very short time to the correct solution, or have we in fact recreated the Medieval Carnival, a massive gathering of people in one location which breaks down the normal distinctions between wisdom and folly, knowledge and error, competence and foolhardiness? 

Agile Methodology and Promiscuity

[image: Jacques Lacan]

The company I am currently consulting with uses Scrum, a kind of Agile methodology.  I like it.  Its main features are index cards taped to a wall and quick "sprints", or development cycles.  Scrum’s most peculiar feature is the notion of a "Scrum Master", which makes me feel dirty whenever I think of it.  It’s so much a part of the methodology, however, that you can even become certified as a "Scrum Master", and people will put it on their business cards.  Besides Scrum, other Agile methodologies include Extreme Programming (XP) and the Rational Unified Process (RUP) which is actually more of a marketing campaign than an actual methodology — but of course you should never ever say that to a RUP practitioner.

The main thing that seems to unify these Agile methodologies is the fact that they are not Waterfall.  And because Waterfall is notoriously unsuccessful, except when it is successful, Agile projects are generally considered to be successful, except when they aren’t.  And when they aren’t, there are generally two explanations that can be given for the lack of success.  First, the flavor of Agile being practiced wasn’t practiced correctly.  Second, the agile methodology was followed too slavishly, when at the heart of agile is the notion that it must be adapted to the particular qualities of a particular project.

In a recent morning stand up (yet another Scrum feature) the question was raised about whether we were following Scrum properly, since it appeared to some that we were introducing XP elements into our project management.  Even before I had a chance to think about it, I found myself appealing to the second explanation of Agile and arguing that it was a danger to apply Scrum slavishly.  Instead, we needed to mix and match to find the right methodology for us.

A sense of shame washed over me even as I said it, as if I were committing some fundamental category mistake.  However, my remarks were accepted as sensible and we moved on.

For days afterward, I obsessed about the cause of my sense of shame.  I finally worked it up into a fairly thorough theory.  I decided that it was rooted in my undergraduate education and the study of Descartes, who claimed that just as a city designed by one man is eminently more rational than one built through aggregation over ages, so the following of a single method, whether right or wrong, will lead to more valid results than philosophizing willy-nilly ever will.  I also thought of how Kant always filled me with a sense of contentment, whereas Hegel, who famously said against Kant that whenever we attempt to draw lines we always find ourselves crossing over them, always left me feeling uneasy and disoriented.  Along with this was the inappropriate (philosophically speaking) recollection that Kant died a virgin, whereas Hegel’s personal life was marked by drunkenness and carousing.  Finally I thought of Nietzsche, whom Habermas characterized as one of the "dark" philosophers for, among other things, insisting that one set of values was as good as another and, even worse, arguing in The Genealogy of Morals that what we consider to be noble in ourselves is in fact base, and what we consider moral weakness is in fact spiritual strength — a transvaluation of all values.  Nietzsche not only crossed the lines, but so thoroughly blurred them that we are still trying to recover them after almost a century and a half.

But lines are important to software developers — we who obsess about interfaces and abhor namespace collisions the way Aristotle claimed nature abhors a vacuum — as if there were nothing worse than the same word meaning two different things.  We are also obsessed with avoiding duplication of code — as if the only thing worse than the same word meaning two different things is the same thing being represented by two different words.  What a reactionary, prescriptivist, neurotic bunch we all are.

This seemed to explain it for me.  I’ve been trained to revere the definition, and to form fine demarcations in my mind.  What could be more horrible, then, than to casually introduce the notion that not only can one methodology be exchanged for another, but that they can be mixed and matched as one sees fit?  Like wearing a brown belt with black shoes, this fundamentally goes against everything I’ve been taught to believe not only about software, but also about the world.  If we allow this one thing, it’s a slippery slope to Armageddon and the complete dissolution of civil society.

Then I recalled Slavoj Zizek’s introduction to one of his books about Jacques Lacan (pictured above), and a slightly different sense of discomfort overcame me.  I quote it in part:

I have always found extremely repulsive the common practice of sharing the main dishes in a Chinese restaurant.  So when, recently, I gave expression to this repulsion and insisted on finishing my plate alone, I became the victim of an ironic "wild psychoanalysis" on the part of my table neighbor: is not this repulsion of mine, this resistance to sharing a meal, a symbolic form of the fear of sharing a partner, i.e., of sexual promiscuity?  The first answer that came to my mind, of course, was a variation on de Quincey’s caution against the "art of murder" — the true horror is not sexual promiscuity but sharing a Chinese dish: "How many people have entered the way of perdition with some innocent gangbang, which at the time was of no great importance to them, and ended by sharing the main dishes in a Chinese restaurant!"

Waterfall and Polygamy

[image: polygamy]

Methodology is one of those IT topics that generally make my eyes glaze over.  There is currently a hefty thread over on the altdotnet community rehashing the old debates about waterfall vs. agile vs. particular flavors of agile. The topic follows this well-worn pattern: waterfall, which dominated the application development life cycle for so many years, simply didn’t work, so someone had to invent a lightweight methodology like XP to make up for its deficiencies.  But XP also didn’t always work, so it was necessary to come up with other alternatives, like Scrum, Rational, etc., all encapsulated under the rubric "Agile". (Which agile methodology came first is a sub-genre of the which agile methodology should I use super-genre, by the way.)  Both Waterfall and the various flavors of Agile are contrasted against the most common software development methodology, "Cowboy Coding" or "Seat-of-the-pants" programming, which is essentially a lack of structure.  Due to the current common wisdom regarding agile, that one should mix-and-match various agile methodologies until one finds a religion one can love, there is some concern that this is not actually all that distinguishable from cowboy coding. 

For an interesting take on the matter, you should consult Steve Yegge’s classic post, Good Agile, Bad Agile.

I have a friend who, in all other ways, is a well-grounded, rational human being in the engineering field, but when the topic of Druids comes up, almost always at his instigation, I feel compelled to find a reason to leave the room.  The elm, the yew, the mistletoe, the silver sickle: these subjects in close constellation instill in me a sudden case of restless legs syndrome.

Not surprisingly, discussions concerning Methodology give me a similar tingly feeling in my toes.  This post in the altdotnet discussion caught my eye, however:

I also don’t believe it is possible to do the kind of planning waterfall requires on any sufficiently large project. So usually what happens is that changes are made to the plan along the way as assumptions and understandings are changed along the way.

Rather than belittle waterfall methodology as inherently misguided, the author expresses the novel notion that it is simply too difficult to implement.  The fault, in other words, dear Brutus, lies not in our stars, but in ourselves.

Rabbi Gershom Ben Judah, also known as the Light of the Exile, besides being a formidable scholar, is also notable for his prohibition of polygamy in the 10th century, a prohibition that applied to all Ashkenazi Jews, and which was later adopted by Sephardim as well. The prohibition required particular care, since tradition establishes that David, Solomon, and Abraham all had multiple wives.  So why should it be that what was good for the goose is not so for the gander?

Rabbi Gershom’s exegesis in large part rests on this observation: we are not what our forefathers were.  David, Solomon, and Abraham were all great men, with the virtue required to maintain and manage  polygamous households.  However, as everyone knows, virtue tends to become diluted when it flows downhill.  The modern (even the 10th century modern) lacks the requisite wisdom to prevent natural jealousies between rival wives, the necessary stamina to care for all of his wives as they deserve, and the practical means to provide for them.  For the modern to attempt to live as did David, Solomon, or Abraham, would be disastrous personally, and inimical to good order generally.

What giants of virtue must have once walked the earth.  There was a time, it seems, when the various agile methodologies were non-existent and yet large software development projects were completed, all the same.  It is perhaps difficult for the modern software developer to even imagine such a thing, for in our benighted state, stories about waterfall methodology sending men to the moon seem fanciful and somewhat dodgy — something accomplished perhaps during the mythical man month, but not in real time.

Yet it is so.  Much of modern software is built on the accomplishments of people who had nothing more than the waterfall method to work with, and where we are successful, with XP or Scrum or whatever our particular religion happens to be, it is because we stand on the shoulders of giants.

I find that I am not tempted, all the same.  I know my personal shortcomings, and I would no more try to implement a waterfall project than I would petition for a second wife.  I am not the man my forefathers were.

Technical Interview Questions

[image: flogging]

Interviewing has been on my mind, of late, as my company is in the middle of doing quite a bit of hiring.  Technical interviews for software developers are typically an odd affair, performed by technicians who aren’t quite sure of what they are doing upon unsuspecting job candidates who aren’t quite sure of what they are in for.

Part of the difficulty is the gap between hiring managers, who are cognizant of the fact that they are not in position to evaluate the skills of a given candidate, and the in-house developers, who are unsure of what they are supposed to be looking for.  Is the goal of a technical interview to verify that the interviewee has the skills she claims to possess on her resume?  Is it to rate the candidate against some ideal notion of what a software developer ought to be?  Is it to connect with a developer on a personal level, thus assuring through a brief encounter that the candidate is someone one will want to work with for the next several years?  Or is it merely to pass the time, in the middle of more pressing work, in order to have a little sport and give job candidates a hard time?

It would, of course, help if the hiring manager were able to give detailed information about the kind of job that is being filled, the job level, perhaps the pay range — but more often than not, all he has to work with is an authorization to hire “a developer”, and he has been tasked with finding the best that can be got within limiting financial constraints.  So again, the onus is upon the developer-cum-interviewer to determine his own goals for this hiring adventure.

Imagine yourself as the technician who has suddenly been handed a copy of a resume and told that there is a candidate waiting in the meeting room.  As you approach the door of the meeting room, hanging slightly ajar, you consider what you will ask of him.  You gain a few more minutes to think this over as you shake hands with the candidate, exchange pleasantries, apologize for not having had time to review his resume and look blankly down at the sheet of buzzwords and dates on the table before you.

Had you more time to prepare in advance, you might have gone to sites such as Ayende’s blog, or techinterviews.com, and picked up some good questions to ask.  On the other hand, the value of these questions is debatable, as it may not be clear that these questions are necessarily a good indicator that the interviewee had actually been doing anything at his last job.  He may have been spending his time browsing these very same sites and preparing his answers by rote.  It is also not clear that understanding these high-level concepts will necessarily make the interviewee good in the role he will eventually be placed in, if hired.

Is understanding how to compile a .NET application with a command line tool necessarily useful in every (or any) real world business development task?  Does knowing how to talk about the observer pattern make him a good candidate for work that does not really involve developing monumental code libraries?  On the other hand, such questions are perhaps a good gauge of the candidate’s level of preparation for the interview, and can be as useful as checking the candidate’s shoes for a good shine to determine how serious he is about the job and what level of commitment he has put into getting ready for it.  And someone who prepares well for an interview will, arguably, also prepare well for his daily job.

You might also have gone to Joel Spolsky’s blog and read The Guerrilla Guide To Interviewing in order to discover that what you are looking for is someone who is smart and gets things done.  Which, come to think of it, is especially helpful if you are looking for superstar developers and have the money to pay them whatever they want.  With such a standard, you can easily distinguish between the people who make the cut and all the other maybe candidates.  On the other hand, in the real world, this may not be an option, and your objective may simply be to distinguish between the better maybe candidates and the less-good maybe candidates.  This task is made all the harder since you are interviewing someone who is already a bit nervous and, maybe, has not even been told yet what he will be doing in the job for which he is interviewing (look through computerjobs.com sometime to see how remarkably vague most job descriptions are).

There are many guidelines available online giving advice on how to identify brilliant developers (but is this really such a difficult task?)  What there is a dearth of is information on how to identify merely good developers — the kind that the rest of us work with on a daily basis and may even be ourselves.  Since this is the real purpose of 99.9% of all technical interviews, to find a merely good candidate, following online advice about how to find great candidates may not be particularly useful, and in fact may even be counter-productive, inspiring a sense of inferiority and persecution in a job candidate that is really undeserved and probably unfair.

Perhaps a better guideline for finding candidates can be found not in how we ought to conduct interviews in an ideal world (with unlimited budgets and unlimited expectations), but in how technical interviews are actually conducted in the real world.  Having done my share of interviewing, watching others interview, and occasionally being interviewed myself, it seems to me that in the wild, technical interviews can be broken down into three distinct categories.

Let me, then, impart my experience, so that you may find the interview technique most appropriate to your needs, if you are on that particular side of the table, or, conversely, so that you may better envision what you are in for, should you happen to be on the other side of the table.  There are three typical styles of technical interviewing which I like to call: 1) Jump Through My Hoops, 2) Guess What I’m Thinking, and 3) Knock This Chip Off My Shoulder.

 

Jump Through My Hoops

[image: tricks]

Jump Through My Hoops is, of course, a technique popularized by Microsoft and later adopted by companies such as Google.  In its classical form, it requires an interviewer to throw his Birkenstock-shod feet over the interview table and fire away with questions that have nothing remotely to do with programming.  Here are a few examples from the archives.  The questions often involve such mundane objects as manhole covers, toothbrushes and car transmissions, but you should feel free to add to this bestiary more philosophical archetypes such as married bachelors, morning stars and evening stars, Cicero and Tully, the author of Waverley, and other priceless gems of the analytic school.  The objective, of course, is not to hire a good car mechanic or sanitation worker, but rather to hire someone with the innate skills to be a good car mechanic or sanitation worker should his IT role ever require it.

Over the years, technical interviewers have expanded on the JTMH with tasks such as writing out classes with pencil and paper, answering technical trivia, designing relational databases on a whiteboard, and plotting out a UML diagram with crayons.  In general, the more accessories required to complete this type of interview, the better.

Some variations of JTMH rise to the level of Jump Through My Fiery Hoops.  One version I was involved with required calling the candidate the day before the job interview and telling him to write a complete software application to specification, which would then be picked apart by a team of architects at the interview itself.  It was a bit of overkill for an entry-level position, but we learned what we needed to out of it.  The most famous JTMFH is what Joel Spolsky calls The Impossible Question, which entails asking a question with no correct answer, and requires the interviewer to frown and shake his head whenever the candidate makes any attempt to answer the question.  This particular test is also sometimes called the Kobayashi Maru, and is purportedly a good indicator of how a candidate will perform under pressure.

 

Guess What I’m Thinking

[image: brain]

Guess What I’m Thinking, or GWIT, is a more open ended interview technique.  It is often adopted by interviewers who find JTMH a bit too constricting.  The goal in GWIT is to get through an interview with the minimum amount of preparation possible.  It often takes the form, “I’m working on such-and-such a project and have run into such-and-such a problem.  How would you solve it?”  The technique is most effective when the job candidate is given very little information about either the purpose of the project or the nature of the problem.  This establishes for the interviewer a clear standard for a successful interview: if the candidate can solve in a few minutes a problem that the interviewer has been working on for weeks, then she obviously deserves the job.

A variation of GWIT which I have participated in requires showing a candidate a long printout and asking her, “What’s wrong with this code?”  The trick is to give the candidate the impression that there are many right answers to this question, when in fact there is only one, the one the interviewer is thinking of.  As the candidate attempts to triangulate on the problem with hopeful answers such as “This code won’t compile,” “There is a bracket missing here,” “There are no code comments,” and “Is there a page missing?” the interviewer can sagely reply “No, that’s not what I’m looking for,” “That’s not what I’m thinking of,” “That’s not what I’m thinking of, either,” “Now you’re really cold” and so on.

This particular test is purportedly a good indicator of how a candidate will perform under pressure.

 

Knock This Chip Off My Shoulder

[image: eveready]

KTCOMS is an interviewing style often adopted by interviewers who not only lack the time and desire to prepare for the interview, but do not in fact have any time for the interview itself.  As the job candidate, you start off in a position of wasting the interviewer’s time, and must improve his opinion of you from there.

The interviewer is usually under a lot of pressure when he enters the interview room.  He has been working 80 hours a week to meet an impossible deadline his manager has set for him.  He is emotionally in a state of both intense technical competence over a narrow area, due to his lifeless existence for the past few months, and great insecurity, as he has not been able to satisfy his management’s demands.

While this interview technique superficially resembles JTMFH, it is actually quite distinct in that, while JTMFH seeks to match the candidate to abstract notions about what a developer ought to know, KTCOMS is grounded in what the interviewer already knows.  His interview style is, consequently, nothing less than a Nietzschean struggle for self-affirmation.  The interviewee is put in the position of having to prove herself superior to the interviewer or else suffer the consequences.

Should you, as the interviewer, want to prepare for KTCOMS, the best thing to do is to start looking up answers to obscure problems that you have encountered in your recent project, and which no normal developer would ever encounter.  These types of questions, along with an attitude that the job candidate should obviously already know the answers, are sure to fluster the interviewee.

As the interviewee, your only goal is to submit to the superiority of the interviewer.  “Lie down” as soon as possible.  Should you feel any umbrage, or desire to actually compete with the interviewer on his own turf, you must crush this instinct.  Once you have submitted to the interviewer (in the wild, dogs generally accomplish this by lying down on the floor with their necks exposed, and the alpha male accepts the submissive gesture by laying its paw upon the submissive animal) he will do one of two things: either he will accept your acquiescence, or he will continue to savage you mercilessly until someone comes in to pull him away.

This particular test is purportedly a good indicator of how a candidate will perform under pressure.

 

Conclusion

[image: moderntimes]

I hope you have found this survey of common interviewing techniques helpful.  While I have presented them as distinct styles of interviewing, this should certainly not discourage you from mixing-and-matching them as needed for your particular interview scenario.  The schematism I presented is not intended as prescriptive advice, but merely as a taxonomy of what is already to be found in most IT environments, from which you may draw as you require.  You may, in fact, already be practicing some of these techniques without even realizing it.

Coding is not a Spectator Sport

[image: matrix]

 

This past Tuesday I attended a Microsoft technology event at a local movie theater.  Ever since the Matrix, movie theaters are apparently the convenient place to go to get technology updates these days.  If you’ve never been to one of these events, they involve a presenter or two with a laptop connected to the largest movie screen in the building.  The presenters then promote new Microsoft offerings by writing code on the big screen while software programmers who have somehow gotten the afternoon off from their bosses watch on.

Jim Wooley presented on LINQ, while a Microsoft employee presented on WCF.  The technologies looked pretty cool, but the presentations were rather dull.  I don’t think this was really the fault of the presenters, though.  The truth is, watching other people code is a bit like watching paint dry, and seems to take longer.  Perhaps this is why pair programming, one of the pillars of extreme programming, has never caught on (failing to document your code, however, another pillar of extreme programming, has been widely adopted and, like Monsieur Jourdain, many developers have found that they’d been doing XP for years without even realizing it). 

Within these constraints — that is that you are basically doing the equivalent of demonstrating how to hammer nails into a board for four hours — the presenters did pretty well, although Mr. Wooley appeared to be somewhat nervous and kept insisting he was doing “Extreme Presenting” whenever he made a coding mistake and the greediest members of the audience would compete with one another to point out his failings.  The Microsoft presenter didn’t encounter any compile errors like Mr. Wooley did, but on the other hand he was following a script and kept referring to it as he typed the code that we were all watching.  Why you should need a script to write uninteresting demo code that ultimately just emits “Hello, world” messages is beyond me, but that’s what he did, and he demonstrated that there could be something even less able to hold the attention than watching someone write code — watching someone write code by rote.

But it is easy to criticize, and in truth I never got to see the presentation on Silverlight given by Shawn Wildermuth (aka “adoguy”), which for all I know may have been much more entertaining and might have undermined my mantra that coding is not a spectator sport, but I’ll never know because I had to skip out on it in order to attend a company dinner.  How I got invited to this dinner I’ll never know, because I wasn’t really very involved in the project that the dinner was intended to celebrate.

I arrived fashionably late by an hour, and as I entered I realized the only seat left was squeezed in between my manager, the CFO of the company and the Senior VP of IT.  This is a dreadful spot to be in, and into that spot I deposited myself.  The problem with being seated next to one’s higher-ups at a social event is that one spends an inordinate amount of time trying to think of something to say that will impress them, while simultaneously trying to avoid saying anything to demonstrate one’s utter unfitness for one’s position.  And here I was next to my boss, who was sitting across from his boss, who was sitting across from his boss.  And as I sat, watching what appeared to be scintillating conversation at the opposite end of the table, my end was completely silent with an air of tension about it.

So I picked up a menu and tried to order.  This was a steak and seafood restaurant, and judging by the prices, approximately twice as good as Longhorn or Outback.  I took the highest priced item, divided the cost by half, and ordered the crawfish pasta with a glass of wine.  Then I sat back to listen to the silence.  Finally someone struck up a conversation about insurance (my industry).  If you want to know how dreadfully dull insurance talk is, it’s a bit like — actually, there is nothing as boring as insurance talk because it is the sine qua non against which all boredom should be judged.  Listening to insurance talk is the sort of thing that makes you want to start cutting yourself for distraction (it’s an old POW trick), and just as I was reaching for the butter knife I found myself telling the jazz story.

The jazz story went over well and seemed to break the ice, so I followed it up with the Berlin mussels story, which was also a hit.  I drank more wine and felt like I was really on a roll.  I’d demonstrated my ability to talk entertainingly around my bosses and as the food arrived I was able to maintain the mood with a jaunty disquisition on men’s fashion and how to select a good hunting dog.  But I grew overconfident.  Over dessert, I decided to play the teacup game, which is a conversation game my friend Conrad at The Varieties had taught me, and it was a disaster.  Apparently I set it up wrong, because a look of disgust formed on the CFO’s face.  My manager tried to save the situation with a distracting story about hygiene, but rather than leave things well enough alone, I decided to continue with the asparagus story, and pretty well ruined the evening.  Oh well.  Bye-bye annual bonus.

Which all goes to show, entertainment is a damnably difficult business.

[image: eddie]

I can probably improve my dinner conversation by reading a bit more P.G. Wodehouse and a bit less of The New Yorker (which is where I got the fateful asparagus story) but how to improve a Microsoft presentation is a much trickier nut to crack.  How much can you realistically do to dress up watching other people code?

Then again, it is amazing what passes for a spectator sport these days, from Lumberjack Olympics to Dancing with the Stars.  Perhaps one of the strangest cultural trends is the popularity of poker as a spectator sport — something that would have seemed unimaginable back in the day.  The whole thing revolves around a handful of people dressed up in odd combinations of wigs, sunglasses and baseball caps to hide their tells playing a card game that depends largely on luck, partly on a grasp of probabilities, and partly on being able to guess what your opponents are guessing about you.  Is there anything in this jumble of crazy costumes, luck and skill that can be used to improve a typical Microsoft presentation?

The truth is, even skill isn’t so important in creating a successful spectator sport.  Take quiz shows, which once were devoted to very tough questions that left the audience wondering how the contestants could know so much (it turned out, of course, that often they were cheating).  Over time, these shows became simpler and simpler, until we ended up with shows like Are You Smarter Than a 5th Grader (which makes you wonder how they find contestants so dumb) and the very successful Wheel of Fortune (in which you are challenged to list all the letters of the alphabet until a hidden message becomes legible).  Demonstrating skill is not the essence of these games.

If you have ever seen National Lampoon’s Vegas Vacation (fourth in the series, but my personal favorite), you will recall the scene where, after losing a large portion of his life savings at a casino, Chevy Chase is taken by his cousin Eddie to a special place with some non-traditional games of luck such as rock-paper-scissors, what-card-am-I-holding, and pick-a-number-between-one-and-ten.  This, it turns out, is actually the premise of one of the most popular American game shows of the year, Deal Or No Deal, hosted by the failed-comedian-actor-turned-gameshow-host Howie Mandel.  The point of this game is to pick a number between one and twenty-six, which has a one in twenty-six chance of being worth a million dollars.  The beauty of the game is that the quick and the slow, the clever and the dim, all have an equal chance of winning.  The game is a great leveler, and the apparent pleasure for the audience is in seeing how the contestants squirm.

I had initially thought that Mr. Wooley’s palpable nervousness detracted from his presentation, but the more I think about it, the more I am convinced that his error was in not being nervous enough.  The problem with the format of Microsoft presentations is that there is not enough at stake.   A presenter may suffer the indignity of having people point out his coding errors on stage or of having bloggers ask why he needs a script to write a simple demo app — but at the end of the day there are no clear stakes, no clear winners, no clear losers.

The secret of the modern spectator sport — and what makes it fascinating to watch — is that it is primarily about moving money around.  Televised poker, Survivor-style Reality shows, and TV game shows are all successful because they deal with large sums of money and give us an opportunity to see what people will do for it.  Perhaps at some low level, it even succeeds at distracting us from what we are obliged to do for money.

And money is the secret ingredient that would liven up these perfunctory Microsoft events.  One could set a timer for each code demonstration, and oblige the presenter to finish his code — making sure it both compiles and passes automated unit tests — in the prescribed period in order to win a set sum of money.  Even better, audience members can be allowed to compete against the official Microsoft presenters for the prize money.  Imagine the excitement this would generate, the unhelpful hints from the audience members to the competitors, the jeering, the side-bets, the tension, the drama, the spectacle.  Imagine how much more enjoyable these events would be.

Microsoft events are not the only places where money could liven things up, either.  What if winning a televised presidential debate could free up additional dollars to presidential candidates?  What if, along with answering policy questions, we threw in geography and world event questions with prize money attached?  Ratings for our presidential debates might even surpass the ratings for Deal Or No Deal.

Academia would also be a wonderful place to use money as a motivator.  Henry Kissinger is reported to have said that academic battles are so vicious because the stakes are so low.  Imagine how much more vicious we could make them if we suddenly raised the stakes, offering cash incentives for crushing intellectual blows against one’s enemies in the pages of the Journal of the History of Philosophy, or a thousand dollars for each undergraduate ego one destroys with a comment on a term paper.  Up till now, of course, academics have always been willing to do this sort of thing gratis, but consider how much more civilized, and how clearer the motives would be, if we simply injected money into these common occurrences.

Pluralitas non est ponenda sine necessitate

“Plurality should not be posited without necessity.”  — William of Occam

[image: Ptolemaic]

I added an extra hour to my commute this morning by taking a shortcut.  The pursuit of shortcuts is a common pastime in Atlanta — and no doubt in most metropolitan areas.  My regular path to work involves taking the Ronald Reagan Highway to Interstate 85, and then the 85 to the 285 until I arrive at the northern perimeter.  Nothing could be simpler, and if it weren’t for all the other cars, it would be a really wonderful drive.  But I can’t help feeling that there is a better way to get to my destination, so I head off on my own through the royal road known as “surface streets”.

Cautionary tales like Little Red Riding Hood should tell us all we need to know about taking the path less traveled, yet that has made no difference to me.   Secret routes along surface streets (shortcuts are always a secret of some kind) generally begin with finding a road that more or less turns the nose of one’s car in the direction of one’s job.  This doesn’t last for long, however.  Instead one begins making various turns, right, left, right, left, in an asymptotic route toward one’s destination. 

There are various rules regarding which turns to make, typically involving avoiding various well-known bottlenecks, such as schools and roadwork, avoiding lights, and of course avoiding left turns.  Colleagues and friends are always anxious to tell me about the secret routes they have discovered.  One drew me complex maps laying out the route she has been refining over the past year, with landmarks where she couldn’t remember the street names, and general impressions about the neighborhoods she drove through when she couldn’t recall any landmarks.  This happened to be the route that made me tardy this morning. 

When I told her what had happened to me, the colleague who had drawn me the map apologized for not telling me about a new side-route off of the main route she had recently found (yes, secret routes beget more secret routes) that would have shaved an additional three minutes off of my drive.  Surface streets are the Ptolemaic epicycles of the modern world.

A friend with whom I sometimes commute has a GPS navigation system on his dashboard, which changes the secret route depending on current road conditions.  This often leads us down narrow residential roads that no one else would dream of taking since they wouldn’t know if the road leads to a dead-end or not — but the GPS system knows, of course.  We are a bit slavish about following the advice of the GPS, even when it tells us to turn the trunk of the car toward work and the nose toward the horizon.  One time we drove along a goat path to Macon, Georgia on the advice of the GPS system in order to avoid an accident on S. North Peachtree Road Blvd. 

All this is made much more difficult, of course, due to the strange space-time characteristics of Atlanta which cause two right turns to always take you back to your starting point and two left turns to always dump you into the parking lot of either a Baptist church or a mall.

Various reasons are offered to explain why the Copernican model of the solar system came to replace the Ptolemaic model, including a growing resentment of the Aristotelian system championed by the Roman Catholic Church, resentment against the Roman Catholic Church itself, and a growing revolutionary humanism that wanted to see the Earth and its inhabitants in motion rather than static.  My favorite, however, is the notion that the principle of parsimony was the deciding factor, and that at a certain point people came to realize that simplicity, rather than complexity, is the true hallmark of scientific explanation.

The Ptolemaic system, which places the Earth at the center of the universe, with the Sun, planets and heavenly sphere revolving around it, was not able to explain the observed motions of the planets satisfactorily.  We know today that this was due both to placing the wrong body at the center of the model and to insisting on the primacy of circular motion rather than the elliptical routes the planets actually take.

In particular, Ptolemy was unable to explain the occasionally observed retrogression of the planets, during which these travelers appear to slow down and then go into reverse during their progression through the sky, without resorting to the artifice of epicycles, or mini circles, which the planets would follow even as they were also following their main circular routes through the sky.  Imagine a ferris wheel on which the chairs do more than hang on their fulcrums; they also do loop-de-loops as they move along with the main wheel.  In Ptolemy’s system, not only would the planets travel along epicycles that traveled on the main planetary paths, but sometimes the epicycles would have their own epicycles, and these would beget additional epicycles.  In the end, Ptolemy required 40 epicycles to explain all the observed motions of the planets.
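
To put the ferris wheel image in slightly more formal terms, here is a minimal parametric sketch of a single deferent-plus-epicycle path, with symbols of my own choosing rather than anything drawn from Ptolemy himself:

\[
x(t) = R\cos(\omega t) + r\cos(\Omega t), \qquad y(t) = R\sin(\omega t) + r\sin(\Omega t)
\]

Here R and ω trace the big wheel (the deferent) while r and Ω trace the small loop riding on it (the epicycle).  Roughly speaking, when rΩ exceeds Rω, the planet’s apparent motion as seen from the center briefly runs backward as it swings through the inner part of its loop, which is the retrogression the whole apparatus was invented to account for.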

Copernicus sought to show that, even if his model did not necessarily exceed the accuracy of Ptolemy’s system, he was nevertheless able to get rid of many of these epicycles simply by positing the Sun at the center of the universe rather than the Earth.  At that moment in history simplicity, rather than accuracy per se, became the guiding principle in science.  It is a principle with far-reaching ramifications.  Rather than the complex systems of Aristotelian philosophy, with various qualifications and commentaries, the goal of science (in my simplified tale) became the pursuit of simple formulas that would capture the mysteries of the universe.  When Galileo wrote that the book of the universe is written in mathematics, what he really meant was that it is written in a very condensed mathematics and is a very short book, brought down to a level that mere humans can at last contain in their finite minds.

The notion of simplicity is germane not only to astronomy, but also to design.  The success of Apple’s iPod is due not to the many features it offers, but rather to the realization that what people want from their devices is only a small set of features done extremely well.  Simplicity is the manner in which we make notions acceptable and conformable to the human mind.  Why is it that one of the key techniques in legal argumentation is to take a complex notion and reframe it in a metaphor or catchphrase that will resonate with the jurors?  The phrase “If the glove doesn’t fit, you must acquit” was, though bad poetry, rather excellent strategy.  “Separate but not equal” has resonated and guided the American conscience for fifty years.  Joe Camel, for whatever inexplicable reasons, has been shown to be an effective instrument of death.  The paths of the mind are myriad and dark.

Taking a fresh look at the surface streets I have been traveling along, I am beginning to suspect that they do not really save me all that much time.  And even if they do shave a few minutes off my drive, I’m not sure the intricacies of my Byzantine journey are worth all the trouble.  So tomorrow I will return to the safe path, the well-known path — along the Ronald Reagan, down the 85, across the top of the 285.  It is simplicity itself.

Yet I know that the draw of surface streets will continue to tug at me on a daily basis, and I fear that in a moment of weakness, while caught behind an SUV or big rig, I may veer off my intended path.  In order to increase the accuracy of his system, even Copernicus was led in the end to add epicycles, and epicycles upon epicycles, to his model of the universe, and by the last pages of On the Revolutions of the Heavenly Spheres found himself juggling a total of 48 epicycles — 8 more than Ptolemy had.

Concerning Facts


In the early chapters of A Study In Scarlet, John H. Watson observes of his friend Sherlock Holmes:



His ignorance was as remarkable as his knowledge.  Of contemporary literature, philosophy and politics he appeared to know next to nothing.  Upon my quoting Thomas Carlyle, he inquired in the naivest way who he might be and what he had done.  My surprise reached a climax, however, when I found incidentally that he was ignorant of the Copernican Theory and of the composition of the Solar System.  That any civilized human being in this nineteenth century should not be aware that the earth travelled round the sun appeared to me to be such an extraordinary fact that I could hardly realize it.



There is a similar strain of incredulity, both within the United States as well as without, when it is observed that a vast number of Americans claim they do not believe in Evolution.  It is a source of such consternation that the beliefs of presidential candidates on this matter are speculated upon and discussed as a sort of key that will reveal the secret heart of these men and women.  Are people who do not believe in Evolution simply of lower intellectual abilities than the rest of us?  Or is it rather that the decision not to believe is an indication of other values, tied together in a web of beliefs, that hinge on certain characteristics which make these people ultimately alien in their thought patterns, radically other in their perception of reality?  Do these people pose a threat to the homogeneity of world view that we take for granted in public discourse?



“You appear to be astonished,” he said, smiling at my expression of surprise.  “Now that I do know it I shall do my best to forget it.”


“To forget it!”


“You see,” he explained, “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things, so that he has a difficulty in laying his hands upon it…. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones.”



If they deny one fact, what else might they deny?  If they hold this set of beliefs, what else might they cleave to?  As a Scientific American article put it,



“Embarrassingly, in the 21st century, in the most scientifically advanced nation the world has ever known, creationists can still persuade politicians, judges and ordinary citizens that evolution is a flawed, poorly supported fantasy. They lobby for creationist ideas such as “intelligent design” to be taught as alternatives to evolution in science classrooms.



“In addition to the theory of evolution, meaning the idea of descent with modification, one may also speak of the fact of evolution.  The NAS defines a fact as “an observation that has been repeatedly confirmed and for all practical purposes is accepted as true …. All sciences frequently rely on indirect evidence. Physicists cannot see subatomic particles directly, for instance, so they verify their existence by watching for telltale tracks that the particles leave in cloud chambers. The absence of direct observation does not make physicists’ conclusions less certain.”


The atheism entry on About.com puts it more baldly:



“Evolutionary theory is the organizing principle for all modern biology – denial of it is like denying relativity in modern physics. The fact of evolution — the fact that allele frequencies change in populations over time — is as undeniable as are the actions of gravity or continental shifts. Despite this, only a third of Americans actually think that evolution is supported by the evidence…. People who don’t “accept” evolution are guilty of very unfortunate ignorance, but it’s probably an understandable ignorance. I wouldn’t be surprised if people were similarly ignorant of other aspects of science. It’s a sign of the great scientific illiteracy of American culture.”



“But the Solar System!” I protested.


“What the deuce is it to me?” he interrupted impatiently: “you say that we go round the sun. If we went round the moon it would not make a pennyworth of difference to me or to my work.”



Alasdair MacIntyre in After Virtue marks this conflation of theory and facts (evolution is a fact, but it is also a theory) as a product of the 17th and 18th centuries.  Empiricism is based on the notion that what we see is what there actually is.  It depends on the belief (or is it fact?) that our experiences are reliable.  Natural science, on the other hand, depends on tools of measurement as the arbiter of what is real.  The telescope tells us more than our unreliable eyes can. 



“…[I]n the measurement of temperature the effect of heat on spirits of alcohol or mercury is given priority over the effect of heat on sunburnt skin or parched throats.”


Just as theory is dependent upon measurement to verify it, so measurement is dependent on theory to justify it.  We require a theory of how heat affects mercury in order to be able to rely on our thermometer.  Yet this is far removed from the notions of common sense and perception which undergird empiricism.



“There is indeed therefore something extraordinary in the coexistence of empiricism and natural science in the same culture, for they represent radically different and incompatible ways of approaching the world.  But in the eighteenth century both could be incorporated and expressed within one and the same world-view.  It follows that that world-view is at its best radically incoherent….”


Out of this notion of the fact, as something both self-evident and obscure at the same time, Max Weber formulated the opposition central to his theorizing, and still central to the modern world view: the fact-value distinction.  Just as a fact has a dual nature, a value also has an inherent ambiguity. It is both a choice and something imposed upon us by society.  In its second form, it is something that can be studied by the social sciences, and consequently can be analyzed to some degree as a fact.  In the first form, it is radically subjective, and as indeterminate as the swerve of Lucretius.


The matter can be framed as something even stranger than that.  In existentialist terms, the choice is something we are always obliged to make, so that the notion of a “factual” value is ultimately false, or worse, inauthentic.  From a scientific viewpoint, on the other hand, choice is illusory, and merely a stand-in for facts we do not yet know.


It is these two terms, as vague as they are, that inform our public discourse.  On the one hand, facts are something we should all agree upon; the substitution of values for facts is considered an act of civil disobedience.  If we can’t agree on the facts, then what can we agree on?  On the other hand, we should not be driven by facts alone.  Is it enough to say that a market economy in the long run is the most efficient way to distribute goods?  What about social justice?  What about idealism?  What about values?



I was on the point of asking him what that work might be, but something in his manner showed me that the question would be an unwelcome one. I pondered over our short conversation, however, and endeavoured to draw my deductions from it. He said that he would acquire no knowledge which did not bear upon his object. Therefore all the knowledge which he possessed was such as would be useful to him.


It is the state-of-mind of Evolution-deniers I find most fascinating.  The more I think about them, the more I long to be one.  They hold the strange position that, while they want to leave room in public science education for creationism — hence the insistence on the public avowal that evolution is “only a theory” — they appear to have no desire to actually displace the teaching of evolutionary biology.  Perhaps this is merely strategic, a camel’s nose under the tent.


But what if we take them at their word?  In that case, they want to find a way to make the fact of evolution and the value of creationism exist side-by-side.  They want to take nothing away from evolution to the extent that it is a practical tool that provides technology for them and extends their lives, but they also want to take nothing away from faith to the extent that it provides a reason to live and a way to go about it.  It is a world-view only possible with the construction of the fact-value distinction.  It is a beautiful attempt to reconcile the irreconcilable, and to make possible a plurality of beliefs that should not co-exist.  Still, it is a world-view that is at its best radically incoherent. 



“I don’t like to talk much with people who always agree with me. It is amusing to coquette with an echo for a little while, but one soon tires of it.”


— Thomas Carlyle