Imposter Syndrome Reconsidered

thomas

The term “imposter syndrome” is a meme that has captured the imagination of the technology sector – and for good reason; we are insecure people. The term may have made its way into the tech world illegitimately, however, and that illegitimate use is what I want to explore in this post, along with the rather obvious conclusion that “insecurity” is the better term in the vast majority of cases, despite its relative un-sexiness.

“Imposter syndrome” was first clinically described in 1978, in my adopted home city of Atlanta, Georgia, by Pauline Rose Clance and Suzanne Ament Imes in their paper The Impostor Phenomenon in High Achieving Women: Dynamics and Therapeutic Intervention. There are several salient features of the way it was originally understood which distinguish it from the way imposter syndrome is used today by software engineers.


First, the diagnosis was originally meant to describe the unique situation of professional women in a male-dominated professional world. In the ’70s we didn’t ask whether men felt like phonies, too. That would have been missing the point.

Second, the phenomenon covered successful women in particular – that is, women whose accomplishments were publicly recognized and not merely a matter of self-esteem. Today when we talk about imposter syndrome, by contrast (for instance in Scott Hanselman’s famous post I’m a Phony), the point is always that unaccomplished people should not feel like imposters because even successful people like X, Y and Z feel this way. In the original article, obviously, the term could only be applied to X, Y and Z. Other people, male or female, who felt unearned confidence without visible accomplishments simply were imposters.


Third, imposter syndrome was a phenomenon tied to early family dynamics and requiring therapy to overcome. Today, no one says that if you have imposter syndrome you should seek professional help to deal with it. Instead, recognizing the condition is meant to be an instantaneous form of self-therapy in its own right, because if everyone has imposter syndrome then no one has imposter syndrome.

In 2019, imposter syndrome has moved well beyond its original, carefully circumscribed bounds, to the point that 70% of professionals acknowledge feeling like imposters from time to time (an often-cited statistic I have been completely unable to source). If this large number is meant to make us feel better about our own insecurities, it should nevertheless concern us that the people who run our banks, fill out our taxes, prescribe our medications, perform surgery on us, cook our food, fix our cars and run our government aren’t always sure they know what they are doing. I question whether this is actually a good thing.


Imposter syndrome has found a home in technical communities. There are several obvious reasons for this. The sort of people who go into tech tend to be beset by social anxiety. At the same time, the most common communication strategy in technology is bombastic and overconfident – sometimes described as bro culture. Together, these two things produce a frequent sense of inadequacy in the face of often meaningless processes like behavioral interviews, faddish architectural trends and cargo-cult adherence to agile processes. And if agile isn’t working for you, then you are doing it wrong, so you should feel inadequate about that, too.


At the same time, the software industry is a young industry with characteristics that make it susceptible to real imposters. It is a complex profession lacking recognized standards of education or certification. After all, even hairdressers have to be licensed. The guy who writes your banking software, on the other hand, doesn’t. For a set of peculiar social and linguistic reasons, moreover, we have no easy way for technologists and business people to talk to each other and judge the relative merits of different approaches to writing software. This leaves unlicensed software developers in the position of needing to police themselves, with mixed success. In philosophy, this is generally known as the crisis of legitimation.

Another problem is that software is very important in our current world. A lot of time and money rides on whether we make good or poor decisions about software solutions. In the case of medical software, lives are literally at stake.


A third characteristic, compounded by the crisis of legitimation and the high stakes involved, is a tendency to love shiny new technology. Whenever things don’t go well with a project, we gravitate toward unknown and untested new platforms, frameworks and processes to fix our problems. This creates an inverted success strategy in which the technology industry prefers mysterious, poorly understood solutions over acquired experience.

All of this leads lots of tech people to normalize talking about things they know nothing about. Whereas in established professions people know not to speak when they have that vertiginous sense of being out of their depth, in technology, if no one contradicts us, we take this as license to continue bullshitting. Fake it till you make it.


Because other people’s money and potentially lives are on the line, however, we should recognize that this is dangerous behavior.

Moreover, the notion that everyone has imposter syndrome fails to recognize that in previous generations that sense of being out of one’s depth was how professionals discovered their weaknesses and moved from journeymen to experts in their craft.

In fact, there is a whole genre of literature known as the Bildungsroman dedicated to exploring this basic aspect of the transition to adulthood. Novels as diverse as Twain’s Huckleberry Finn, Dickens’s Great Expectations, Jane Austen’s Emma, Hurston’s Their Eyes Were Watching God and Salinger’s The Catcher in the Rye cover this difficult journey from naivete and unearned confidence to nuanced understanding and expertise. The most common hallmark of this genre is the recognition that error, misspeaking and insecurity are milestones on our path toward self-mastery. Being insecure is a good thing, not a bad thing, because it drives us to be better.


Insecurity is normal and good. This feature of the human condition tends to be masked and smoothed over by terms like imposter syndrome, which help us to ignore these moments of doubt, and that is a shame.

For this reason, I prefer that people just say they feel unconfident about an idea or approach. When someone tells me this, I can help them evaluate their idea and potentially refine it so we both learn something.


When someone tells me they have imposter syndrome, I don’t really know what I’m supposed to do. They obviously want me to say that everyone has imposter syndrome and so they should believe in themselves. But when the stakes are as high as they are in software engineering, this feels like a cop-out.

If you are feeling like an imposter in tech, maybe that’s something worth exploring more deeply.

Revolution, Evolution, Visual Studio 2010 and Borges


In his preface to The Sublime Object of Ideology Slavoj Zizek writes:

“When a discipline is in crisis, attempts are made to change or supplement its theses within the terms of its basic framework – a procedure one might call ‘Ptolemization’ (since when data poured in which clashed with Ptolemy’s earth-centered astronomy, his partisans introduced additional complications to account for the anomalies).  But the true ‘Copernican’ revolution takes place when, instead of just adding complications and changing minor premises, the basic framework itself undergoes a transformation.  So, when we are dealing with a self-professed ‘scientific revolution’, the question to ask is always: is this truly a Copernican revolution, or merely a Ptolemization of the old paradigm?”

In gaming circles, Zizek’s distinction between Ptolemization and Copernican revolution resembles the frequent debates about whether a new shooter or new graphics engine is merely an ‘evolution’ in the gaming industry or an honest-to-goodness ‘revolution’ – which terms are meant to indicate whether it is a small step for man or a giant leap for gamers.  When used as a measure of magnitude, however, the apposite noun is highly dependent on one’s perspective, and with enough perspective one can easily see any video game as merely a Ptolemization of Japanese arcade games from the 80’s.  (For instance, isn’t CliffyB’s Gears of War franchise — with all the underground battles and monsters jumping out at you — merely a refinement of Namco’s Dig Dug?)

When Zizek writes about Ptolemization and revolutions, he does so with Thomas Kuhn’s 1962 book The Structure of Scientific Revolutions as a backdrop.  Contrary to the popular conception of scientific endeavor as a steady progressive movement, Kuhn proposed that major breakthroughs in science are marked by discontinuities – moments when science simply has to reboot itself.  Professor Kuhn identifies three such ‘paradigm shifts’: the Copernican revolution, the displacement of phlogiston theory with the discovery of oxygen, and the discovery of X-rays.  In each case, according to Kuhn, our worldview changed, and those who came along after the change could no longer understand those who came before.

Thoughts of revolution were much on my mind at the recent Visual Studio 2010 Ultimate event in Atlanta, where I had the opportunity to listen to Peter Provost and David Scruggs of Microsoft talk about the new development tool – and even presented on some of the new features myself.  Peter pointed out that this was the largest overhaul of the IDE since the original release of Visual Studio .NET.  Rewriting major portions of the IDE using WPF is certainly a big deal, but clearly evolutionary.  There are several features that I think of as revolutionary, however, inasmuch as they will either change the way we develop software or, in some cases, are simply unexpected.

  • Intellitrace (aka the Historical Debugger) stands out as the most remarkable breakthrough in Visual Studio 2010.  It is a flight recorder for a live debug session.  Intellitrace basically logs call stack, variable, event and SQL call information (along with a host of other data) during debugging.  This, in turn, allows the developer not only to work forward from a breakpoint, but even to work backwards through the process flow to track down a bug.  A truly outstanding feature is that, on the QA side with a special version of VS, manual tests can be configured to generate an Intellitrace log which can then be uploaded as an attachment to a TFS bug item.  When the developer opens up the new bug item, she will be able to run the Intellitrace log in order to see what was happening on the QA tester’s machine and walk through this recording of the debug session.  For more about Intellitrace, see John Robbins’ blog.
  • As I hinted at above, Microsoft now offers a fourth Visual Studio SKU called the Microsoft Test and Lab Manager (also available as part of Visual Studio 2010 Ultimate).  The key feature in MTLM, for me, is the concept of a Test Case.  A test case is equivalent to a use case, except that there is now tooling built around it (no more writing use cases in Word) and the test case is stored in TFS.  Additionally, there is a special IDE built for running test cases that provides a list of use case steps, each of which can be marked pass/fail as the tester manually works through the test case.  Even better, screenshots of the application can be taken at any time, and a live video recording can be made of the entire manual test along with the Intellitrace log described above.  All of this metadata is attached to the bug item which is entered in TFS along with the specs for the machine the tester is running on and made available to the developer who must eventually track down the bug.  The way this is explained is that testing automation up to this point has only covered 30% of the testing that actually occurs (mostly with automated unit tests).  MTLM covers the remaining 70% by providing tooling around manual testing – which is where most good testing actually happens.  For more info, see the MTLM team blog.
  • Just to round out the testing features, there is also a new unit test template in Visual Studio 2010 called the Coded UI Test.  Creating a new unit test from this template will fire up a wizard that allows the developer to record a manual UI test which gets interpreted as coded steps.  These steps are gen’d into the actual unit test either as UI hooks or XY-coordinate mouse events depending on what is being tested.  Additionally, assertions can be inserted into the test involving UI elements (e.g. text) one expects to see in the app after a series of steps are performed.  The Coded UI Test can then be run like any other unit test through the IDE, or even added to the continuous build process.  Finally, successful use cases verified by a tester can also be gen’d into a Coded UI Test.  This may be more gee-whiz than actually practical, but simply walking through a few of these tests is fascinating and even fun.  For more, see this msdn documentation.  (A rough, hand-written sketch of what such a test looks like follows this list.)
  • Extensibility – Visual Studio now has something called an Extension Manager that lets you browse http://visualstudiogallery.com/ and automatically install add-ins (or more properly, “extensions”).  This only works, of course, if people are creating lots of extensions for VS.  Fortunately, thanks to Peter’s team, a lot of thought has gone into the Visual Studio extensibility and automation model to make developing extensions both easier than it was in VS2008 and much more powerful. Link.


  • Architecture Tools – Code visualization has taken a great step forward in Visual Studio 2010. You can now generate not only class diagrams, but also sequence diagrams, use case diagrams, component diagrams and activity diagrams right from the source code.  Even class diagrams have a number of visualization options that allow you to see how your classes work together, where to find possible bottlenecks, which classes are the most referenced and a host of other perspectives that the sort of people who like staring at class diagrams will love.  The piece I’m really impressed by is the generation of sequence diagrams from source code.  One right-clicks on a particular method to get the generation started.  As I understand it, the historical debugger is actually used behind the scenes to provide flow information that is then analyzed to create the diagram.  I like this for two reasons.  First, I hate actually writing sequence diagrams.  It’s just really hard.  Second, it’s a great diagnostic tool for understanding what the code is doing and, in some cases, what it is doing wrong.
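Since the Coded UI Test mentioned above is the one feature here you actually touch as code, here is roughly what a hand-written version of such a test looks like.  This is only a sketch under assumptions of mine, not recorder output: the application path, window title and control names below are invented for illustration, and a real recorded test would normally go through the generated UIMap class rather than constructing the controls inline.

    using Microsoft.VisualStudio.TestTools.UITesting;             // Coded UI engine
    using Microsoft.VisualStudio.TestTools.UITesting.WinControls; // WinWindow, WinEdit, WinButton
    using Microsoft.VisualStudio.TestTools.UnitTesting;           // TestMethod, Assert

    [CodedUITest]
    public class AdditionSmokeTest
    {
        [TestMethod]
        public void AddingTwoNumbersShowsTheSum()
        {
            // Launch the application under test (the path is a placeholder for this sketch).
            ApplicationUnderTest app =
                ApplicationUnderTest.Launch(@"C:\Demo\SimpleCalc.exe");

            // Find the main window by name, then locate controls beneath it.
            WinWindow mainWindow = new WinWindow(app);
            mainWindow.SearchProperties[WinWindow.PropertyNames.Name] = "Simple Calc";

            WinEdit firstOperand = new WinEdit(mainWindow);
            firstOperand.SearchProperties[WinEdit.PropertyNames.Name] = "First";

            WinEdit secondOperand = new WinEdit(mainWindow);
            secondOperand.SearchProperties[WinEdit.PropertyNames.Name] = "Second";

            WinButton addButton = new WinButton(mainWindow);
            addButton.SearchProperties[WinButton.PropertyNames.Name] = "Add";

            WinEdit result = new WinEdit(mainWindow);
            result.SearchProperties[WinEdit.PropertyNames.Name] = "Result";

            // Drive the UI the way the recorder would: type into the edit boxes, click the button.
            Keyboard.SendKeys(firstOperand, "2");
            Keyboard.SendKeys(secondOperand, "3");
            Mouse.Click(addButton);

            // The kind of assertion the wizard lets you insert against a UI element.
            Assert.AreEqual("5", result.Text);
        }
    }

In practice the recorder and the UIMap generate this plumbing for you; the appeal of the template is that what comes out is an ordinary MSTest class that can be run from the IDE or folded into the continuous build.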

There is a story I borrowed long ago from the Library of Babel and forgot to return – I believe it was by Jorge Luis Borges – about a young revolutionary who leads a small band in an attempt to overthrow the current regime.  As they sneak up on the house of the generalissimo, the revolutionary realizes that the generalissimo looks like an older version of himself, sounds like an older version of himself, in fact is an older version of himself.  Through some strange loop in time, he has come upon his future self – his post-revolutionary self – and sees that he will become what he is attempting to overthrow.

This is the problem with revolutions — revolutions sometimes produce no real change.  Rocky Lhotka raised this specter in a talk he gave at the Atlanta Leading Edge User Group a few months ago; he suggested that even though our tools and methodologies have advanced by leaps and bounds over the past decade, it still takes just as long to write an application today as it did in the year 2000. No doubt we are writing better applications, and arguably better looking applications – but why does it still take so long when the great promise of patterns and tooling has always been that we will be able to get applications to market faster?

This is akin to the Scandal of Philosophy discussed in intellectual circles.  Why, after 2,500 years of philosophizing, are we no closer to answering the basic questions such as What is Virtue?  What is the good life?  What happens to us when we die?

[Abrupt Segue] – Visual Studio 2010, of course, won’t be answering any of these questions, and the resolution of whether this is a revolutionary or an evolutionary change I leave to the reader.  It does promise, however, to make developers more productive and make the task of developing software much more interesting.

What can one do with Silverlight: Part deux

Corey Schuman, Roger Peters and Mason Brown – whom many of you met at the Atlanta Silverlight Firestarter – have been under wraps for several months working on a project for IQ Interactive they repeatedly insisted they couldn’t tell me about.

Now that the beta of My Health Info on MSN has been published, not only do I finally get to see what they have been working on but I also get to share it with you.

My Health Info is an aggregator of sorts for personal medical information – a tool to help the user keep track of her personal medical history.  Unlike other portals that support widgets, however, this one is built using Silverlight.

My Health Info is an interesting alternative to the Ajax-based web portal solutions we typically see and serves as a good starting point for anyone looking to combine the “portal” concept with Silverlight technology.  The Silverlight animations as one navigates through the application are especially nice; they strike the appropriate balance between the attractive and the distracting – between cool and cloying.

The Self-Correcting Process


Science is all about making proposals that can be tested (especially after Karl Popper’s formulation of the Falsifiability Criterion), and then undergoing the experience of having that proposal rejected.  This is the essence of any successful process — not that it eliminates errors altogether, but rather that it is able to make corrections despite these errors so that the target need never shift.

Professor Alain Connes recently gave his opinion of Xian-Jin Li’s proof of the Riemann Hypothesis — a proof which relies in part on Professor Connes’ own work — in a comment on his own blog (by way of Slashdot):

I dont like to be too negative in my comments. Li’s paper is an attempt to prove a variant of the global trace formula of my paper in Selecta. The "proof" is that of Theorem 7.3 page 29 in Li’s paper, but I stopped reading it when I saw that he is extending the test function h from ideles to adeles by 0 outside ideles and then using Fourier transform (see page 31). This cannot work and ideles form a set of measure 0 inside adeles (unlike what happens when one only deals with finitely many places).
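For readers wondering why this objection is fatal, the rough point (my gloss, not Connes’ full argument) is that integration simply does not see a set of measure zero: if the extended test function $h$ vanishes outside a measure-zero subset $E$ of the adeles, then for any character $\psi$,

$$\hat{h}(\psi) = \int h(x)\,\overline{\psi(x)}\,dx = \int_{E} h(x)\,\overline{\psi(x)}\,dx = 0,$$

so extending by zero off the ideles leaves nothing for the Fourier transform to act on.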


Self-correction extends to other professions as well.  Scott Hanselman recently posted to correct an opinion he discovered here, one which he felt required some testing.  Through his own tests, he discovered that nesting a using directive inside a namespace declaration provides no apparent performance benefit over placing it outside the namespace.

This leads him to draw these important lessons:

  • Don’t believe everything you read, even on a Microsoft Blog.
  • Don’t believe this blog, either!
  • Decide for yourself with experiments if you need a tiebreaker!
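For anyone who hasn’t seen the two arrangements being argued about, they differ only in where the using directives sit relative to the namespace block.  A minimal sketch (the namespace and class names here are placeholders of my own, with a second type name so both versions can live in one file for illustration):

    // Style 1: using directives outside the namespace (the project template default).
    using System;
    using System.Text;

    namespace Widgets
    {
        public class Widget
        {
            public string Describe()
            {
                StringBuilder sb = new StringBuilder();
                sb.Append("widget created at ").Append(DateTime.Now);
                return sb.ToString();
            }
        }
    }

    // Style 2: the same directives nested inside the namespace block.
    namespace Widgets
    {
        using System;
        using System.Text;

        public class WidgetReport
        {
            public string Describe()
            {
                StringBuilder sb = new StringBuilder();
                sb.Append("report generated at ").Append(DateTime.Now);
                return sb.ToString();
            }
        }
    }

Using directives are resolved entirely at compile time, so the placement is not a performance question, which squares with Hanselman’s measurements; the one real difference is that, in edge cases, moving them inside the namespace can change which type an unqualified name binds to.  In any case, the lesson stands: test it yourself.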


The sentiment recalls Ralph Waldo Emerson’s memorable words:


There is a time in every man’s education when he arrives at the conviction that envy is ignorance; that imitation is suicide; that he must take himself for better, for worse, as his portion; that though the wide universe is full of good, no kernel of nourishing corn can come to him but through his toil bestowed on that plot of ground which is given to him to till. The power which resides in him is new in nature, and none but he knows what that is which he can do, nor does he know until he has tried.

Trust thyself: every heart vibrates to that iron string. Accept the place the divine providence has found for you, the society of your contemporaries, the connection of events.


A similar sentiment is expressed in Hobbes’ Leviathan, though with a wicked edge:


And as to the faculties of the mind, setting aside the arts grounded upon words, and especially that skill of proceeding upon general and infallible rules, called science, which very few have and but in few things, as being not a native faculty born with us, nor attained, as prudence, while we look after somewhat else, I find yet a greater equality amongst men than that of strength. For prudence is but experience, which equal time equally bestows on all men in those things they equally apply themselves unto. That which may perhaps make such equality incredible is but a vain conceit of one’s own wisdom, which almost all men think they have in a greater degree than the vulgar; that is, than all men but themselves, and a few others, whom by fame, or for concurring with themselves, they approve. For such is the nature of men that howsoever they may acknowledge many others to be more witty, or more eloquent or more learned, yet they will hardly believe there be many so wise as themselves; for they see their own wit at hand, and other men’s at a distance. But this proveth rather that men are in that point equal, than unequal. For there is not ordinarily a greater sign of the equal distribution of anything than that every man is contented with his share. [emphasis mine]


We find it again expressed in Descartes’ Discours de la méthode. Descartes, it might be remembered, occasionally exchanged letters with Hobbes:


Le bon sens est la chose du monde la mieux partagée; car chacun pense en être si bien pourvu, que ceux même qui sont les plus difficiles à contenter en toute autre chose n’ont point coutume d’en désirer plus qu’ils en ont.  (Good sense is, of all things among men, the most equally distributed; for everyone thinks himself so abundantly provided with it that even those who are the hardest to please in everything else do not usually desire more of it than they already have.)


Both Hobbes and Descartes formulate their defense of common sense somewhat ironically.  In a recent post, Steve Yegge takes out the irony (or perhaps takes out the kernel of truth and leaves nothing but the irony) in his argument against Joel Spolsky’s widely acknowledged criteria for a desirable employee: "smart, and gets things done."

According to Yegge, the crux of the problem is this:


Unfortunately, smart is a generic enough concept that pretty much everyone in the world thinks [he’s] smart.

So looking for Smart is a bit problematic, since we aren’t smart enough to distinguish it from B.S. The best we can do is find people who we think are smart because they’re a bit like us.

So, like, what kind of people is this Smart, and Gets Things Done adage actually hiring?


And yet the self-correcting process continues, on the principle that we are all smart enough, collectively, to solve our problems in the aggregate, even if we can’t solve them as individuals.

Presidential candidate Barack Obama recently held a news conference to correct a misunderstanding arising from comments he had made a few hours earlier about his stance on the Iraq War.  According to CNN:


Obama on Thursday denied that he’s shying away from his proposed 16-month phased withdrawal of all combat troops from Iraq, calling it "pure speculation" and adding that his "position has not changed."

However, he told reporters questioning his stance that he will "continue to refine" his policies as warranted.

His comments prompted the Republican National Committee to put out an e-mail saying the presumed Democratic nominee was backing away from his position on withdrawal.

Obama called a second news conference later Thursday to reiterate that he is not changing his position.


This is, of course, merely a blip in the history of self-correction.  A more significant one can be found in Bakhtin’s attempt to interpret the works of Rabelais, and to demonstrate (convincingly) that everyone before him misunderstood the father of Gargantua. 

Bakhtin’s analysis of Rabelais in turn brought to light one of the great discoveries of his career: The Carnival — though a colleague once found an earlier reference to the concept in one of Ernst Cassirer’s works.  Against the notion of a careful and steady self-correcting mechanism in history, Bakhtin introduced the metaphor of the Medieval Carnival:


The essential principle of grotesque realism is degradation, that is, the lowering of all that is high, spiritual, ideal, abstract; it is a transfer to the material level, to the sphere of earth and body in their indissoluble unity.

Degradation and debasement of the higher do not have a formal and relative character in grotesque realism. "Upward" and "downward" have here an absolute and strictly topographical meaning….Earth is an element that devours, swallows up (the grave, the womb) and at the same time an element of birth, of renascence (the maternal breasts)….Degradation digs a bodily grave for a new birth….To degrade an object does not imply merely hurling it into the void of nonexistence, into absolute destruction, but to hurl it down to the reproductive lower stratum, the zone in which conception and a new birth take place.


The Carnival serves to correct inequalities and resentments in society and its subcultures not by setting them upon a surer footing, but rather by affording us an opportunity to air our grievances publicly in a controlled ceremony which allows society and its hierarchical institutions to continue as they are.  It is a release, rather than an adjustment.  A pot party at a rock festival rather than a general strike.

As for the Internet, it is sometimes hard to say what is actually occurring in the back-and-forth that occurs between various blogs.  Have we actually harnessed the wisdom of crowds and created a self-correcting process that responds more rapidly to intellectual propositions, nudging them over a very short time to the correct solution, or have we in fact recreated the Medieval Carnival, a massive gathering of people in one location which breaks down the normal distinctions between wisdom and folly, knowledge and error, competence and foolhardiness? 

Authority as Anti-Pattern


There has been a recent spate of posts about authority in the world of software development, with some prominent software bloggers denying that they are authorities.  They prefer to be thought of as intense amateurs.

I worked backwards to this problematic of authority starting with Jesse Liberty.  Liberty writes reference books on C# and ASP.NET, so he must be an authority, right?  And if he’s not an authority, why should I read his books?  This led to  Scott Hanselman, to Alastair Rankine and finally to Jeff Atwood at CodingHorror.com.

The story, so far, goes like this.  Alastair Rankine posts that Jeff Atwood has jumped the shark on his blog by setting himself up as some sort of authority.  Atwood denies that he is any sort of authority, and tries to cling to his amateur status like a Soviet-era Olympic pole vaulter.  Scott Hanselman chimes in to insist that he is also merely an amateur, and Jesse Liberty (who is currently repackaging himself from being a C# guru to a Silverlight guru) does an h/t to Hanselman’s post.  Hanselman also channels Martin Fowler, saying that he is sure Fowler would also claim amateur status.

Why all this suspicion of authority?

The plot thickens: Jeff Atwood’s apologia, when Rankine accuses him of acting like an authority, is that indeed he is merely "acting".

"It troubles me greatly to hear that people see me as an expert or an authority…

"I suppose it’s also an issue of personal style. To me, writing without a strong voice, writing filled with second guessing and disclaimers, is tedious and difficult to slog through. I go out of my way to write in a strong voice because it’s more effective. But whenever I post in a strong voice, it is also an implied invitation to a discussion, a discussion where I often change my opinion and invariably learn a great deal about the topic at hand. I believe in the principle of strong opinions, weakly held…"

To sum up, Atwood isn’t a real authority, but he plays one on the Internet.

Here’s the flip side to all of this.  Liberty, Hanselman, Atwood, Fowler, et al. have made great contributions to software programming.  They write good stuff, not only in the sense of being entertaining, but also in the sense that they shape the software development "community" and how software developers — from architects down to lowly code monkeys — think about coding and think about the correct way to code.  In any other profession, this is the very definition of "authority".

In literary theory, this is known as authorial angst.  It occurs when an author doesn’t believe in his own project.  He does what he can, and throws it out to the world.  If his work achieves success, he is glad for it, but takes it as a chance windfall, rather than any sort of validation of his own talents.  Ultimately, success is a bit perplexing, since there are so many better authors who never achieved success in their own times, like Celine or Melville.

One of my favorite examples of this occurs early in Jean-Francois Lyotard’s The Postmodern Condition in which he writes that he knows the book will be very successful, if only because of the title and his reputation, but …  The most famous declaration of authorial angst is found in Mark Twain’s notice inserted into The Adventures of Huckleberry Finn:

"Persons attempting to find a motive in this narrative will be prosecuted; persons attempting to find a moral in it will be banished; persons attempting to find a plot in it will be shot."

In Jeff Atwood’s case, the authority angst seems to take the following form: Jeff may talk like an authority, and you may take him for an authority, but he does not consider himself one.  If treating him like an authority helps you, then that’s all well and good.  And if it raises money for him, then that’s all well and good, too.  But don’t use his perceived authority as a way to impugn his character or to discredit him.  He never claimed to be one.  Other people are doing that.

[The French existentialists are responsible for translating Heidegger’s term angst as ennui, by the way, which has a rather different connotation (N is for Neville who died of ennui).  In a French translation class I took in college, we were obliged to try to translate ennui, which I did rather imprecisely as "boredom".  A fellow student translated it as "angst", for which the seminar tutor accused her of tossing the task of translation over the Maginot line.  We finally determined that the term is untranslatable.  Good times.]

The problem these authorities have with authority may be due to the fact that authority is a role.  In Alasdair MacIntyre’s After Virtue, a powerful critique of what he considers to be the predominant ethical philosophy of modern times, Emotivism, MacIntyre argues that the main characters (in Shaftesbury’s sense) of modernity are the Aesthete, the Manager and the Therapist.  The aesthete replaces morals as an end with a love of patterns as an end.  The manager eschews morals for competence.  The therapist overcomes morals by validating our choices, whatever they may be.  These characters are made possible by the notion of expertise, which MacIntyre claims is a relatively modern invention.

"Private corporations similarly justify their activities by referring to their possession of similar resources of competence.  Expertise becomes a commodity for which rival state agencies and rival private corporations compete.  Civil servants and managers alike justify themselves and their claims to authority, power and money by invoking their own competence as scientific managers of social change.  Thus there emerges an ideology which finds its classical form of expression in a pre-existing sociological theory, Weber’s theory of bureaucracy."

To become an authority, one must begin behaving like an authority.  Some tech authors such as Jeffrey Richter and Juval Lowy actually do this very well.  But sacrifices have to be made in order to be an authority, and it may be that this is what the anti-authoritarians of the tech world are rebelling against.  When one becomes an authority, one must begin to behave differently.  One is expected to have a certain realm of competence, and when one acts authoritatively, one imparts this sense of confidence to others: to developers, as well as the managers who must oversee developers and justify their activities to upper management.

Upper management is already always a bit suspicious of the software craft.  They tolerate certain behaviors in their IT staff based on the assumption that they can get things done, and every time a software project fails, they justifiably feel like they are being hoodwinked.  How would they feel about this trust relationship if they found out that one of the figures their developers are holding up as an authority figure is writing this:

"None of us (in software) really knows what we’re doing. Buildings have been built for thousands of years and software has been an art/science for um, significantly less (yes, math has been around longer, but you know.) We just know what’s worked for us in the past."

This resistance to putting on the role of authority is understandable.  Once one puts on the hoary robes required of an authority figure, one can no longer be oneself, or at least not the self one was before.  Patrick O’Brian describes this emotion perfectly as he has Jack Aubrey take command of his first ship in Master and Commander.

"As he rowed back to the shore, pulled by his own boat’s crew in white duck and straw hats with Sophie embroidered on the ribbon, a solemn midshipman silent beside him in the sternsheets, he realized the nature of this feeling.  He was no longer one of ‘us’: he was ‘they’.  Indeed, he was the immediately-present incarnation of ‘them’.  In his tour of the brig he had been surrounded with deference — a respect different in kind from that accorded to a lieutenant, different in kind from that accorded to a fellow human being: it had surrounded him like a glass bell, quite shutting him off from the ship’s company; and on his leaving the Sophie had let out a quiet sigh of relief, the sigh he knew so well: ‘Jehovah is no longer with us.’

"It is the price that has to be paid,’ he reflected."

It is the price to be paid not only in the Royal Navy during the age of wood and canvas, but also in established modern professions such as architecture and medicine.  All doctors wince at recalling the first time they were called "doctor" while they interned.  They do not feel they have the right to wear the title, much less be consulted over a patient’s welfare.  They feel intensely that this is a bit of a sham, and the feeling never completely leaves them.  Throughout their careers, they are asked to make judgments that affect the health, and often even the lives, of their patients — all the time knowing that theirs is a human profession, and that mistakes get made.  Every doctor bears the burden of eventually killing a patient due to a bad diagnosis or a bad prescription or simply through lack of judgment.  Yet bear it they must, because gaining the confidence of the patient is also essential to the patient’s welfare, and the world would likely be a sorrier place if people didn’t trust doctors.

So here’s one possible analysis: the authorities of the software engineering profession need to man up and simply be authorities.  Of course there is bad faith involved in doing so.  Of course there will be criticism that they are frauds.  Of course they will be obliged to give up some of the ways they relate to fellow developers once they do so.  This is true in every profession.  At the same time every profession needs its authorities.  Authority holds a profession together, and it is what distinguishes a profession from mere labor.  The gravitational center of any profession is the notion that there are ways things are done, and there are people who know what those ways are.  Without this perception, any profession will fall apart, and we will indeed be merely playaz taking advantage of middle management and making promises we cannot fulfill.  Expertise, ironically, explains and justifies our failures, because we are able to interpret failure as a lack of this expertise.  We then drive ourselves to be better.  Without the perception that there are authorities out there, muddling and mediocrity become the norm, and we begin to believe that not only can we not do better, but we aren’t even expected to.

This is a traditionalist analysis.  I have another possibility, however, which can only be confirmed through the passage of time.  Perhaps the anti-authoritarian impulse of these crypto-authorities is a revolutionary legacy of the soixante-huitards.  From Guy Sorman’s essay about May ’68, whose fortieth anniversary passed unnoticed:

"What did it mean to be 20 in May ’68? First and foremost, it meant rejecting all forms of authority—teachers, parents, bosses, those who governed, the older generation. Apart from a few personal targets—General Charles de Gaulle and the pope—we directed our recriminations against the abstract principle of authority and those who legitimized it. Political parties, the state (personified by the grandfatherly figure of de Gaulle), the army, the unions, the church, the university: all were put in the dock."

Just because things have been done one way in the past doesn’t mean this is the only way.  Just because authority and professionalism are intertwined in every other profession, and perhaps can no longer be unraveled at this point, doesn’t mean we can’t try to do things differently in a young profession like software engineering.  Is it possible to build a profession around a sense of community, rather than the restraint of authority?

I once read a book of anecdotes about the 60’s, one of which recounts a dispute between two groups of people in the inner city.  The argument is about to come to blows when someone suggests calling the police.  This sobers everyone up, and with cries of "No pigs, no pigs" the disputants resolve their differences amicably.  The spirit that inspired this scene, this spirit of authority as anti-pattern, is no longer so ubiquitous, and one cannot really imagine civil disputes being resolved in such a way anymore.  Still, the notion of a community without authority figures is a seductive one, and it may even be doable within a well-educated community such as the web-based world of software developers.  Perhaps it is worth trying.  The only thing that concerns me is how we are to maintain the confidence of management as we run our social experiment.

Navel Gazing


In Greek, it is called Omphaloskepsis — though the provenance of this term appears to be fairly recent.  Its origins seem to be found in Eastern meditative practices in which the subject concentrates on his navel, the center of his being, in order to shut out all worldly distractions.  The significance of the navel is also found in the origins of Western culture.  Omphalos, literally "navel", was used to describe the terrestrial location from which all life sprang.  It was also the name of the stone found in the temple at Delphi, purported to be the stone which Rhea swaddled in baby clothes and which the titan Cronus swallowed, believing it to be his son Zeus, prophesied to one day overthrow him.  Because of its divine origins, the Omphalos was believed to be a point of contact between the celestial and the terrestrial, and gazing at the Omphalos was equivalent to gazing into the realm of the gods.

It is also a mode of reflection common in times of uncertainty.  The IT industry is at such a point.  Any software professional who went through the .NET bust at the turn of the millennium and through the financial decline following 9/11 has an ingrained sense that changes in the economy can radically alter the face of IT.  The past five years have seen the emergence and adoption of various tools and methodologies, from Agile to Responsibility Driven Design to Domain Driven Design to alt.net, all premised on the notion that things haven’t been going well, and if we all just put our heads together we will find a better way. 

Marshall McLuhan — I believe it was in The Medium is the Massage — tells an anecdote about a factory that brought in consultants to change their internal processes, resulting in a 10% increase in productivity.  A year later, for unspecified reasons, the factory reverted to its previous processes, and surprisingly increased productivity by another 10%.  This suggested to McLuhan that sometimes what one does to bring about change is not significant in itself.  Sometimes it is simply the message of change, rather than any particular implementation, which provides results.

The various methodologies, philosophies and practices of the past few years seem to have made improvements in some software houses, but it is not clear that this is due to the inherent wisdom of the prescribed techniques rather than, as McLuhan might say, the simple message that things aren’t quite right with our industry.  The acolytes of each methodology that comes along initially cite, quite properly, Fred Brooks’s influential articles The Mythical Man-Month and No Silver Bullet.  What they rarely do, once their movements achieve a certain momentum, is revisit those early arguments and evaluate whether they have accomplished what they set out to do.  Did they solve the problems raised by Fred Brooks?  Or do they just move on and "evolve"?

Besides all the process-related changes that have been introduced over the past few years, Redmond is currently caught up in a flurry of activity, and has been releasing not only new versions of its standard development tools, but a slew of alpha and beta frameworks for new technologies such as Windows Presentation Foundation, Silverlight, the Entity Framework, and ASP.NET MVC which threaten to radically alter the playing field, and leave developers in a quandary about whether to become early adopters — risking the investment of lots of energy in technology that may never catch on (Windows DNA and Microsoft’s DHTML come to mind) — or stick with (now) traditional Windows Forms and Web Forms development, which may potentially become obsolete.

We also face the problem of too many senior developers.  There was a time when the .NET bubble drove all companies to promote developers rapidly in order to keep them, a tendency that kept expectations high and that did not really end with the collapse of the bubble.  Along with this, companies set the standard for senior developers, generally the highest level developers can attain short of management, as someone with ten years of development experience, a standard which, given the compact time frames of the IT industry, must have seemed a long way off.  But now we have lots of people in lots of IT departments with 10 years of experience, and they expect to be confirmed as senior developers.  Those that are already senior developers are wondering what their career path is, and management is not particularly forthcoming.

The combination of these factors means the IT population is graying but not necessarily maturing.  Management in turn is looking at ways to outsource their IT labor, under the misapprehension that IT fits into an assembly line labor model rather than a professional model in which system and business knowledge should ideally be preserved in-house.  IT, for most companies, falls on the expense side of the ledger, and one expects to find ways to make it more efficient.  The immature state of the profession, compared to medicine or teaching or even engineering, makes these efficiencies difficult to find.

Add to this an economy headed toward a recession or already in recession, and we have all the necessary ingredients for a period of deep navel gazing.  My feed reader recently picked up three such pieces of introspection, and I suspect this is only the beginning.


Martin Fowler’s Bliki contains a recent entry on SchoolsOfSoftwareDevelopment which deals with the problem of competing methodologies, and the Habermasian problem of agreeing on a common set of criteria upon which they may be judged.

Instead what we see is a situation where there are several schools of software development, each with its own definitions and statements of good practice. As a profession we need to recognize that multiple schools exist, and that their approaches to software development are quite different. Different to the point that what one school considers to be exemplary is considered by other schools to be incompetent. Furthermore, we don’t know which schools are right (in part because we CannotMeasureProductivity) although each school thinks of itself as right, with varying degrees of tolerance for the others.


A Canadian developer, D’Arcy from Winnipeg, wonders what the new Microsoft technology map entails for him.

For the last 7 years we’ve been learning the web form framework, learning the ins and outs of state management (regardless of your opinion if its good or bad), how to manage postbacks, and how to make the web bend and do summersaults based on what our needs were. And here we are again, looking at the next big paradigm shift: Webforms are still around, Rails-like frameworks are the new trend, and we have a vector based Flash-like framework that we can code in .NET. I find it funny that so many are wondering whether web forms will go away because of the new MVC framework Microsoft is developing, and *totally* ignore the bigger threat: being able to develop winform-like applications that will run on the web using Silverlight.


Finally, Shawn Wildermuth, the ADOGuy, has posted an existential rant on the state of the industry and the mercenary mentality on the part of management as well as labor, and the long-term implications of this trend.

In some sense we developers are part of the problem. Quitting your $75K/yr job to be hired back at $75/hr seems like a good deal, but in fact it is not a good deal for either party.  Your loyalty is to the paycheck and when you leave, the domain knowledge goes with you…

At the end of the contract you just move on, forcing you to divest in a personal stake. I miss that part of this business.  I have had more enjoyment about projects that didn’t work than all the mercenary positions I’ve ever held…

So do I have a call for action?  No. I think that domain knowledge is an important idea that both developers and companies need to address, but I don’t have a nice and tidy solution. This is a shift that I think has to happen in software development. Both sides of the table need to look long at the last five to ten years and determine if what we’re doing now is better than before.  Does it only feel better because each line of code is cheaper to produce (mostly a product of better platforms, not better coders). I hope this can change.


Something is in the air.  A sense of uneasiness.  A rising dissatisfaction.  An incipient awareness of mauvaise foi (bad faith).  Or perhaps just a feeling that the world is about to change around us, and we fear being left behind either because we didn’t try hard enough, or because we weren’t paying attention.