Finding the correct metaphor for text-to-speech


A recent release from the Associated Press concerning the Authors Guild’s objections to the Kindle 2’s text-to-speech feature left many computer programmers guffawing.  For those not familiar with text-to-speech technology, however, the humorous implications may not be self-evident, so I will attempt to parse it:

“NEW YORK (AP) — The guild that represents authors is urging writers to be wary of a text-to-speech feature on Amazon.com Inc.’s updated Kindle electronic reading device.

 

“In a memo sent to members Thursday, the guild says the Kindle 2’s “Read to Me” feature “presents a significant challenge to the publishing industry.”

 

“The Kindle can read text in a somewhat stilted electronic voice. But the Authors Guild says the quality figures to “improve rapidly.” And the guild worries that could undermine the market for audio books.”

The quality of text-to-speech depends on the library of phonemes available on the reading device and the algorithms used to put them together.  A simple example: when you call the operator and an automated voice reads a phone number back to you with a completely unnatural intonation, you realize that the pronunciation of each number has been clipped and then taped back together without any sort of context.  That is a case, moreover, where the relationship between vocalization and semantics is one-to-one.  The semantic meaning of the number “1” is always mapped to the sound of someone pronouncing the word “one”.  In the case of general text-to-speech, no one has been sitting with the OED and carefully pronouncing every word for a similar one-to-one mapping.  Instead, the software on the reading device must use an algorithm to guess at the set of phonemes intended by a collection of letters, and then generate the sounds it associates with those phonemes.
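The gap between the two cases can be sketched in a few lines of toy code.  (The phoneme symbols and rule table below are invented for illustration; this is nothing like a real speech engine.)

```python
# Toy contrast between one-to-one clip playback and rule-based phoneme
# guessing. All symbols and rules here are invented for illustration.

# Case 1: the operator's voice. Every digit maps to exactly one
# pre-recorded clip, so "reading" is mere concatenation -- which is
# why the intonation sounds taped together, without context.
DIGIT_CLIPS = {str(d): f"{w}.wav" for d, w in enumerate(
    ["zero", "one", "two", "three", "four",
     "five", "six", "seven", "eight", "nine"])}

def read_phone_number(number: str) -> list[str]:
    """Return the context-free clip sequence for a digit string."""
    return [DIGIT_CLIPS[c] for c in number if c.isdigit()]

# Case 2: arbitrary text. No one has recorded every word, so the
# engine must guess phonemes from spelling with rules like these.
NAIVE_RULES = [("ough", "AH-F"), ("th", "DH"), ("t", "T")]

def naive_phonemes(word: str) -> str:
    """Greedy first-match transcription, blind to English irregularity."""
    out, i = [], 0
    word = word.lower()
    while i < len(word):
        for pattern, phoneme in NAIVE_RULES:
            if word.startswith(pattern, i):
                out.append(phoneme)
                i += len(pattern)
                break
        else:
            i += 1  # no rule matched; skip the letter
    return " ".join(out)

# The same "ough" rule that works for "tough" fails for "though":
# the spelling alone cannot tell the engine which is which.
print(naive_phonemes("tough"))   # T AH-F
print(naive_phonemes("though"))  # DH AH-F (should be closer to DH OW)
```

Both words trip the same rule, but only one pronunciation is right, which is the whole problem in miniature.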

 

The problem of intonation is still there, along with the additional issue of the peculiarities of English spelling.  If you have a GPS system in your car, then you are familiar with the results.  Bear in mind that your GPS system, in turn, is bungling what is actually a very particularized vocabulary.  The books that the Kindle’s “Read to Me” feature will be dealing with have more in common with Borges’s labyrinth than Rand McNally’s road atlas.

 

While text-to-speech technology will indeed improve over time, it won’t be improving in the Kindle 2, which comes with one software bundle that reads in just one way.  I worked on a text-to-speech program a while back (if you have Vista, you can download it here) that combines an Eliza engine with the Vista operating system’s text-to-speech functionality.  One of the things I immediately wanted to do was to switch out voices, and what I quickly found was that I couldn’t get any new ones.  Vista came with a feminine voice with an American accent, and that was about it, unless one wanted to use the feminine voice with a Pidgin-English accent included with the Chinese speech pack.  The only masculine voice Microsoft provided was available for Windows XP, and it wasn’t forward-compatible.

 

It simply isn’t easy to switch out voices, much less switch out speech engines, on a given platform, and seeing that when we buy the Kindle we aren’t paying for a software package but only for the device (which has much less power than a PC running a Microsoft operating system), it can be said with some confidence that the Kindle 2 is never going to be able to read like Morgan Freeman.

 

The Kindle 2’s text-to-speech capabilities, or lack thereof, are not going to undermine the market for audio books any more than public lectures by Stephen Hawking will undermine sales of his books.  They are simply different things.

“It is telling authors and publishers to consider asking Amazon to disable the audio function on e-books it licenses.”

This is what is commonly referred to as the business requirement from hell.  It assumes that something is easy out of a serious misunderstanding of how a given technology actually works.  Text-to-speech technology is not based on anything inherent to the books Amazon is trying to peddle.  It isn’t, for what it’s worth, even associated with metadata about the books Amazon is trying to peddle.  Instead, it is a free-roaming program that will attempt to read any text you feed it.  Rather than a CD sold with the book, it is more like a homunculus living inside your computer, reading everything out loud to you.

 

The proposal from the Authors Guild assumes that something must be taken off of the e-books in order to disable the text-to-speech feature.  In fact, the opposite is true: instructions not to read certain e-books must be added to the e-book metadata, and each Kindle 2 homunculus must in turn be taught to look for those instructions and act accordingly.  This is a non-trivial rewrite of the underlying Kindle software as well as of the thousands of e-book images that Amazon will be selling — nor can the files already living on people’s devices be recalled to add the additional metadata.
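To see why the requirement is additive rather than subtractive, consider a minimal sketch.  (The field name tts_allowed is entirely invented; it stands in for whatever metadata instruction Amazon would actually have to define.)

```python
# Hypothetical sketch of the Authors Guild's requirement. The field
# name "tts_allowed" is invented for illustration only.

def read_aloud(book: dict) -> str:
    """The device-side half: the reader must be taught to look for a
    flag that no existing e-book file carries."""
    metadata = book.get("metadata", {})
    # Files produced before the change lack the key entirely, so the
    # device must also pick a default behavior for everything already sold.
    if metadata.get("tts_allowed", True):
        return "Reading: " + book["text"]
    return "(text-to-speech disabled for this title)"

old_book = {"text": "Call me Ishmael.", "metadata": {}}
new_book = {"text": "Call me Ishmael.",
            "metadata": {"tts_allowed": False}}

print(read_aloud(old_book))  # reads aloud: nothing was ever added to say no
print(read_aloud(new_book))  # silent: flag added AND reader updated
```

Both halves have to ship: new metadata in every e-book image, and new reader software that honors it; neither alone disables anything.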

“Amazon spokesman Drew Herdener said the company has the proper license for the text-to-speech function, which comes from Nuance Communications Inc.”

This is just legalese on Amazon’s part that intentionally misunderstands the Authors Guild’s concerns as well as the legal issues involved.  The Authors Guild isn’t accusing Amazon of not having rights to the text-to-speech software.  It is asking whether using text-to-speech on its members’ works violates pre-existing law.

 

The answer to that, in turn, concerns metaphors, as many legal matters ultimately do.  What metaphor does text-to-speech fall under?  Is it like a CD of a reading of a book, which generates additional income from an author’s labor?  Or is it like hiring Morgan Freeman to read Dianetics to you?  In which case, beyond the price of the physical book, Mr. Freeman should certainly be paid, but the Church of Scientology should not.

Second prize is a set of steak knives

 

According to rumor, Alec Baldwin ad-libbed this scene for the movie adaptation of David Mamet’s Glengarry Glen Ross, although I’ve also heard that Mamet created the monologue especially for Baldwin’s character, Blake, who does not appear in the original Pulitzer Prize-winning play.  A transcription of the monologue can be found here.

Blake strikes me as the ur-Father of Freud’s Civilization and Its Discontents, speaking to the id-driven sales force.  In good times, Blake comes across as a parody of all bad managers with his A-B-C rules — he is Stephen Covey’s evil twin.  In a bad business climate, however, he is the bearer of profound Hobbesian truths, and one feels obliged to internalize him and let him whisper in the back of one’s mind, for his is the voice that drives industry.

I’ve never kindled…


My employer distributes interesting technical gifts every Christmas, and this year it was to be the Kindle.  However, the endorsement of the Kindle by Ms. Oprah Winfrey had apparently caused a run on the digital reader (thank goodness she did not endorse the American banking system), and so, now in mid-February, we are still not in possession of the item.  Fortunately, the Kindle 2 has been announced, and our original order of several hundred Kindles has automatically been upgraded, with a release date at the end of the month.

I enjoy reading and disposing of books, which has led me to be an avid user of the local public library system.  At the same time, I rather enjoy collecting — on a modest budget — titles that I can display on my bookshelves, often unread, sadly.  Browsing my own books, which bring back memories of the periods in my life when I was inclined to peculiar interests — I have a shelf full of Husserl with titles like Ideas I, Ideas V, Ideas VI, etc.! — is a source of great pleasure.

Whenever I open one of my books I am uncertain of what will fall out.  Sometimes it’s orange peels, sometimes the whiff of stale cigarettes from college, sometimes money I thought I had cleverly hidden from robbers.  The other day I opened up a dog-eared paperback copy of Benjamin’s Illuminations and naturally came across his essay on unpacking his books.

My basement is currently filled with crates of Russian books and journals, inherited by my father-in-law from a distant relative, an émigré historian of the Russian imperial family living in Prague.  The books this distant relative chose to collect reveal a thoroughly different world, in which an interest in the newest theories of evolution existed side-by-side with an interest in how best to uplift one’s serfs.

The digital revolution has made many of these books generally available, especially with the ubiquity of one-off printing.  But what mind could gather this particular collection together, with its hidden references made through the course of a life, entangling interests into an ultimate statement of a man’s life pursuit?  Absent the fact that a certain man, our distant relative, found something valuable in each of these books, and that by handling each of them we can to some degree reconstruct his life — would any computer program be able to generate this particular collection based on algorithms of selection and indexing?

While I look forward to receiving my Kindle, I find myself secretly believing that books possess an élan vital that Kindles, by their nature, lack.

My friend Mr. Conrad Roth blogged a few months ago about his less ambivalent attitude toward e-books at the Varieties here and here — which I highly recommend to anyone considering the Kindle.  Another friend, Bill Ryan, blogged about his own very positive reaction to the Kindle a few months back, as well as his attitude towards the most common criticisms of the Kindle. 

Always interested in bringing my friends together, I just wanted to take advantage of an opportunity to link both of them in one page.

Promiscuity and Software: The Thirty-one Percent Solution


There is one big player in the software development world, and her name is Microsoft. Over the years many vendors and startups have attempted to compete against the lumbering giant, and Microsoft has typically resorted to one of two methods for dealing with her rivals. Either she pours near-unlimited money into beating the competition, as Microsoft did with Netscape in the 90’s, or she buys her rival right out. It is the typical build or buy scenario. But with the ALT.NET world, she seems to be taking a third approach.

The premise of the ALT.NET philosophy is that developers who work within the Microsoft .NET domain should still be free to use technologies not sanctioned by Microsoft, and there is even a certain Rome-versus-the-barbarians flavor to this, with Microsoft naturally taking the part of the Arian invaders.  The best solution to a technical problem, it is claimed (and rightly so), need not be one provided by Microsoft, which is ultimately a very small subset of the aggregate body of developers in the world.  Instead, solutions should be driven by the developer community, who know much more about the daily problems encountered by businesses than Microsoft does.  Microsoft in turn may or may not provide the best tools to implement these solutions; when she doesn’t, the developer community may come up with their own, such as NHibernate, NUnit, Ajax, Windsor and RhinoMocks (all free, by the way).

What is interesting about each of these tools is that, when they came out, Microsoft didn’t actually have a competing offering for any of them.  Instead of competing with Microsoft on her own field, the ALT.NET community began by competing with Microsoft in the places where she had no foothold.  Slowly, however, Microsoft came out with competing products for each of these but the last.  MSTest was released about three years ago to compete with NUnit.  ASP.NET AJAX (formerly Atlas, a much cooler name) competes with the various Ajax scripting libraries.  ASP.NET MVC competes with the PHP development world.  The Entity Framework and the Unity Framework were recently released to compete with NHibernate and Windsor, respectively.

Unlike the case with the browser wars of the 90’s, Microsoft’s offerings are not overwhelmingly better.  The reception of the Entity Framework (mostly orchestrated by the ALT.NET community itself, it should be admitted) was an extreme case in point: scores of developers, including a few MVPs (Microsoft’s designation for recognized software community leaders), publicly pilloried the technology in an open letter and petition decrying its shortcomings.

Microsoft, in these cases, is not trying to overwhelm the competition.  She does not throw unlimited resources at the problem.  Instead, she has been throwing limited resources at each of these domains and, in a sense, has accomplished what the ALT.NET world originally claimed was its goal: to introduce a bit of competition into the process and allow developers to select the most fitting solution.

Not too long ago I came across an article that suggested to me a less benign strategy on Microsoft’s part, one that involves ideological purity and software promiscuity. The ALT.NET world, one might be tempted to say, has a bit of a religious aspect to it, and the various discussion board flames concerning ALT.NET that pop up every so often have a distinct religious patina to them.

The relationship of ALT.NET-ers to Microsoft is a bit like the relationship of Evangelicals and Fundamentalists to the world.  We do, after all, have to live in this world, and we don’t always have the ability or the influence to shape it the way we want.  Consequently, compromises must be made, and the only question worth asking is to what extent we must compromise.  The distinction between Evangelicals and Fundamentalists rests squarely on this matter, with Evangelicals believing that some sort of co-existence can be accomplished, while Fundamentalists believe that the cognitive dissonance between their view of the world and the world’s view of itself is too great to be bridged.  For Fundamentalists, the Evangelicals are simply fooling themselves and, worse, opening themselves up to temptation without realizing it.

All this is background to Margaret Talbot’s article in the November New Yorker, “Red Sex, Blue Sex: Why do so many evangelical teen-agers become pregnant?”  Ms. Talbot raises the question of abstinence-only programs, which are widely ridiculed for being unsuccessful.

“Nationwide, according to a 2001 estimate, some two and a half million people have taken a pledge to remain celibate until marriage. Usually, they do so under the auspices of movements such as True Love Waits or the Silver Ring Thing. Sometimes, they make their vows at big rallies featuring Christian pop stars and laser light shows, or at purity balls, where girls in frothy dresses exchange rings with their fathers, who vow to help them remain virgins until the day they marry. More than half of those who take such pledges—which, unlike abstinence-only classes in public schools, are explicitly Christian—end up having sex before marriage, and not usually with their future spouse.”

The programs are not totally unsuccessful.  In general, pledgers delay sex eighteen months longer than non-pledgers.  The real indicator of the success of an abstinence-only program, however, is how popular it becomes: its success is, ironically, inversely proportional to its popularity and ubiquity.

“Bearman and Brückner have also identified a peculiar dilemma: in some schools, if too many teens pledge, the effort basically collapses. Pledgers apparently gather strength from the sense that they are an embattled minority; once their numbers exceed thirty per cent, and proclaimed chastity becomes the norm, that special identity is lost. With such a fragile formula, it’s hard to imagine how educators can ever get it right: once the self-proclaimed virgin clique hits the thirty-one-per-cent mark, suddenly it’s Sodom and Gomorrah.”

The ALT.NET chest of development tools is not widely used, although its proponents are very vocal about the need to use them.  Unit testing, which is a very good practice, has limited actual adherence, though many developers will publicly avow its usefulness.  NHibernate, Windsor and related technologies have an even weaker hold on the mind share of the developer community — much less than thirty percent, I would say — an actuality which belies the volume and vehemence, as well as the exposure, of their proponents.

With the thirty-one percent solution, Microsoft does not have to improve on the ALT.NET technologies and methodologies in order to win.  All she has to do is help the proponents of IoC, mocking and ORMs get to that thirty-one percent adoption level.  She can do this by releasing interesting variations of the ALT.NET community tools, thus gentrifying these tools for the wider Microsoft development community.  Even within the ALT.NET world, as in our world, there are more Evangelicals than Fundamentalists — people who are always willing to try something once.

Microsoft’s post-90’s strategy need no longer be build or buy.  She can take this third approach of simply introducing a bit of software promiscuity, a little temptation here, a little skin there, and pretty soon it’s a technical Sodom and Gomorrah.

New Years and Hard Times

I’ve been busy on multiple projects for the past month.  It has kept me up on recent trends in enterprise architecture as well as hardening my hands in Silverlight development.  The new year brought with it some problems in the dasBlog engine I use for this site, which prompted me to finally upgrade to the latest version.  I am still trying to get various rather interesting comments approved (not sure what the problem is) and updating the theme.  A quick observation, however.  In December I was commuting daily with a colleague to an engagement downtown, and we both noted that although we were hearing dire warnings about the economy, neither of us actually knew anyone who had been laid off.  Come January, everything has changed, and we both know many people either unemployed or facing unemployment.  Worrying about how to keep my blog running smoothly is a pleasant distraction, by comparison, and so I will redouble my efforts in that direction.

Expertise and Authority

In my late teens, I went through a period of wanting to be a diplomat for the State Department.  The prospect of traveling, learning languages, and being an actor in world history appealed to me.  My father, a former case officer in Vietnam, recommended joining the CIA instead.  As he put it to me (and as old Company hands had put it to him), diplomats only ever think they know what is going on in a given country.  It is the spies that really know.

The knowledgeableness — and even competence — of intelligence agencies has been called into question over the past few years, with the inability to track down bin Laden and, before that, the inability to accurately assess Iraq’s nuclear capabilities.  I was surprised to read recently, in an article by John le Carré for The New Yorker, that, contrary to my father’s impression, this may have long been the case.

Discussing his time as an insider in British intelligence, Le Carré writes about his disappointment with the discrepancy between what he had imagined it to be and what it turned out to actually be.  In terms reminiscent of the longings of many career professionals, he describes “fantasizing about a real British secret service, somewhere else, that did everything right that we either did wrong or didn’t do at all.”

As an IT consultant I encounter many technical experts, and am a bit of one myself in some rather abstruse areas.  A common frustration among these experts is that expertise does not always grant them authority, as one would expect in a meritocratic modern corporate society.  Instead, contrarily, they find that corporate authority tends to confer expertise.  The managerial classes inside the corporations we work with are able to dictate technical directions not because they know about these technologies, but simply because they have the authority to do so.

In part this is simply how the system works.  Expertise and authority go together, but not in the ways one would expect.  In the corporate world, authority granted through expertise in one area, say managerial or financial expertise and a track record of success, grants additional and possibly unjustified acknowledgment of expertise in unrelated fields.

Another reason, however, must be the incommunicability of IT expertise.  The field is complicated, and its practitioners are not generally known for their communication abilities.  Whereas the spooks of the intelligence world are not allowed to communicate their detailed knowledge to the layman, the IT professional is simply unable to.  IT professionals speak “geek talk,” while business professionals speak corporate-speak, and translators between these two dialects are few and far between.  Such translations and transitions are possible, however, and the people who can perform them make excellent careers for themselves.

What happens, however, when the whole notion of expertise is called into question?  As Stanley Rosen once said of Nietzsche, what happens when the esoteric becomes exoteric, and what we all know about our own failings and shortcomings as “experts” becomes public knowledge?

Such a thing seems to be happening now with the world economic crisis (I’m waiting for an expert to come along with a better moniker for this downward spiral we all seem to be going through, but for the moment “WEC” seems to be working).  The world economic crisis seems to have occurred because the people who should have known better (bankers, traders, investors and economists) never put a stop to a problem with bad debt, bad credit and bubble markets of worldwide proportions.  As I understand it, all these people knew things weren’t kosher but were hoping to take advantage of market distortions to make huge profits before bailing out at the last moment; like the unfortunate fellow who raced James Dean in Rebel Without a Cause, they all failed to jump when they were supposed to.

Yet they were the experts.  As backup, we have men like Henry Paulson at the Treasury to fix these messes, and he started out sounding authoritative about what needed to be done.  We needed $700 billion to fix the situation, or at least make it not so bad, and the government had a plan, we were told, to do so.  However, the plan has mutated and meandered to the point that it now looks like it is being made up as we go along.  This in itself may not be such a bad thing, but is this meandering the sort of thing experts are supposed to do?

Recently the heads of the automotive industry came to Washington to ask for bailout money and, as we now all know, they didn’t have a plan for how they would spend it.  Is that how experts act?

After the flood, the big discussion now seems to be whether we should try to preserve our laissez-faire system or try to improve and correct it with more regulation.  The sages of Wall Street seem actually to like this solution, which is in itself an admission that they no longer see themselves as experts or, apparently, as even capable of managing their own affairs.  They would prefer that another authority correct their excesses for them, since they no longer trust themselves.

But if there are no experts any longer on Wall Street, where all they had to do was look after their own interests, can we really expect to find one in Washington that will look over all of our interests?  I don’t mean to be a knee-jerk conservative on this matter, but does it make sense that when our clever people make it clear that they are not so clever or competent after all, we must look for someone that much more clever than all of them put together to fix things?  Can that level of expertise even exist?

And so I find myself fantasizing about a different America, indeed a different world, in which they get everything right that we either do wrong or don’t do at all.

Of Zombies VI: The Rise and Fall of the Zombie Threat

By way of Boing Boing, the io9 site has created a chart correlating the production of zombie movies with social upheavals in America and the world.  The inference one is expected to make is that zombie movies are a symptom of unrest, either as a mirror to it or as an attempt at escapist self-therapy.

This narrative follows well known pop-analyses of Invasion of the Body Snatchers as a response to Cold War fears and George A. Romero’s Living Dead movies as a reflection of mind-numbing American consumerism.

What the chart reminds me most of, however, is Alan Wolfe’s book The Rise and Fall of the Soviet Threat which begins with a quote from Ronald Reagan:

“Let’s not delude ourselves,” Ronald Reagan said in 1981.  “The Soviet Union underlies all the unrest that is going on.  If they weren’t engaged in this game of dominoes, there wouldn’t be any hot spots in the world.”  Not since the cold war began, and perhaps never before in American history, has an administration come to power with as insistently hostile an attitude toward the Soviet Union as that of Ronald Reagan.

Wolfe’s thesis is ultimately an inversion of the Zombie = unrest argument.  Wolfe’s book was immensely popular in political science departments in the 80’s because it attempted to demonstrate that the American perception of the Soviet threat moved independently of the “actual” Soviet threat at any point in the fifty-year history of the Cold War.  He argued for the lack of correlation between the perception and the reality of our fears.  Bear in mind that this was a welcome argument at a time when the left still suffered from mauvaise foi following Solzhenitsyn’s publication of the third volume of The Gulag Archipelago.  The ground for this sort of argument may also have been prepared by Thomas Kuhn’s The Structure of Scientific Revolutions (claiming that our perception of science as a continuous progression does not conform to the historical reality), which had achieved a broad cross-discipline appeal by the time Wolfe’s thesis came out.

Today Kuhn is mostly remembered for burdening us with the phrase “paradigm shift,” though in his day he provided the model (perhaps unintentionally) for a broad range of arguments that attempted to demonstrate that reality is not what it seems, but instead is something constructed (a word that naturally entails that we must at some point de-construct it, of course) by social forces.  Reality is a manifold of social constructs.

The zombie literature is interesting, among other things, because it attempts to go the other way.  People who write about the horror genre are always tempted to take what falls clearly within the realm of subjectivity and personal taste and find some sort of correlate for it in the “real” world.  This is true of many fringe interests: sci-fi, pop music, prime-time television, software programming.  We all want to find deep meaning in the things we recognize as subjectively meaningful for us.

The summum bonum would certainly be achieved if each of our personal interests were acknowledged as universally meaningful.  Why shouldn’t we spell words the way each of us prefers to, or use grammar in the way we think best?  Why shouldn’t zombie movies have the same cultural status as the novels of Dostoevsky?  Why shouldn’t Ayn Rand be found next to Rousseau at the local book store?  If a thing has meaning for us, shouldn’t this meaning be reflected in the world?

The notion that the meaning of a name is the thing in the world that it points to (its referent) was originally formulated by J.S. Mill and is known as Mill’s Theory of Names.  It is also sometimes called the “Fido”-Fido theory, for obvious reasons.  Fido, in turn, is a 2006 movie about the efforts of a small band of survivors to reconstruct society along 1950’s lines following a major social upheaval.  The social upheaval, of course, is a zombie epidemic.  Coincidence?  I think not.

Atlanta .NET User Group

I will be presenting on “Working with new ASP.NET features in .NET Framework 3.5 Service Pack 1” at the Atlanta .NET User Group on Monday, October 27th.  Magenic will be providing refreshments, as usual.  The meeting will begin at 6:00 PM at Microsoft’s offices in Alpharetta.  It’s a lot of material to pack into an hour-long presentation, but I think I have a few good strategies for working with that.  The presentation will cover Dynamic Data, Entity Framework, Data Services, the Silverlight Media Control, the Ajax browser history feature built into the Script Manager, and Script Combining.  That gives me about 10 minutes per technology.  Whew.

Microsoft Corporation
1125 Sanctuary Pkwy.
Suite 300
Atlanta, GA 30004

Directions to Microsoft

The problem with computer literacy

A recent post on Boing Boing is titled “Paper and pencil better for the brain than software?”  The gist of the article and its associated links is that software, in guiding us through common tasks, actually makes us dumber.  The Dutch psychologist Christof van Nimwegen has performed studies demonstrating the deleterious effects of being plugged in.  From the post:

“Van Nimwegen says much software turns us into passive beings, subjected to the whims of computers, randomly clicking on icons and menu options. In the long run, this hinders our creativity and memory, he says.”

This certainly sounds right to me, from personal experience.  About a year ago, my company gave away GPS navigation devices as Christmas gifts to all the consultants.  The results are twofold.  On the one hand, we all make our appointments on time now, because we don’t get lost anymore.  On the other, we have all lost our innate sense of direction — that essential skill that got the species through the hunter-gatherer phase of our development.  Without my GPS, I am effectively as blind as a bat without echolocation.

In Charles Stross’s novel about the near future, Accelerando, this experience is taken a step further.  The protagonist Manfred Macx is at one point mugged on the street, and his connection to the Internet, which he carries around with him hooked up to his glasses, is taken away.  As a man of the pre-singularity, however, his personality has become so distributed over search engines and data portals that without this connection he is no longer able to even identify himself.  This is the nightmare of the technologically dependent.

Doctor van Nimwegen’s study recalls Plato’s ambivalence about the art of writing.  His mentor Socrates, it may be remembered, never put anything to writing, which he found inherently untrustworthy; consequently, all we know of Socrates comes by way of his disciple Plato.  Plato, in turn, was a poet who ultimately became distrustful of his own skills and railed against poetry in his philosophical writings.  From the modern viewpoint, however, whatever it is that we lose when we put “living” thoughts down in writing, surely it is only through poetry that we are able to recover and sustain it.

It is through poetic imagery that Plato explains Socrates’s misgivings about letters in the Phaedrus:

At the Egyptian city of Naucratis, there was a famous old god, whose name was Theuth; the bird which is called the Ibis is sacred to him, and he was the inventor of many arts, such as arithmetic and calculation and geometry and astronomy and draughts and dice, but his great discovery was the use of letters. Now in those days the god Thamus was the king of the whole country of Egypt; and he dwelt in that great city of Upper Egypt which the Hellenes call Egyptian Thebes, and the god himself is called by them Ammon. To him came Theuth and showed his inventions, desiring that the other Egyptians might be allowed to have the benefit of them; he enumerated them, and Thamus enquired about their several uses, and praised some of them and censured others, as he approved or disapproved of them. It would take a long time to repeat all that Thamus said to Theuth in praise or blame of the various arts. But when they came to letters, This, said Theuth, will make the Egyptians wiser and give them better memories; it is a specific both for the memory and for the wit. Thamus replied: O most ingenious Theuth, the parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them. And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

We can certainly see aspects of Manfred Macx’s experience of disorientation in our dependence on tools like Google and Wikipedia, which provide us all with the same degree of wisdom, or at least the same show of wisdom.  In tracking down the above quote about Theuth, I had to rely on a vague reminiscence that this memory passage occurred in either the Timaeus or the Phaedrus. I then used my browser search functionality to track down the specific paragraph.  Very handy, that search feature.  But how much more wonderful it would have been had I been able to call that up from my own theater of memory.

My only stand against the steady march of progress (from which I make my living, it should be remembered) is that I turn my spell-checker off when I write emails and articles.  A consulting manager recently chastised me for this practice, which he found error prone and somewhat irresponsible.  To this I could only reply, “but I already know how to spell.”  

I should have added, “…for now.”

Visual Studio 2008 SP1 Toolbox Crash

For the past month or so, whenever I tried to add a control to my Visual Studio Toolbox, the IDE would shut itself down.  My solution, of course, was to avoid adding tools to my Toolbox.

Finally I decided that I needed to do something, ahem, a little smarter.  The specific problem occurred when I tried to use the context menu’s “Choose Items…” option on my Toolbox.  It turns out that PowerCommands (when did I install that?) has a conflict with VS 2008.  This apparently can also mess up the class viewer in Visual Studio 2008.  There are two work-arounds.  The first is to hack a config file for your IDE settings.  That solution can be found here: http://social.msdn.microsoft.com/Forums/en-US/vssetup/thread/e2434065-9921-4861-b914-9cc9d6c55553/

 

Unfortunately this didn’t work for me.  The second work-around is simply to uninstall the Power Commands.  If you go into Add/Remove Programs, it is listed as PowerCommands for Visual Studio 2008.


Oddly enough, Add or Remove Programs says I used this particular tool frequently, even though I’m not exactly sure how I used it.