Hacknot – The Industry

Part of Hacknot: Essays on Software Development

The Crooked Timber of Software Development1

“Out of the crooked timber of humanity no straight thing was ever made.” – Immanuel Kant

Imagine you are a surgeon. You are stitching a wound closed at the end of a major procedure, when you are approached by the chief surgeon, clad in theatre garb. He explains that, in keeping with recently introduced hospital policy, you are required to use a cheaper, generic brand of suture material, rather than the more common (and more expensive) brand you are accustomed to using. He orders you to undo the stitching you’ve done, and redo it using the generic brand.

Now you are in an ethical quandary. You know that the cheaper suture material is not of the same strength and quality as the usual type. You also know that it is a false economy to skimp on sutures, given that the amount of money to be saved is trivial, but the increased risk to the patient is decidedly non-trivial. Further, it seems unconscionable to be miserly on such critical materials. But on the other hand, the chief surgeon wields a lot of political might in the hospital, and it would no doubt be a career-limiting move to ignore his instruction. So what do you do?

As a health professional, there is simply no question. You are legally and ethically obliged to act in the best interests of the patient and there are serious consequences if you fail to do so. The penalties for malpractice include financial, legal and professional remedies. You can be fined, sued for malpractice, or struck from the register and rendered unable to practice. In the light of the system’s support and enforcement of good medical practice, you complete the stitching using the standard suture material, then express your concerns to the chief surgeon. If you don’t get satisfaction, you can take the matter further.

Now let’s examine a similar situation in our own industry. Suppose you are a software developer trying to decide which of a set of competing technologies should be used on a project. One technology stands out as clearly superior to the others in terms of its suitability to the project’s circumstances. Upon hearing of the technology you have chosen, the company’s senior architect informs you that you have made the wrong decision, although he cannot explain why that is the case. The architect directs you to use a technology you know to be inferior, and makes it clear that it would be a career-limiting move to ignore his instruction. Again, what do you do?

My observations over the last twelve years working as a software developer leave me in no doubt as to the probable outcome. You shake your head in disbelief, and use the technology you are instructed to use, knowing that the best interests of both the project and its sponsors have just been seriously compromised. Why is the situation so different from the previous medical scenario? The basic answer is this: medicine is a profession, but software development is merely an occupation.

A Profession Is More Than An Occupation

As it is used in common parlance, the word “profession” refers to the principal occupation by which you earn an income. But this is not its true meaning. A true profession has at least the following characteristics:2

  • Minimum educational requirements – Typically an accredited university degree must be completed.
  • Certification & licensing – Exams are taken to ensure that a minimum level of knowledge has been obtained. These exams target an agreed upon body of knowledge that is considered central to the profession.
  • Legally binding code of ethics – Identifies the behaviors and conduct considered appropriate and acceptable. Failure to observe the code of ethics can result in ejection from professional societies, loss of license, or a malpractice suit.
  • Professional experience – A residency or apprenticeship with an approved organization to gain practical skills.
  • Ongoing education – Practitioners are required to undertake a minimum amount of self-education on a regular basis, so that they maintain awareness of new developments in their field.

Notice that software development has none of these elements. Anyone, regardless of ability, education or experience, can hang out a shingle calling themselves a “software developer,” without challenge. Worse, practitioners may behave in any manner they choose, without restraint. The strict ethical requirements of a medical practitioner aim to ensure that the patient’s needs are best served. In the absence of such requirements, a software developer is free to scheme, manipulate, lie and deceive as suits their purpose – consequently we see a great deal of exactly this type of behavior in the field.

Integrity

The key concept in any profession is that of integrity. It means, quite literally, “unity or wholeness.” A profession maintains its integrity by enforcing standards upon its practitioners, ensuring that those representing the profession offer a minimum standard of competence. Viewed from the perspective of a non-practitioner, the profession therefore offers a consistent promise of a certain standard of work, and creates the public expectation of a certain standard of service.

Individuals, also, are required to act with integrity. It is not acceptable for them to say one thing and do another e.g. to promise to always act in the best interests of a patient or client, but then let personal interests govern their action. What is said and what is done must be consistent.

This cultural focus upon integrity is entirely missing from the field of software development, and demonstrates the vast gap in maturity that exists between our occupation and the true professions. If we are ever to make a profession of software development, to move beyond the currently fractured and uncoordinated group of individuals motivated by self-interest, with little or no concern for the reputation or collective future of their occupation, then some fundamental changes in attitude must occur. We must begin to value both personal and professional integrity and demonstrate a strong and unwavering commitment to it in our daily professional lives.

Think about it – what are your ethical and professional obligations in your current position? Are you fulfilling them? Look to ethical codes such as those offered by the ACM3 and the IEEE-CS4, even if you are not a member of these societies. Although not legally binding, they at least demonstrate the sorts of concerns you should be championing in your everyday work. You will find that their central focus is upon always acting with integrity; always representing the best interests of the client. Specifically, you will note that the following behaviors, as commonplace as they are amongst developers, are antithetical to ethical conduct:

  • Choosing technologies and solutions because they are “cool”, have novelty value or look good on your CV.
  • “Going with the flow” or “keeping a low profile”; i.e. remaining deliberately distant from or ignorant of issues which affect the quality of service delivered to the customer. You must be willing to voice unpopular facts or express controversial opinions if you have reason to believe that not doing so will compromise the service delivered to a client.
  • Distancing yourself from others who are attempting to maintain a minimum standard of work or conduct, so as to avoid any political risk yourself. If you are aware of a challenge to the ethical standards of your profession, you are obliged to defend those standards, even if you have not been directly involved.
  • Letting unethical conduct go unchallenged. To observe unethical conduct and say nothing is to offer a tacit endorsement of that behavior. Saying “It’s not my problem,” “It’s none of my business” or “I’m glad that didn’t happen to me” is not acceptable. Next time, it may be happening to you.

There’s no denying that acting ethically can have a personal cost, perhaps quite a profound one. It would be naive to think that attempts to contradict or combat unethical behavior are not likely to result in some attempt at retribution. Even in professions with legally binding codes of ethics, this is the case. In software development, where it is a moral free-for-all, it is particularly so. Raising ethical objections, voicing unpopular facts, standing up for the client’s rights where they conflict with some manager’s self-interest – all of these actions bring a very real risk of retribution from offended parties, which may include losing your job. Because ours is not a true profession, there is no protection – legal or otherwise – for a developer who speaks the truth and in so doing defies authority. Whoever is most adept at bullying, intimidation and political manipulation is likely to hold sway.

I suspect that more than a few of the incidents we have recently seen involving the termination of bloggers for alleged indiscretions on their blogs have been excuses for employers to remove inconvenient employees who threaten the status quo. Although superficially plausible reasons may be offered for such action, they may well be nothing more than an excuse for retribution against the employee for challenges they have made to the employer’s unethical behavior.

There Was A Crooked Man

In assessing the personal cost of ethical action, it helps to maintain a broader perspective. In our industry, jobs come and go like the seasons. Due to the prevalence of contract work, many software developers will likely have dozens of employers in their careers. Rather than viewing our work as a series of unrelated engagements, I believe we need to view our efforts as part of a larger process – the maturation of an occupation into a true profession. Seen from this angle, the significance of any particular job (or the loss of it) is lessened and the importance of the over-arching principles becomes more obvious.

As they say, a chain is only as strong as its weakest link. The strength of our reputation and worth as a burgeoning profession is therefore dependent upon the strength of each individual’s commitment to maintaining a high personal standard of ethics. The integrity of the whole is contingent upon the integrity of the parts.

Some years ago I read the following statement, which for its truth and boldness has stuck with me ever since:

The best managers are the ones that come into work each day prepared to lose their job.

In other words, unless you remain willing to walk away from a job, the threat of termination can always be used against you, and used as leverage to encourage or excuse unethical behavior. The same reasoning applies to developers as it does to managers. The same ethical obligations and the same obstacles to fulfilling them are present.

In 1985, David Parnas resigned his position as a member of a U.S. Defense Department Committee advising on the Strategic Defense Initiative (SDI). He felt, with good reason, that the goals set for the SDI were entirely unachievable, and that the public was being misled about the program’s potential. Others urged him to continue, and continued with it themselves, even though they shared his beliefs about the feasibility of the program’s fundamental objectives. They reasoned that, even though the desired outcomes wouldn’t be achieved, there was good funding to be had that might be put into ostensibly “contributing efforts”, and the opportunity was too good to miss. When Parnas resigned, he wrote a series of eight papers5 outlining both his reasons for doing so, and the fundamental issues about software professionalism that the SDI issue had brought to light. Unfortunately, we have very few men of his quality in our occupation.

Parnas summarized a professional’s responsibility in three statements, which I conclude with here:

  • I am responsible for my own actions and cannot rely on any external authority to make my decisions for me.
  • I cannot ignore ethical and moral issues. I must devote some of my energy to deciding whether the task that I have been given is of benefit to society.
  • I must make sure that I am solving the real problem, not simply providing short-term satisfaction to my supervisor.

From James Dean to J2EE: The Genesis of Cool6

It has always been the purview of the young to define what “cool” means to their generation. In the fifties, cool was epitomized by James Dean. Teenagers rushed to emulate him in looks and manner. Cigarettes, leather jackets, sports cars and a crushing sense of parent-induced angst were the hallmarks by which these youth declared both their distance from the previous generation and unity within their own.

In the sixties, the hippy generation stepped off the path to maturity their parents had planned out for them, put flowers in their hair and went on a drug-assisted exploration of their own psyche to the soundtrack of Jimi Hendrix and The Jefferson Airplane. The meaning of cool became a little more diffuse. As an adjective of laid-back approval, it still carried the anti-authoritarian flavor of the previous decade, but was broad enough to include almost anything of an unconventional nature.

In the seventies, bigger was better. Wide collars and ties, flared trousers and ostentatious jewelry were the adornments of the young and cool. Disco was king and the Bee Gees were the kings of disco. The definition of cool could only be broadened to accommodate the crass symbols of consumerism that the cultural elite filled their homes and their wardrobes with. For the first time, cool was as much about earning capacity as it was about rebellion.

In the eighties, consumerism and technology joined forces to hijack cool from the hands of the kids. It became an adjunct to the management buzzwords and marketing neologisms that littered the corporate lingo. The electronics companies created synthesizers that dominated the music of the decade, and sold them back to the youth who were wondering what had become of cool. “Behold”, they said, “this is technology and verily, it is cool.”

In the nineties, cool went through its final stage of deconstruction to become the meaningless mouth-noise that we have today. With the unexpected rise in popularity of the Web and its accompanying soap bubble of financial optimism, cool became the adjective of choice for the technically literate. In keeping with their unfettered enthusiasm and cavalier attitude, dot-com entrepreneurs everywhere looked up only briefly from their Palm Pilots to heap uncritical praise upon every new technology and gadget that passed across their expansive desks.

The Future Of Cool

This decade, “cool” means nothing. It is a label applied so ubiquitously and indiscriminately that it could compete with “nice” for the title of “Most Ineffectual Adjective in Common Usage.” The retro punk rockers with their double basses and Gibson Epiphones think they have it. The Feng Shui consultants and the new age drop-outs think it has something to do with Atlantis. The advertising executives and middle managers know that they had it once, but then it slipped between the cushions of their leather lounges along with their ridiculously miniature mobile phones.

But most laughably of all, we the techies think that we have it. Surprised to find that technology is now cool, we feel justified in labeling the geekiest of our enthusiasms with this meaningless endorsement. Pop quiz: Which of the following are cool?

  • Open source
  • Linux
  • Visual Basic
  • Windows XP
  • Extreme Programming
  • MP3
  • Quake
  • J2EE
  • .NET

There are no correct answers to this quiz, and your response means nothing – unless you voice it with breathless enthusiasm while gazing in a shop window.

In the coming year, cool will lead us everywhere and nowhere, with the following predictable detours:

  • Many software projects will be initiated by software developers with a cool hammer looking for some business-case nails to justify their expenditure. Projects thus founded will fail, but not before the developers have had a nice time playing with their new hammers and increasing their market appeal to future employers in search of the latest coolness.
  • Many vendors will grunt out another selection of half-baked products that promise a world of coolness but deliver instead a slew of bugs, patches and service packs. The products these same vendors previously marketed as cool will be mysteriously absent from their catalog, although many of the newer products will bear an uncanny resemblance to their predecessors.
  • The shelves of technical book stores will overflow with 500 page tomes promising a quick path to mastery of these latest technologies. The speed with which these books are issued and revised will equal or exceed the release rate of the technologies they describe.
  • Many legacy systems that have been providing satisfactory service for years will be decommissioned and replaced with systems based on newer and cooler technologies. These replacements will be less reliable than their predecessors.
  • Technology selection based on hard-headed empiricism will be viewed as impossibly expensive and time consuming, and abandoned in favor of emotive decision making based on marketing promises and perceived tech appeal. We will be too busy climbing the learning curves of the latest software development gear to have any time remaining in which to quantify the costs and benefits of doing so. Hamsters … exercise wheels … same old story.

The overall success and failure rates of software projects will remain much as they were last decade, and everyone will bemoan the sad state of software development.

IEEE Software Endorses Plagiarism7

plagiarize – take (the work or an idea of someone else) and pass it off as one’s own. – The New Oxford Dictionary of English

Ours is an occupation obsessed with invention and novelty. Every week it seems that some new technology or development technique arrives, heralded by a fanfare of hype and a litany of neologisms. So keen are we to exploit the community’s enthusiasm for newness that we will even take old ideas and rebadge them, offering them up to our colleagues as if they were original.

Every time I see such reinvention, I feel a certain discomfort. There seems to me something fundamentally wrong with presenting work as being entirely your own, when it in fact borrows, duplicates or derives from the work of others.

In science, precedence counts for a great deal and authors are usually generous and fastidious in providing correct attribution and acknowledgement of former discoveries which their own work has benefited from. Indeed, a broad indication of the significance of a paper is the number of subsequent citations that the work receives. In software development, there appears to be rather less respect for the contributions that others make; perhaps even a certain contempt for prior art.

Fail Fast

A particularly egregious example of this disrespect for precedence appeared in the Sept/Oct 2004 issue of IEEE Software, in an article in the Design section by Jim Shore called Fail Fast8. The section editor is Martin Fowler.

Shore describes “a simple technique that will dramatically reduce the number of bugs in your software”. His technique, which he considers “nonintuitive” is to write your code so that it fails “immediately and visibly.” This is achieved by putting assertions at the beginning of each method, that check the validity of the values passed to the method’s arguments, throwing a run-time exception if invalid values are encountered.

For example, if you write a method for finding the positive square root of a non-negative argument, you make the expectation of “non-negativity” explicit at the beginning of the method, like this:

  public double squareRoot(double value) {
    // Fail fast: check the precondition before doing any work, and
    // report invalid input immediately and visibly.
    if (value < 0.0) {
      throw new IllegalArgumentException("value must be non-negative: " + value);
    }
    return Math.sqrt(value);
  }

This technique is the antithesis of defensive programming, which would encourage us to make the method as tolerant of unexpected input as possible.
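
By way of contrast, a defensive version of the same method might quietly absorb the bad input rather than reporting it. The sketch below is hypothetical (it is not taken from Shore’s article) and serves only to illustrate the difference:

  // Hypothetical defensive-programming version, for contrast only:
  // invalid input is silently "repaired" rather than reported, so the
  // caller's bug may surface much later, far from its origin.
  public double squareRootDefensive(double value) {
    if (value < 0.0) {
      value = 0.0;  // absorb the error instead of failing fast
    }
    return Math.sqrt(value);
  }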

Shore then goes to some lengths to enumerate the strengths of this technique, such as:

  • When failure occurs, the result is a stack trace that leads directly to the source of error. Code that doesn’t fail-fast can sometimes propagate errors to other portions of the call hierarchy, finally to fail in a location quite distant from the original point of error.
  • Reduced use or elimination of a debugger; the messages from the assertion failures are sufficient to localize the error.
  • Logging of assertion failures provides excellent debugging information for maintenance programmers who later diagnose a production failure from log files.
  • Reduced time and cost of debugging.

There are no citations anywhere within the article, nor does it list any references. The author and, by extension, the editor are apparently content to have you believe that this concept is new and original.

Design By Contract

You may well be familiar with the term Design by Contract (DBC). The term was coined by Bertrand Meyer, and a full exposition of it may be found in Chapter 11 of his excellent text Object Oriented Software Construction, 2nd Edition. Shore’s Fail Fast technique is nothing more than a re-naming of a subset of the concepts within DBC. In short, “Fail Fast” is entirely derivative in nature.

For those who have not previously encountered it, DBC is a technique for specifying the relationship between a class and its clients as a formal agreement9 – a contract. A contract is expressed as an assertion of some boolean conditional statement. When the condition is false, the contract is said to fail; which results in the throwing of a runtime exception.

Broadly speaking there are three types of contracts – preconditions, postconditions and invariants. The Fail Fast technique relies only upon preconditions – assertions placed at the beginning of a method that specify the conditions the method assumes to be true. The topic of DBC is fairly involved, particularly with regard to the way that contracts accumulate across inheritance relationships. Meyer’s exegesis of DBC is vastly superior to the limited discussion of preconditions (under the new name “Fail Fast”) given by Shore.
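
Meyer expresses contracts directly in Eiffel, where the language checks them automatically. In Java the same ideas can only be approximated; the sketch below is a hypothetical illustration (the Account class and its names are invented) using assert statements, which the JVM checks only when run with the -ea flag. It shows a precondition, a postcondition and a class invariant:

  // Hypothetical Java approximation of Design by Contract, using assert
  // statements to express a precondition, a postcondition and an invariant.
  public class Account {
    private long balanceInCents;  // class invariant: balanceInCents >= 0

    public void withdraw(long amountInCents) {
      // Preconditions: what the caller must guarantee.
      assert amountInCents > 0 : "precondition: amount must be positive";
      assert amountInCents <= balanceInCents : "precondition: sufficient funds";
      long oldBalance = balanceInCents;

      balanceInCents -= amountInCents;

      // Postcondition and invariant: what the method guarantees in return.
      assert balanceInCents == oldBalance - amountInCents : "postcondition: balance reduced by amount";
      assert balanceInCents >= 0 : "invariant: balance never negative";
    }
  }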

Not only does Shore co-opt the work of others, he combines it with bad advice regarding the general use of assertions. Shore claims:

When writing a method, avoid writing assertions for problems in the method itself. Tests, particularly test-driven development, are a better way of ensuring the correctness of individual methods.

This is the purest nonsense. Assertions are an excellent way of documenting the assumed state of a method mid-way through its operation, and are helpful to anyone reading or debugging the method body. This was first pointed out by Alan Turing back in 1950:

How can one check a large routine in the sense that it’s right? In order that the man who checks may not have too difficult a task, the programmer should make a number of definite assertions which can be checked individually, and from which the correctness of the whole program easily follows.10
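
A small, hypothetical Java fragment illustrates the point: an assertion placed part-way through a method documents the state that the remaining code relies upon, for the benefit of readers and debuggers alike.

  // Hypothetical example of a mid-method assertion documenting an
  // intermediate assumption, in the spirit of Turing's remark above.
  public int indexOfLargest(int[] values) {
    if (values == null || values.length == 0) {
      throw new IllegalArgumentException("values must be non-empty");
    }
    int best = 0;
    for (int i = 1; i < values.length; i++) {
      if (values[i] > values[best]) {
        best = i;
      }
      // At this point, best always indexes the largest of values[0..i].
      assert values[best] >= values[i] : "loop state violated";
    }
    return best;
  }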

In contrast to Shore, Meyer is generous in his acknowledgement of predecessors and contributors to DBC itself. Section 11.1 of his text has an entire page of “Bibliographical Notes” in which he acknowledges the work of Turing, Floyd, Hoare, Dijkstra, Mills and many others. Indeed, he has delivered an entire presentation on the conceptual history of DBC prior to his own involvement.11

Giving Credit Where Credit Is Due

Misattribution and inattention to precedence such as Shore’s harm our profession in several ways:

  • It is professionally discourteous in that it denies those who develop and originate work their proper credit.
  • It discourages modern readers from exploring the history of the concepts they are presented with, thereby denying them an opportunity to deepen their knowledge through exploration of the prior art. Meyer has already expounded the benefits of “fail fast” versus “defensive programming” at length. If Shore’s article had appropriate citations, readers would be directed towards this better and more detailed explanation, and would realize that the concept can be taken much, much further through postconditions, invariants, and inheritance of contracts.
  • It garners false credit for those who ignore the precedence of others’ work, encouraging others to do the same – diverting energy into the re-labeling of already known concepts that could otherwise be directed into new areas.
  • It creates confusion amongst the readership and obfuscates links with the existing body of knowledge. Central to any epistemological effort is a consistent naming scheme, so that links between new discoveries and existing concepts can be identified. Renaming makes it difficult, particularly for those new to the field, to distinguish new from old concepts.

Conclusion

To have work published in a peer reviewed journal is a significant achievement. It means that one’s work has been found to make a worthwhile contribution to the literature, and to be of a high professional standard. By these criteria, the Fail Fast article by Jim Shore in the Sept/Oct 2004 issue of IEEE Software should not have been published. The material it presents as being new and original is a superficial (and flawed) restatement of earlier work by Meyer, Hoare and others. It should be cause for concern for us all that a high profile, professional journal should publish work that is derivative and misrepresentative. Those who reviewed Shore’s article prior to publication, and the editor/s who approved its publication deserve the harshest admonishment for effectively endorsing plagiarism.

Early Adopters or Trend Surfers?12

Q: What are the most exciting/promising software engineering ideas or techniques on the horizon?

A: I don’t think that the most promising ideas are on the horizon. They are already here and have been here for years but are not being used properly.

– Interview with David L Parnas

Many software developers pride themselves on being up to date with the latest software technologies. They live by the credo “beta is better” and willingly identify themselves as early adopters. The term “early adopter” comes from the seminal work on technology transfer, Diffusion of Innovations, by Everett M. Rogers (1962). He categorizes the users of a new innovation as being innovators, early adopters, early majority, late majority and laggards. Innovators and early adopters constitute about 16% of the user population.

Amongst the software development population, that percentage must be significantly higher, given the technological orientation of most practitioners. Consider the following selection of recent technologies and their respective dates of introduction. Observe how quickly these technologies have become mainstream. In about five years a technology can go from unknown to commonplace. In ten, it is passé.


Technology                 Introduced
-------------------------  ----------
JSP                        1998
EJB                        1998
.NET                       2002
Java                       1995
J2EE                       1999
SOAP                       2000
Microsoft Windows          1993
GUI                        1974

Now consider the following software development practices:


Practice                   First Noted
-------------------------  -----------
Source code control        1980
Inspections                1976
Branch coverage testing    1979
Software Metrics           1977
Throwaway UI prototyping   1975
Information Hiding         1972
Risk Management            1981

Why is it that after, in some cases, 20 years’ worth of successful application in the field, often accompanied by repeated empirical verification of their worth, many of these practices are yet to be considered even by the early majority?

Adopting new technologies is easy, but changing work practices is hard. Technologies are “out there” but work practices are distinctly personal. And new technologies promise immediate gratification by way of satisfying the hunger for novelty.

Reuse is Dead. Long Live Reuse.13

Reuse is one of the great broken promises of OO. The literature is full of empirical and anecdotal evidence to this effect. The failure to realize any significant benefit from reuse is variously ascribed to technical, organizational and people factors. Observation of the habits and beliefs of my fellow software engineers over many years leads me to believe that it is the latter which poses the principal obstacle to meaningful reuse, and which ultimately renders it unachievable in all but the most trivial of cases.

Hubris is a common trait amongst software developers and brings with it a distrust and disrespect for the work of others. This “not invented here” attitude, as it is commonly known, leads developers to reinvent solutions to problems already solved by others, driven by the conviction that the work of anonymous developers must be of dubious quality and value. Some simply prefer “the devil you know” - figuring that whatever the shortcomings of a solution they may write themselves, their familiarity with it will sufficiently reduce the cost of subsequent maintenance to justify the cost of duplicating the development effort. Evidence of this drive to reinvention is everywhere. Indeed, the collective output of the open source movement is proof of the “I can do better” philosophy in action.

Consider what it is about software development that attracts people to it. In part, it is the satisfaction that comes from solving technical problems. In part, it is attraction to the novelty of new technologies. In part, it is the thrill of creating something that has a life independent of its original author. Reuse denies the developer all of these attributes of job satisfaction. The technical problem is already solved, the new technology has already been mastered (by somebody else), and the act of creation has already occurred. On the whole, the act of reuse is equivalent to surrendering the most satisfying aspects of one’s job.

So what degree of reuse can coexist with such a mindset? Certainly we may abandon hope for any broad reuse such as that promised by frameworks. Instead, we may expect frameworks themselves to proliferate like flowers in spring. The greater the scope of the potential reuse, the greater the opportunity to disguise technology lust and hubris as genuine concerns over scalability or applicability.

I believe the only reuse likely to be actually realized is in the form of limited utility libraries and perhaps small GUI components. If the problem the potentially reusable item solves is seen as technically novel or intriguing, then reinvention will result. If there is no entertainment, novelty or career value in reinvention then begrudging reuse may result simply as a way of avoiding “the boring stuff.” But as long as developers are willing to use their employer’s time and money to satisfy their personal ambitions; and as long as they continue to believe they hold a personal monopoly on reliable implementation, then the cost advantage of reuse will remain a gift that we are too proud to accept.

All Aboard the Gravy Train14

“Hype is the plague upon the house of software.” – Robert Glass

It is interesting to watch the software development landscape change underfoot. As with many geographies, the tremors and shifts which at first appear random, when more closely examined, reveal an underlying order and structure that is more familiar and less mysterious.

Recently, some of the loudest rumblings have been coming from that quarter whose current fascination is the scripting language Ruby, and its web application framework Rails. Think back to the last cycle of hype you saw in our industry – perhaps the Extreme Programming craze – and you’ll recognize many of the phenomena from that little reality excursion now reoccurring in the context of Rubyism. There are wild and unverifiable claims of improved productivity amidst the breathless ravings of fan boys declaring how cool it all is. There are comparisons against precursor technologies, highlighting faults that are apparently obvious in hindsight, but were unimportant while those technologies were in fashion. And above all there is the frenetic scrambling of the “me too” crowd, rushing to see what the fuss is all about, desperately afraid that the bandwagon will pass them by, leaving them stranded in Dullsville, where nothing is cool and unemployment is at a record high.

But this crowd faces a real dilemma, for there are multiple bandwagons just ripe for the jumping upon. Which to choose?

The Web 2.0 juggernaut has been on tour for some time, despite the lack of a cogent definition. The AJAX gang have also been making a lot of noise, mainly because the Javascript weenies can’t contain their excitement at being in the popular group again.

But how and why does all this techno-fetishism get started?

Now Departing On Platform One

Welcome aboard the gravy train, ladies and gentlemen. Our next stop is Over-enthusiasm Central. Please be advised that critical thought and a sense of perspective are not permitted in the passenger compartment. Please ensure that your safety belt is unfastened while the red dollar sign is illuminated. We know that you have a choice of bandwagons, and thank you for your choice to bet the farm upon this one. We promise – this time it’ll be different.

The endless cycle of technological and methodological fashions that so characterizes our industry is the result of a symbiotic relationship between two groups – the sellers and the buyers.

The sellers are the parties who are out to create a “buzz,” generating a desire for some technology-related product. They include the corporate vendors of new technologies such as Sun and IBM. Alongside them are the pundits and self-promoters who are looking to make a name for themselves. They attach themselves to particular trends in order to cross-sell themselves as consultants, authors and speakers. Hot on their heels are the book publishers and course vendors, who appear with remarkable speed at the first hint of something new, with a selection of 500 page books and offsite training courses to ease your transition to the next big thing.

The buyers are the developers who hear the buzz and are drawn to it. And for many, that draw is very strong indeed, for a variety of reasons. First, many developers are fascinated with anything new simply because it is a novelty. The desire to play with new tech toys is what got many into IT to begin with, and is still their main source of enjoyment in their working lives. For others, the lure of a new technology lies in the belief that it might solve all their development woes (rarely is it stated directly, but that’s the tacit promise). It’s classic “silver bullet” thinking of the sort Fred Brooks warned against 25 years ago, but which is just as deceptively attractive now as then.

Incoming technologies have the same advantage over their predecessors that opposition political parties have over the governing party; the shortcomings of the existing option have been revealed through experience, but the shortcomings of the incoming option are unknown because nobody has any experience with it. This makes it easy to make the incoming option look good by comparison. You just focus on the problems with the old technology, while saying nothing of the problems that will inevitably accompany the new one. The newer option has an image that is unblemished by the harsh light of experience. The new technology is promoted as a key ingredient of forthcoming software success stories, but those pieces of software are just vaporware, and vaporware doesn’t have any bugs or suffer any performance or interoperability problems.

It should also be acknowledged that there is a psychological and emotional appeal to placing such emphasis upon the technological aspect of software development. It alleviates the burden of self-examination and introspection upon work practices. It is much easier and more comfortable to think of all one’s problems as being of external origin, leaving one’s self blame free. As long as the problem is “out there” somewhere, rather than “in here”, we can just jump from one silver bullet to the next in the hope that maybe this time the vendors have got it right. Heaven forbid that the way we apply those technologies should actually have something to do with the sort of outcome we achieve.

But think of this:

Of all the failed and troubled software development efforts you’ve been involved in, there is one common element … you.

Your Regularly Scheduled Program

Some developers enjoy this perpetual onslaught of marketing efforts, for it keeps them well supplied with new toys to play with. But some of us are both tired of the perpetual call to revolution, and concerned for the long term effect it has upon our profession. I belong to the latter group.

The main danger of this ever-changing rush to follow technological fashion is that it distracts us from focusing on those aspects of our work that really matter – the people who are doing the work and the working methods they employ. Do you think that the technologies you use really make much difference to the outcomes you achieve? I suggest they are generally quite incidental. To understand why, consider this analogy.

Suppose a group of professional writers gather together for a conference discussing the nature of the writing activity. You would expect them to broach such topics as plot, character development, research methods, editing techniques and so on. But suppose they spent their time discussing the brand of pen that they preferred to write with. If one author claimed “My writing has got so much better since I started using Bic pens” - would you not think that author might be missing something? If another claimed “That book would have been so much better if it’d been written with a Parker pen” - you might again think that the speaker has missed the point. If a third claimed “I write twice as much when I use a Staedtler pen,” you might think that the author is probably making things up, or at least trying to rationalize a behavior that is really occurring for emotional or psychological reasons. But isn’t this exactly what we developers do when we claim “This project would have been so much better if we’d written it in Ruby” or “I’m twice as productive writing in Java as I am in C++”? In other words, our focus is all wrong. We’re preoccupied with the tools we use, but we should be focused on the skills and techniques with which we wield those tools.

At the organizational level, this fixation with novelty often works to create a bad impression of IT’s capabilities and proclivities. If those that make the strategic technology decisions for a company are the type to get carried away with the latest fads, then that company can find itself buffeted by the ever-changing fashions of the technical industry, always switching from one “next big thing” to another, with no concern for long term maintenance burden and skills investment. It is easy to create a portfolio of projects implemented in a broad range of diverse technologies, requiring an equally diverse set of skills from anyone hoping to later maintain the project. A broad skill base is seldom very deep, so staff become neophytes in an ever-increasing set of technologies, none of which have been used for a sufficient time for them to gain a high level of expertise. From an outsider’s perspective, the IT section seems to be a bunch of boys playing with toys, terminally indecisive, that for some reason needs to keep re-implementing the same old applications in progressively newer and cooler technologies, though successive reimplementations don’t seem to be getting any better or more reliable. It seems that every six to twelve months they suddenly “realize” that the technologies they’re currently using aren’t adequate and a new technology direction is spawned. All that is really happening is that the novelty of one technology selection has worn off and the hype surrounding some new novelty is beckoning.

Think of the organizational detritus this leaves behind. You’ve got legacy VB applications that can only be maintained by the VB guys, legacy J2EE systems that can only be maintained by the J2EE guys, a few .NET applications that only the .NET guys can comprehend, and that Python script that turned out to be unexpectedly useful, which no one has been game to touch since the Python enthusiast that wrote it resigned last year.

How many companies, do you suppose, are now left with monolithic J2EE systems containing entity beans galore, that were written as the result of some consultant’s fascination with application servers, and their compulsion to develop a distributed system even if one wasn’t required? And how impressed are the executives in those companies who find themselves with an enormous, sluggish system that appears to have gone “legacy” about five minutes after the consultants left the building? Can we be surprised at their cynicism when they’re told their system will have to be rewritten because it was done poorly by people who didn’t really understand the technologies they were working with (how could they – they were learning as they went)? How can they leverage their technology and skill investments when both seem to become irrelevant so rapidly?

What’s The Better Way?

Thankfully, it doesn’t have to be like this. But avoiding the harmful effects of technology obsession requires some clarity.

At the organizational level, it requires senior technicians to have the maturity and professional responsibility to put the interests of the business before their personal preferences. It means developing technology strategies and standards based solely upon benefit to the business. It means remembering that there is no ROI on “cool.”

At the individual level, it means adopting a skeptical attitude towards the hype generated by vendors and pundits; and turning one’s focus to the principles and techniques of software development, which transcend any technology fashion. Your time and energy is better invested in improving your abilities and skills than in adding another notch to your technology belt.


  1. First published 7 Aug 2005 at http://www.hacknot.info/hacknot/action/showEntry?eid=77 

  2. After The Gold Rush, Steve McConnell, Microsoft Press, 1999 

  3. http://www.acm.org/constitution/code.html 

  4. http://www.ieee.org/portal/pages/about/whatis/code.html 

  5. Software Fundamentals: Collected Papers by David L. Parnas, Addison-Wesley, 2001 

  6. First published 11 Jan 2004 at http://www.hacknot.info/hacknot/action/showEntry?eid=43 

  7. First published 2 Oct 2004 at http://www.hacknot.info/hacknot/action/showEntry?eid=67 

  8. Fail Fast, Jim Shore, IEEE Software, Sept/Oct 2004, pg 21 

  9. Object Oriented Software Construction, 2nd Edition, Bertrand Meyer, Prentice Hall, 1997 

  10. Checking A Large Routine, Talk delivered by Alan Turing, Cambridge, 24 June 1950. 

  11. Eiffel’s Design by Contract: Predecessors and Original Contributions, Bertrand Meyer 

  12. First published 25 Sep 2003 at http://www.hacknot.info/hacknot/action/showEntry?eid=24 

  13. First published 4 Aug 2003 at http://www.hacknot.info/hacknot/action/showEntry?eid=13 

  14. First published 27 Aug 2006 at http://www.hacknot.info/hacknot/action/showEntry?eid=89