In my last editorial (in Overload 60, called "An Industry That Refuses to Learn") I asserted that the software development industry has not made significant progress in the last quarter of a century. This assertion provoked enough of a response to fill the letters page in the following issue. I'm pleased about that, but at the same time, not so pleased. I'm pleased because I managed to provoke people into putting pen to paper - or rather, in this day and age, fingers to keyboard. I'm not so pleased because the response was one of overwhelming agreement, which suggests that any hopes I may have had that my experience was the odd one out are unfounded.
Once again it's my turn to write an editorial, and in search of inspiration, I dug out Overload 62 and reread Alan Griffiths' editorial "The Value of What You Know". In that editorial, Alan recounts how a colleague asked him how to return a NULL string - because in C the colleague would have represented the string using const char* and therefore could, and would, have returned NULL. The developer expected a simple answer because it never occurred to him that in the context in which he was working, a different solution might have been appropriate; in other words, returning NULL may or may not have afforded the best set of tradeoffs in the given situation.
Anyway, why am I going on about this? Well, it's because I have observed on many occasions over the last few years that when a developer comes up with a solution to a problem, they think the problem is solved and get on with implementing whatever it is they've come up with. Like the developer in Alan's story, they don't stop to consider that implementing a particular solution has its own set of consequences - or, putting it another way, they don't consider that there are tradeoffs to be weighed.
A recurring example of this is speeding up the lookup process in a data structure in memory, by keeping an index in memory in addition to the data. This approach makes the simple trade of using more memory in return for a gain in speed. Whether or not the tradeoffs are acceptable depends very much on the execution environment. For example, if the structure holds enough data to take up (say) thirty percent of a computer's memory, then the index is likely to be sufficiently large to have an impact on both speed and memory requirements. It should be noted that if measures must be taken to speed up element lookup, then there is an implication that the structure is likely to be large. Further, even if the structure will be large, an index will not be of any benefit if most of the elements searched for are near the starting point for the lookup (typically the beginning of the structure). The upshot of all this is that the solution using indexing is only a good idea if:
There is enough memory to support it
Referencing the index will not impose too much overhead on lookup speed, too much of the time
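To make the trade concrete, here is a minimal sketch (the names and record type are my own invention, purely for illustration): a container that keeps its elements in a vector and, alongside them, a hash-map index from key to position. The index buys constant-time average lookup in place of a linear scan, at the cost of one extra map entry per record - exactly the memory-for-speed trade described above.

```cpp
#include <cstddef>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical record type, for illustration only.
struct Record {
    std::string key;
    int value;
};

class IndexedStore {
public:
    void add(Record r) {
        // The memory cost of the trade: one index entry per record.
        index_[r.key] = records_.size();
        records_.push_back(std::move(r));
    }

    // The speed gain: O(1) average lookup via the index,
    // instead of an O(n) scan of records_.
    const Record* find(const std::string& key) const {
        auto it = index_.find(key);
        return it == index_.end() ? nullptr : &records_[it->second];
    }

private:
    std::vector<Record> records_;
    std::unordered_map<std::string, std::size_t> index_;
};
```

Note that if most lookups hit elements near the front of records_, a plain scan would already be cheap, and the index would cost memory while buying little - which is the second tradeoff above in code form.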
What we're heading towards here is Pattern territory. It's worth extending the discussion to consider Patterns, because the original idea of Patterns was that any particular Pattern captures not only a problem and a solution, but also the tradeoffs that must be accepted if the solution is adopted. The problem of speeding up lookup in a data structure and solving it using indexing, may or may not qualify as a pattern (there are other factors that are beyond the scope of this discussion). However, there is an analogy to be drawn, because indexing is a solution to the problem, but only if a certain set of tradeoffs - i.e. the two cited above, at least - are acceptable.
The concept of a Pattern originates in "The Timeless Way of Building" by the architect Christopher Alexander, first published in the late 1970s. The idea was imported into the software development community in the early 1990s. However, it was the publication of "Design Patterns: Elements of Reusable Object-Oriented Software" by Erich Gamma, Richard Helm, Ralph Johnson and John Vlissides, the "Gang of Four" (or "GoF"), that brought Patterns to the attention of the software development community at large, focusing on twenty-three patterns from the domain of object-oriented design. Over the years this book has become known as the "GoF book".
Unfortunately the GoF book, in its attempt to make Patterns more accessible, also (partly due to its presentation of Patterns, and partly due to the way it has been read by the development community at large) accidentally popularised some misconceptions. For example:
Patterns are for object-oriented design, and there are twenty-three only, no more and no less
A Pattern is a configuration of classes that works in more than one place.
Patterns are invented
Whereas, in reality (respectively):
Patterns occur at all stages of the development process
A Pattern captures a problem, a solution, and the tradeoffs involved
Very important: Patterns are harvested from existing practice/experience
In my Overload 60 editorial "An Industry That Refuses To Learn", I expressed my concern that by bringing Patterns to the attention of the development community at large, the GoF book has led to them being hijacked and turned into a buzzword - not because of any fault of the GoF, but because it is in the nature of the industry to do this. I am finding more and more that having the phrase "Design Patterns" on my CV is becoming advantageous when applying for contract work, when it comes to playing the inescapable "buzzword bingo" with the agencies. I have a recollection from an interview of being asked if I was familiar with "The Patterns Book"; I recall telling the interviewer, yes, I've read "Software Patterns" by James Coplien. Naturally that approach didn't get me anywhere. In passing, and for the record, I regard "Software Patterns" [1] to be a much better candidate for being The Patterns Book than the GoF book, because it is much more likely to leave the reader with an understanding of what Patterns are and what they can offer.
Above, I cited indexing as an example of speeding up lookup in a data structure subject to certain tradeoffs being acceptable. The industry's hijacking of Patterns has meant that the tradeoffs element has been lost. Given that the most effective means of learning is by example, the loss of tradeoffs means that a valuable example of considering tradeoffs when considering solutions has been lost.
Anyway, at this point I'd like to move on to a different but related topic, and this time on a more positive note. In this editorial and in my previous one mentioned above, I have talked about how the software development industry has hijacked concepts and practices and turned them into buzzwords. Well, it seems this has - by accident rather than design - led to an area of improvement, and I can write about something other than doom and gloom.
Back in the early days of the desktop PC, programmers typically just sat down at the computer and wrote code, testing their work by some sequence of random actions that involved running the program they were working on and seeing what happened. Their practices were unfortunately a far cry from the rigour of their mainframe counterparts. Today the computers used both as desktop workstations and as servers are much more advanced than the PCs of ten and twenty years ago. Further, the software that runs on them is much larger in scale and much more complex. However, all too often, the software development practices have not advanced. Until now, that is...
It seems that one of the age-old practices of the mainframe developers of old, namely that of unit testing, has resurfaced! You've probably heard mention of "Test-Driven Development" (aka "TDD" or "Test-First Development") in various places over the last few months. Well, a friend of mine went for an interview recently, and was asked what he knew about this practice. He got the job, and the fact that he'd read Kent Beck's book "Test-Driven Development" [2] played a part, going down well with the interviewer. I have noticed myself recently that this phrase is beginning to be mentioned occasionally when the agencies reel off their lists of buzzwords.
Granted, TDD is a bit more than just unit testing (it advocates that a piece of code's tests should be written before the code itself is written), but that's not really the point. The point is, there must be some ironic twist in the fact that unit testing is finding its way back into mainstream workstation/server programming simply because someone thought up a buzzword - or rather a buzz-phrase - to associate with it. Don't get me wrong though, I'm quite happy that TDD is becoming trendy. Sadly it was for the wrong reasons that TDD came to the attention of the development community at large, but it can only have a positive effect on the quality of software.
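The test-before-code discipline can be sketched in a few lines of C++ (the function and its test are my own trivial example, not taken from Beck's book): the test is written first, stating the behaviour we want; only then is just enough code written to make it pass.

```cpp
// Step 2 (written second): just enough implementation to
// satisfy the test below.
int add(int a, int b) { return a + b; }

// Step 1 (written first): the test states the behaviour we want
// from add(). Before add() existed, this would not even compile -
// the shortest possible feedback loop.
#include <cassert>
void test_add() {
    assert(add(2, 3) == 5);
    assert(add(-1, 1) == 0);
}
```

The cycle is then repeated: write the next failing test, make it pass, tidy up, and so on - each test doubling as a small, permanent piece of the unit-test suite.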
In "The Value of What You Know", Alan finishes with the sentiment to the readers: "I'll have to trust you'll have your own story to tell". Well, here I've told (one of) mine - or rather, I've started in one place and followed a line of thought to where it led me (there is an element of thinking out loud in this editorial). I've looked at Patterns - or rather, all that has gone wrong for Patterns - being adopted by the development community. I then moved on to look at how by a happy accident TDD has been adopted, but in this case to the benefit of the projects on which it is used.
Perhaps we should learn from experience with TDD and take stock of practices that we would like to see adopted more widely, and then sharpen our skills in coming up with buzzwords and/or buzz-phrases that are sufficiently catchy for the majority of developers and/or managers. Then, as what happened to TDD happens to other useful practices, maybe the "Buzzword Adoption Pattern" will start to emerge.
[1] Available as a free download from: http://www.bell-labs.com/user/cope/Patterns/WhitePaper
[2] Note that Kent Beck is completely open about the origins of unit testing. In his book "Extreme Programming Explained", he asserts that XP is built on the premise that its practices are of proven effectiveness.