After Four Years

By Alan Griffiths

Overload, 16(84), April 2008


After four years as editor of Overload it is time for a change.

Changing times

Out of curiosity I looked up the first editorial I wrote after taking over the editorship of Overload (the June 2004 issue). By a curious coincidence, that editorial opened by discussing change: I was optimistic that software development practices were evolving and that new ideas were being put into practice. By another coincidence, the editorial before it had - like the one before this one - been written by a guest: Mark Radford, who observed a (less positive) tendency for organisations to try techniques that are known to be ineffective.

After four years one might hope that evidence would be mounting to support my view of a changing industry. I do see wider adoption of good practices - for example, I rarely have to 'sell' unit tests or continuous integration. But I also see some of the same bad ideas that have persisted since they were debunked in 'The Mythical Man-Month'. (Adding people to a late project? It still happens, and it still makes things harder!)

Why is change so slow? Well, we are talking about changes to human behaviour - and human behaviour changes very slowly.

Over a long time-scale things definitely do change in the software development world - although it spent decades trying to fit development processes into a model that assumes correcting errors is so expensive that elaborate and costly precautions to avoid them are justified. This may once have been partially true: in the 1970s I can remember working in environments where one got one or two attempts to compile a program each day, and checking code by hand for syntax errors was an essential part of making progress towards actually executing it. In those circumstances, checking in advance was necessary.

Since then the technology supporting software development has changed. The discipline of hand checking the code became obsolete decades ago, when it became possible to run a compile and get the results faster than checking by hand. It became far more effective to throw the code at the compiler and deal with any diagnostics it produced.

The changes have gone further than that. In a typical modern development environment, syntax errors are highlighted as one types, and when the file is saved the code is automatically compiled and tested: the results appear in a moment. This leads to a mode of working in which the tests and the code are developed in parallel - and to some developers making the (unsurprising) observation that if one makes the tests easy to write, the corresponding code is easy to use.
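As a minimal sketch of what that parallel mode of working can look like (the names and the bare assert-based harness here are my own illustration, not any particular framework):

    #include <cassert>
    #include <string>

    // The test is written first and kept trivially easy to write -
    // which, in turn, pushes the interface towards being easy to use.
    std::string greeting(const std::string& name);

    void test_greeting()
    {
        assert(greeting("World") == "Hello, World!");
        assert(greeting("") == "Hello!");
    }

    // Just enough implementation to make the test pass.
    std::string greeting(const std::string& name)
    {
        return name.empty() ? "Hello!" : "Hello, " + name + "!";
    }

    int main()
    {
        test_greeting(); // run automatically on every save/compile cycle
        return 0;
    }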

Throughout the development cycle the costs of automation have fallen: building and testing an entire system to the point where it can be deployed can be automated and run at intervals ranging from every commit to every day. Automating deployment is also feasible, with a consequent reduction in the cost of a release compared with extended manual checking and deployment.

Given this, it is hardly surprising that software-for-use projects place an increasing emphasis on delivering a partial solution as early as is feasible, then regularly making incremental corrections and improvements based on user feedback. While these practices are not yet universal - and are less applicable to software-for-sale projects - the reductions in overall development cost and time to deliver are driving adoption.

Another sign of the times

In the early days of desktop computers a lot of effort went into moving data from one system to another. Even though I was working for a company whose main business was selling furniture, developing software for it involved writing device drivers, file transfer utilities and translators between different data formats and character encodings. I'm sure mine wasn't the only company incurring costs dealing with EBCDIC, ASCII line termination, national currency symbols and the like - none of it really part of the core business. And as the IBM PC became popular there were IBM's 'extended' ASCII code pages to deal with too.

On the other hand, as PCs became popular, file transfer and transformation utilities became readily available. But similar problems popped up in another area: office applications (like word processing). Each supplier created its own incompatible format, and the developers of these applications must have expended considerable effort reverse-engineering each other's formats and writing import and export functions. These worked inconsistently, and if you didn't know what someone else used the only reliable format was plain text (although, as noted above, there were still issues with character encoding).

Eventually, one of these application suites (Microsoft Office) established dominance, and its developers at least could relax and let the others worry about reverse-engineering competitors' products. Nice for them, and extra effort for anyone else wanting to compete in the 'office' market. And that has been the case for some years now.

However, a number of other parties have reason for wanting this to change: other software developers who want to compete in this market; other OS vendors that want to supply desktop systems; customers who want to use alternative products; and organisations that have a need to ensure continued access to documents.

This conflict of interest has focussed on Microsoft's attempts to get ISO to ratify its 'Office Open XML' standard. Ms Geraldine Fraser-Moleketi, the South African Minister of Public Service and Administration, recently described [Idlelo] the situation as follows:

...The adoption of open standards by governments is a critical factor in building interoperable information systems which are open, accessible, fair and which reinforce democratic culture and good governance practices... ODF is an open standard developed by a technical committee within the OASIS consortium. The committee represents multiple vendors and Free Software community groups. OASIS submitted the standard to the International Standards Organisation in 2005 and it was adopted as an ISO standard in 2006. South Africa is amongst a growing number of National Governments who have adopted ODF over the past year.

This past year has been marked by a raising in the tension between the traditional incumbent monopoly software players and the rising champions of the Free Software movement in Africa. The flashpoints of conflict have been particularly marked around the development and adoption of open standards and growing concerns about software patents...

It is unfortunate that the leading vendor of office software, which enjoys considerable dominance in the market, chose not to participate and support ODF in its products, but rather to develop its own competing document standard which is now also awaiting judgement in the ISO process. If it is successful, it is difficult to see how consumers will benefit from these two overlapping ISO standards. I would like to appeal to vendors to listen to the demands of consumers as well as Free Software developers. Please work together to produce interoperable document standards. The proliferation of multiple standards in this space is confusing and costly....

An issue which poses a significant threat to the growth of an African software development sector (both Free Software and proprietary) is the recent pressure by certain multinational companies to file software patents in our national and regional patent offices. Whereas open standards and Free Software are intended to be inclusive and encourage fair competition, patents are exclusive and anti-competitive in their nature. Whereas there are some industries in which the temporary monopoly granted by a patent may be justified on the grounds of encouraging innovation, there is no reason to believe that society benefits from such monopolies being granted for computer program 'inventions'. The continued growth in the quantity and quality of Free Software illustrates that such protection is not required to drive innovation in software. Indeed all of the current so-called developed countries built up their considerable software industries in the absence of patent protection for software. For those same countries to insist on patent protection for software now is simply to place protectionist barriers in front of new comers. As the economist, Ha-Joon Chang, observed: having reached the top of the pile themselves they now wish to kick away the ladder.

Between the time I'm writing this and the time you read it, the next round in this conflict will be over: toward the end of March, ISO will choose whether or not to adopt Microsoft's OOXML as a new standard by way of its 'Fast Track' process.

As you will know from past editorials, I'm of the opinion that OOXML is not currently fit to be a standard - and my opinion hasn't been changed by the changes voted through at the recent Ballot Resolution Meeting. As an illustration of why I feel this way, I refer you to Rob Weir's study 'How many defects remain in OOXML?' [Weir]. His conclusion (my emphasis):

That's as far as I've gone. But this doesn't look good, does it? Not only am I finding numerous errors, these errors appear to be new ones, ones not detected by the NB 5-month review, and as such were not addressed in Geneva. Since I have not come across any error that actually was fixed at the BRM, the current estimate of the defect removal effectiveness of the Fast Track process is < 1/64 or 1.5%. That is the upper bounds. (Confidence interval? I'll need to check on this, but I'm thinking this would be based on standard error of a proportion, where SE=sqrt((p*(1-p))/N)), making our confidence interval 1.5% ± 3%) Of course, this value will need to be adjusted as my study continues.
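For the curious, the arithmetic in that quotation does check out. Here is a quick sketch of the working (my own illustration, not Weir's code; the 1.96 multiplier for a 95% interval is my assumption, as he doesn't state one):

    #include <cmath>
    #include <cstdio>

    int main()
    {
        const double N = 64;     // sample size from the quotation
        const double p = 1 / N;  // upper-bound proportion fixed: < 1/64
        // Standard error of a proportion, as given in the quotation.
        const double se = std::sqrt(p * (1 - p) / N);
        std::printf("p   = %.1f%%\n", 100 * p);         // ~1.6%
        std::printf("+/- = %.1f%%\n", 100 * 1.96 * se); // ~3.0%, i.e. 1.5% +/- 3%
        return 0;
    }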

The number of errors isn't really surprising given the speed with which this draft standard was produced - we all know how hard it is to produce accurate technical information. And a standard of this size (it is bigger than SQL) needs a couple of years' worth of review - not the few months allowed by the Fast Track process.

Closer to home

The reason I was looking back to my first editorial of this term as editor is that I'm giving up the role again. I've enjoyed my time with the magazine, but I'm no longer moving it forward and it is time someone else had the opportunity to do something with it.

That someone is Ric Parkin, who has been on the editorial team for some time now and, while I disappeared on holiday, edited last year's October issue (so he does know what he's getting into). The team that has been supporting me for the last year remains in place, so I'm sure the magazine is in good hands.

Good luck Ric!

References

[Idlelo] http://www.raffee.co.za/post/29079077

[Weir] http://www.robweir.com/blog/2008/03/how-many-defects-remain-in-ooxml.html





