REVIEW - Software Quality - Producing Practical, Consistent Software


Title: Software Quality - Producing Practical, Consistent Software
Author: Mordechai Ben-Menachem, Garry S. Marliss
ISBN:
Publisher: Thomson Learning (1997)
Pages: 326pp
Reviewer: Tom Parke
Reviewed: August 1998
Rating: ★☆☆☆☆


This is a bad book. Where there should be clarity there is obfuscation; where there should be case histories and examples there are vague and unconvincing statistics; where there should be explanation there is rambling rhetoric.

It purports to provide a quality methodology. Actually it is a rambling, badly written commentary on the IEEE standard for Software Quality Assurance Plans (ANSI/IEEE Standard 730).

The authors fail to place the quality activity in context or to understand the software development process. They fail to discuss the relative costs and benefits of different quality activities. They fail to discuss the risks of a quality plan: the time and effort it takes, the increased cost of change, the danger of the quality tail wagging the development dog.

They fail to show any appreciation of software engineering. For instance they don't understand the difference between Software Requirements and Software Design. There is no discussion of requirements traceability (from requirements through specification to design and on to testing).

They provide examples of worksheets, forms and, in one case, a process model, yet discuss none of them. For instance, they provide a Code Inspection Summary report that seems to require a huge amount of manual effort (e.g. counting the number of loops, the number of jumps out of loops, the maximum number of nested levels) without any discussion of how these figures might be used, of the fact that they can be gathered automatically by tools, or of whether they are of any actual value.
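The figures in question are, incidentally, exactly the sort of thing a trivial tool can produce. As a minimal sketch (mine, not the book's, and assuming Python source purely for illustration), the standard library's ast module can count the loops, the jumps out of loops and the maximum nesting depth in one pass:

    import ast
    import sys

    class LoopMetrics(ast.NodeVisitor):
        # Collects the figures the book's summary report demands by hand:
        # loop count, jumps out of loops, maximum loop-nesting depth.
        def __init__(self):
            self.loops = 0
            self.breaks = 0
            self.max_depth = 0
            self._depth = 0

        def _visit_loop(self, node):
            self.loops += 1
            self._depth += 1
            self.max_depth = max(self.max_depth, self._depth)
            self.generic_visit(node)   # recurse so nested loops are counted
            self._depth -= 1

        visit_For = visit_While = _visit_loop

        def visit_Break(self, node):
            self.breaks += 1           # a jump out of a loop

    metrics = LoopMetrics()
    metrics.visit(ast.parse(open(sys.argv[1]).read()))
    print(f"loops={metrics.loops} breaks={metrics.breaks} "
          f"max nesting={metrics.max_depth}")

Whether the three numbers are worth collecting at all is a separate question - and it is precisely the question the book never asks.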

The text is pompous, full of important-sounding bluster that on closer inspection is vague or vapid. They love to use 'headline statistics' such as 'Fifty percent of software costs are directly attributable to error corrections' and 'only 10% of the errors are in the coding', while failing to cite sources for many of their figures, failing to analyse what they might actually mean and failing, in the end, to relate the figures to quality planning. The statistics are in fact only there to 'prove that quality is important'. I wanted to shout back, 'We know that; now can we talk about it like grown-ups?'

The authors' irritating use of statistics can be shown by two, albeit minor, examples.

They extend Moore's Law ('processing power increases by 48% a year') to claim that it applies from the time of Babbage. Moore's Law, which applies from the 1950s onwards, is startling enough without this silly, pointless and inaccurate extension. Babbage never actually built his Analytical Engine, and even if he had there is no scale on which to compare it to computers in the 1950s. Finally, they are claiming, in effect, that there was something like a 10 to the power 18 increase in processing power between Babbage and early computers. If we say early computers processed a thousand instructions per second, then this rates Babbage's machine at roughly one instruction every thirty million years.
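The arithmetic is easy to check (the dates and the 1,000 instructions-per-second figure are assumptions of mine, picked only to match the round numbers above):

    # 48% a year compounded from Babbage (~1840) to early computers (~1950)
    compounded = 1.48 ** (1950 - 1840)     # ~5e18, i.e. of the order of 10**18
    factor = 10.0 ** 18                    # the round figure used in the text
    early_ips = 1_000                      # assumed speed of an early computer
    secs_per_insn = factor / early_ips     # implied Analytical Engine speed
    years_per_insn = secs_per_insn / (365.25 * 24 * 3600)
    print(f"compounded growth: {compounded:.1e}")
    print(f"one instruction every {years_per_insn:.1e} years")  # ~3.2e7 years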

Secondly, they contrast this 48% annual growth in processing power with growth of 11% in the expressiveness of computer languages and of 4.8% in programmer productivity (and, to be fair, they do cite a reference for these highly dubious numbers). They then imply that this disparity is some sort of problem.

Clearly this is a nonsense - there is simply no relation between the speed of individual computers and the amount of software the world needs. If I buy a computer that is twice as fast as my old one, I don't suddenly require twice as much software. In any case, it is actually easier to write software that requires more processing power than less, and new software does always seem to use all that increased CPU power. For instance, I noticed the other day that Visual C++ 5.0 on a 200MHz Pentium compiles just as slowly as the old UNIX v6 C compiler on a PDP-11 did nearly 20 years ago. It would appear that the complexity of programming languages and their development environments also increases at 48% a year!






