Thursday, May 8, 2008

Software Development Productivity Gains?

As my latest book nears publication, I figured I would publish a few pages from time to time to stir interest and gather some comments.

This first extract deals with productivity gains in the information system. This is a complex topic, and it is really the book as a whole that provides an answer to it; here we will merely highlight the most important points and indicate the chapters where they are described in greater detail. In this initial overview we distinguish the project aspect, dealt with in this section, from the operations aspect, dealt with in the following section. These can then be broken down into three major parts: technology, industry and architecture.

Since the dawn of IT, every year has seen the advent of new technologies that promise spectacular gains in productivity[1]. In the past, people talked about programming languages with a high level of abstraction, artificial intelligence, computer-assisted graphical design, program synthesis and automatic program checking, etc. Today, the hot topics in the literature are meta-programming, service architecture and component assembly, to name but a few. We will come back to these technologies and their promises in chapter 9. There is certainly progress, in terms of function points per dollar and of function points per man-day, but it is coming at a very slow rate. If I had to give an order of magnitude based on a comparison of the productivity figures from the late 80s with those found today, I would say that we have gained a factor of 2 in 20 years[2]. This is a matter of overall productivity (on the project perimeter) along the technical axis (in FP/man-day). The economic picture is different, because the IT industry has undergone significant changes.
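To put that order of magnitude in perspective, a quick back-of-the-envelope computation shows what "a factor of 2 in 20 years" means as an annualized growth rate. The figures are the rough orders of magnitude quoted above, not measured data:

```python
# Annualized productivity growth implied by "a factor of 2 in 20 years".
# Illustrative arithmetic only, using the rough figures from the text.
growth_factor = 2.0   # overall gain in FP/man-day over the period
years = 20

annual_rate = growth_factor ** (1 / years) - 1
print(f"Implied annual productivity growth: {annual_rate:.1%}")
```

That is roughly 3.5% per year, which is indeed very slow compared with the hardware progress of the same period.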

The software and system integration industry brings with it structural gains in productivity according to the following factors:
  • Globalization, which makes it possible to move part of production to countries with lower labor costs. This offshoring yields a substantial economic advantage when it is relevant, particularly for stable requirements. We shall come back to this in chapter 9.
  • Pooling of needs, which leads to the continuous release of software packages that capitalize on requirements shared by different companies. The associated gains may be quite sizeable at the level of a single project, but remain moderate when averaged over the ISD as a whole.
  • Maturing and professionalization of development practices, alongside the development of tools. The continued formalization and optimization of development processes, supported by standardized repositories[3], is a good practice that is vital for an ISD and its providers.


These productivity factors may have greater effects than the technology factors, but they are not easy to implement and require discernment about their scope of application.
The third area of progress is the information system architecture, which we distinguish from the technology section because it involves what is known as “enterprise architecture”, a subject that spans information technology, business processes and the “strategic alignment” of the information system. Enterprise architecture serves as a complexity-control strategy, so that the integration cost of a new application does not reach an irreversible level as the IS grows. It is mostly a defensive strategy, used once complexity has reached the point where integration becomes the predominant part of IT projects. As a result, the effect observed is not a “productivity miracle” but a return to a “comfort zone” where integration costs are proportional to the functional complexity of the application.
The conclusion we can draw at this stage of the analysis is that there are undisputed productivity factors in information system development, but they are not necessarily easy to deploy, and the gains remain modest.


The visible part of these gains is all the more modest because another part is consumed by three heavy trends in enterprise IT[4]:

  • Additional requirements regarding availability, security, ergonomics, etc., which induce more complex IT solutions for an equivalent functional perimeter.
  • Interoperability requirements, whether within the company or with partner companies. Enterprise IT has been undergoing a double revolution for 20 years. First, its functional scope grew along with inter-application connectivity requirements (unlike the departmental IT of the 70s and 80s). Second, the concept of the extended company arose, i.e. the need to connect the information systems of companies that collaborate on processes.
  • The abstraction of software functions to control complexity. Controlling complexity leads to the creation of layered software, with layers corresponding to different levels of abstraction, making it possible to evolve each layer without having to grasp the system as a whole[5]. These methods make it possible to control very rich and complex application systems, but they generate additional costs (the actual size exceeds the expected size).
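The layering mechanism described in the last point can be sketched in a few lines of code: a small effective kernel ends up wrapped by successive generic layers, each added for a legitimate reason, so the total size grows well beyond the kernel itself. All names here are illustrative, not taken from any real system:

```python
# Minimal sketch of "layering": a small effective kernel wrapped by
# successive layers added over time. Each layer brings robustness or
# interoperability, and each adds code and indirection.

def kernel(x: int) -> int:
    """The original, effective computation."""
    return x * 2

def validation_layer(x: int) -> int:
    """Later layer: defensive checks added for robustness requirements."""
    if not isinstance(x, int):
        raise TypeError("expected an int")
    return kernel(x)

def generic_interface(payload: dict) -> dict:
    """Latest layer: a generic request/response wrapper for interoperability."""
    return {"result": validation_layer(payload["value"])}

# A caller now traverses every historical layer to reach the kernel.
print(generic_interface({"value": 21}))  # → {'result': 42}
```

Each layer is individually reasonable, which is exactly why the "actual size exceeds the expected size" once several generations of requirements have been deposited.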


Users of consumer software such as office suites will have noticed that more and more resources are required to run their favorite applications (word processing, for example), largely for these three reasons.


[1] We have already referred to F. Brooks’ article entitled “No Silver Bullet: Essence and Accidents of Software Engineering”, which examines the intrinsic difficulties in terms of software productivity. This article can be found in T. DeMarco and T. Lister’s Software State-of-the-art, which includes a selection of the best articles from the late 80s.
[2] This assessment is an average ratio that hides a great deal of variability, depending on the type of problem to be dealt with. This becomes clearer in the inter-company benchmark results that are published every year.
[3] The repository which is gradually becoming the standard one in IT project development is CMMI (Capability Maturity Model Integration). For more on CMMI, see the first section of M.B. Chrissis, M. Konrad and S. Shrum’s CMMI – Guidelines for Process Integration and Product Improvement. You can also find many sources online, including those of the SEI (Software Engineering Institute).
[4] This phenomenon is more commonly known in technology circles as the “rebound effect”. For example, the efficiency increase for heating a square foot of living space is offset by the increase in the average house size; the progress in car fuel consumption has helped to develop the usage, instead of simply generating savings, etc. In IT, the “complexity race” absorbs a proportion of the gains by the rebound effect.
[5] The concept of “layers” is hard for non-specialists to grasp. A geological analogy is quite apt: layers correspond to ages, with successive deposits of “functions”. For example, I once analyzed in detail the code of a major American software vendor, on typical functions whose performance seemed disappointing. Using the debugger as an analysis tool revealed several “layers” that had built up as the software underwent successive changes (from an initially effective kernel to multiple generic interfaces). This concept of “sedimentation” will be introduced in section 2.4.1.
