Monday, January 25, 2010

Taming Information Systems Complexity

This blog has been silent for a long time. I'll resume with a topic drawn from my course at Ecole Polytechnique: measuring and mastering the complexity of information systems. I have written many times, including in this blog, that the first job of a CIO is to master the complexity of her/his company's information system.

1. Which Complexity?

Although the difference between complex and complicated is fuzzy and varies according to the source (for instance, it is not supported by the TLF, the official dictionary of the French language), a usage has emerged: complicated is a matter of size and scope, whereas complex describes the nature of the relationships between the components of a system. Complexity (in the sense of complex systems) arises when the finality and the behavior of a system cannot be derived from those of its components (hence the concept of emergence). The three most common ingredients of a complex system are:

  • Feedback loops (and the non-linear behavior that results from amplification)
  • Delays (especially long-term delays), which generate "temporal complexity" that can easily puzzle us.
  • The human factor, that is, the presence of humans as components of the global system.

Information systems are both complicated and complex. The fact that information systems are complex systems is something that I have touched upon in previous posts. I will give examples of "complexity and emergence" in a future post; here I want to address complexity from a practical angle, as it appears to the CIO. Here is a summary of what makes information systems complex:

  • Too many things: the sheer number of components, applications, and interfaces. Although standardization and the automation of component management help to master this dimension, size is obviously part of the problem (the information system of a small company is neither complex nor complicated).
  • Too many interactions: these numerous components interact in many ways, both explicit and implicit. Reducing the number of explicit interactions is the goal of enterprise architecture, and technology (integration middleware) may help. Implicit interactions, such as the use of a shared resource, are subtler to track; reducing them is the goal of a modular architecture, which is more an art than a science.
  • Temporal complexity: many relevant time scales coexist, from very short-term delays, which require mastering so-called "real-time" behavior, to long-term life cycles, which demand that we step back and anticipate.
  • Human complexity: information systems are centered (or should be) around human users. This is the source of uncertainty, of plain errors (e.g., typing errors), and of interaction errors when users try to second-guess the system (which is unavoidable, since humans are intelligent; cf. Charles Perrow's remarkable book "Normal Accidents").


2. Measuring Complexity?

Measuring complexity is indeed difficult, and I know of fewer measures than there are "dimensions of complexity" as described previously.

The first dimension (size) amounts to assigning a weight to the information system, which is a combination of counting components and associating a weight with each of them. This is the best understood part (a minimal sketch follows the list):

  • Applications, or software components, can be measured using function points
  • Computing resources may be measured using tpmC (the metric of the TPC-C benchmark, and the most obvious choice for "commercial software"), but other, more specialized metrics/benchmarks are available for specific purposes.
  • Storage resources are easily measured in terabytes or petabytes.
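
Here is that sketch, in Python; the application names and numbers are made up for illustration. The point is simply that "size" is a weighted inventory, not a single count:

```python
# Minimal sketch of the "size" dimension: a weighted inventory of components.
# All application names and numbers below are hypothetical.
portfolio = {
    "CRM":     {"function_points": 400, "storage_tb": 2.0},
    "Billing": {"function_points": 900, "storage_tb": 8.0},
    "Portal":  {"function_points": 250, "storage_tb": 0.5},
}

total_fp = sum(app["function_points"] for app in portfolio.values())
total_tb = sum(app["storage_tb"] for app in portfolio.values())
print(f"{len(portfolio)} applications, {total_fp} function points, {total_tb} TB")
# -> 3 applications, 1550 function points, 10.5 TB
```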


The second dimension is structural complexity, which measures the richness of the explicit interactions between components. The most common example of such a measure is cyclomatic complexity, which counts the number of independent cycles within a graph (here, the interaction graph). Cyclomatic complexity was popularized a few decades ago and has proved useful for measuring software structure. A better approach for information systems is Euclidean Scalar Complexity (ESC). Given an architecture diagram with n objects carrying weights (w1, …, wn) and m edges between these objects, the Euclidean scalar complexity is defined as:

  • the square root of the sum of the products wi × wj, taken over all pairs (i, j) such that i = j or components i and j are linked by an edge.

It is one of the rare metrics that are scale invariant (insensitive to the "zoom effect" of drawing the same system at different levels of detail) and invariant under extension without information loss. For more information, you may download a research article by Caseau, Krob and Peyronnet.
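
To make the definition concrete, here is a minimal Python sketch on a toy diagram. The component names and weights are hypothetical, and I assume each undirected edge is counted once; the paper's exact convention may differ, but the principle is the same:

```python
import math

def esc(weights, edges):
    """Euclidean Scalar Complexity of a weighted architecture diagram.

    weights: component name -> weight (e.g., function points)
    edges:   pairs (a, b) of components linked in the diagram

    Convention assumed here: every component contributes w_i * w_i,
    and each undirected edge (i, j) contributes w_i * w_j once.
    """
    total = sum(w * w for w in weights.values())             # i = j terms
    total += sum(weights[a] * weights[b] for a, b in edges)  # linked pairs
    return math.sqrt(total)

# Toy diagram, hypothetical weights in function points.
weights = {"CRM": 400, "Billing": 900, "Portal": 250}
edges = [("CRM", "Billing"), ("CRM", "Portal")]
print(round(esc(weights, edges)))  # -> 1222
```

Note that the result is expressed in the same unit as the weights, which is part of what makes the metric behave well under zooming and aggregation.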

The third dimension is the complexity of implicit interactions, which is precisely what modularity is about. Although one may define a co-evolution distance as the probability (among all possible changes) that an impact on component A also yields a change in component B, this definition is too theoretical to be useful. My own experience suggests drawing specialized architecture diagrams for co-evolution (called "coupling" diagrams) and using ESC to measure the complexity of the resulting diagram (see the sketch after this list). What are the possible causes of co-evolution? Here is a short and incomplete list to illustrate the concept:

  • Objects: components that share business objects are co-dependent.
  • Processes: similarly, the existence of a business process that uses both components A and B links these two components (even if they never call each other directly).
  • User interface coherence: for instance, the requirement for coherent multi-channel access may create dependencies among components that are functionally independent.
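
As an illustration of the first cause (shared business objects), a coupling diagram can be derived mechanically from an inventory of which component manipulates which object, and ESC applied to the result. Everything below (components, objects, unit weights) is hypothetical:

```python
import math
from itertools import combinations

def esc(weights, edges):  # same ESC helper as in the previous sketch
    total = sum(w * w for w in weights.values())
    total += sum(weights[a] * weights[b] for a, b in edges)
    return math.sqrt(total)

# Hypothetical inventory: which business objects each component manipulates.
uses = {
    "CRM":     {"Customer", "Contract"},
    "Billing": {"Customer", "Invoice"},
    "Portal":  {"Customer"},
    "Catalog": {"Product"},
}
weights = {name: 1.0 for name in uses}

# Coupling edge whenever two components share at least one business object.
coupling = [(a, b) for a, b in combinations(uses, 2) if uses[a] & uses[b]]
print(coupling)                          # CRM-Billing, CRM-Portal, Billing-Portal
print(round(esc(weights, coupling), 2))  # sqrt(4 + 3) -> 2.65
```

Here Catalog shares nothing and only contributes its own weight; the ubiquitous "Customer" object is the obvious coupling to attack first.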


To summarize, here is the "measuring discipline" that I suggest to my students:
  • To be performed continuously: counting, sorting, and weighing components (e.g., function points)
  • To be performed once in a while: applying ESC to the usual architecture diagrams and maps, and producing specialized coupling charts (e.g., process interactions, business object life cycles …)
  • To be performed "on demand": a detailed complexity analysis to decide between two architectural options


3. Taming Complexity

I have collected the following list of approaches, ordered from simpler to harder. It is not drawn from "complex systems theory" but from practical experience, and I have found it quite thorough and effective while remaining practical (any comments on how to extend the list are welcome!).
  • Simple approach: draw diagrams and maps (cartography). This may sound silly, but drawing architecture diagrams is still the best way to cope with complexity, provided that the meta-diagram (the meaning of the graphical conventions) is well understood. This is what makes UML2 so useful.
  • Systematic approach: Enterprise Architecture (what we French call urbanization). Enterprise Architecture is, by construction, a method geared toward reducing information system complexity.
  • Technology approach: Infrastructure (middleware). As mentioned earlier, integration infrastructures clearly reduce structural complexity. This can actually be proven using ESC (a favorite exercise in my course); see the sketch after this list.
  • Common sense approach: Energetic Standardization. Reducing the heterogeneity of the components effectively reduces the complexity.
  • Hardest approach: modularity (de-coupling), that is, producing a modular architecture. As explained earlier, there is no guaranteed method; it is a skill that is learned through trial and error.
  • Strategic approach: SOA (governance) as a strategic answer to complexity. SOA has a very positive impact on modularity and favors pooling and reuse (hence a mechanical reduction of complexity). It also plays a crucial role in the governance of information systems, reducing the human complexity of satisfying the complete range of stakeholders.
  • Sustainable development of the Information System. This is a topic which I have already covered in a previous post. Sustainable development, as advocated by SITA, is a way to master temporal complexity and to avoid painful paradoxes.
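
To illustrate the middleware claim made above, here is a minimal sketch comparing the ESC of a point-to-point architecture with the same applications wired through an integration bus (unit weights and ten applications are arbitrary choices, not a real system):

```python
import math

def esc(weights, edges):  # same ESC helper as in the earlier sketches
    total = sum(w * w for w in weights.values())
    total += sum(weights[a] * weights[b] for a, b in edges)
    return math.sqrt(total)

n = 10
apps = {f"app{i}": 1.0 for i in range(n)}

# Point-to-point: every pair of applications talks directly.
p2p = [(f"app{i}", f"app{j}") for i in range(n) for j in range(i + 1, n)]
print(round(esc(apps, p2p), 2))            # sqrt(n(n+1)/2) = sqrt(55) -> 7.42

# Bus: the same applications talk only to an integration middleware.
with_bus = dict(apps, bus=1.0)
bus_edges = [(f"app{i}", "bus") for i in range(n)]
print(round(esc(with_bus, bus_edges), 2))  # sqrt(2n + 1) = sqrt(21) -> 4.58
```

As n grows, the point-to-point figure grows linearly (about n/√2) while the bus figure grows like √(2n): the formal version of the intuition that a hub tames structural complexity.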

If this is the practical list, what would the "theoretical" one add? Clearly, I would add the influence of biology (hence the theme of this blog) and "autonomic computing" to build systems that self-organize and self-manage their own complexity. This is an ongoing topic of reflection, to be covered in a future post.

 