Tuesday, December 31, 2013

Seven Keys for Complex Systems Engineering



I gave a talk early this year at the “IRT SystemX” inauguration, about the challenges that occur when engineering “Systems of Systems”. This talk is a quick introduction to what we can learn from complex systems when designing large-scale interactive industrial systems. Complex systems are defined by their goals (purpose) and a set of sub-systems with rich interactions. The complexity of these interactions yields the concept of emergent behavior. Complex systems have a fractal nature, that is, they exhibit multiple scales, both at the physical/descriptive level and at the temporal level. Complex systems embed memory and have the capability to learn, which makes them both dynamic and adaptive systems. They interact constantly with their environment, which means that a dynamic vision of flows is more relevant than a static description of their top-down decomposition. Most complex systems renew their low-level components in a continuous process. Teleonomy and process analysis are, therefore, the most useful approaches to capture the essence of a complex system.

I have become gradually fascinated by the topic of complex systems because I find it everywhere in my job and my own research. Complex systems theory is the right framework to understand the management and the organization of modern enterprises. This is the topic of my other blog. All that is said about complex systems in the previous paragraph applies to a company. I have found that it applies to information systems as well. The main reason for creating this blog was the realization that the proper control for information systems has to be emergent, following the lead of Kevin Kelly and the intuition behind Autonomic Computing. Lastly, complex systems are everywhere when one tries to understand the most common business ecosystems, such as smartphone application development, smart homes or smart grids. I have talked about Smart Grids Players as a Complex System in this blog. More examples may be found in my keynote at CSDM 2012.

There is a paradox with the popularity of “complex systems science” in today’s business culture. On the one hand, the importance of complex systems’ concepts is obvious everywhere: systems of systems, enterprises, markets. On the other hand, the practical insights are not so clear. « Systems thinking » has become a buzzword and the word “complexity” is everywhere … yet many textbooks and articles which claim to apply “the latest of complex science theory” to business and management problems are either obscure or shallow. This is not to say that the complex systems literature lacks practical knowledge and insight. On the contrary, the following is a selection of some of the books which I have found useful during the last few years.



Today’s post is a crude and preliminary attempt to pick seven keys that I have found in these books which, to me at least, are practical in the sense that they unlock some of the complexity – or mystery – of the practical complex systems which I have encountered. There is no claim of completeness or rigorous selection. This is clearly a personal and subjective list which I consider a « work in progress ». This is just a list, so I will not develop each of the seven keys here, although each would deserve a blog post of its own.

  1. Complexity means that forecasting is at best extremely slippery and difficult, and most often outright impossible. This is, for instance, the key lesson from Nassim Taleb’s books, such as The Black Swan. The non-linearity of complex system interactions causes the famed butterfly effect, in all kinds of disciplines. If you line up a series of queues, such as in the Beer Game supply chain example, each queue amplifies the variations produced by the previous one and the result is very hard to forecast, hence to control (this depends, obviously, on the system load). This does not mean that simulation of complex systems is useless; it means that it must be used for training as opposed to forecasting. Following Sun Tzu or François Jullien, one must practice “serious games” (such as war games) to learn about complex systems from experience. This complexity also means that one needs as much data as possible to understand what is happening, and should beware of simplified/abstract descriptions. “God is in the detail” has become a very popular business idiom in the last decades.
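
This amplification is easy to reproduce. Below is a minimal Python sketch (the model and parameters are my own simplification, not the actual Beer Game rules): each stage forecasts by extrapolating the trend of the orders it receives, a naive rule that provably amplifies variability at every step of the chain.

```python
import random
import statistics

def bullwhip(stages=4, periods=300, k=0.5, seed=1):
    """Each stage orders what it sees plus a trend extrapolation
    (order = demand + k * recent change). This simple forecasting
    rule multiplies the variance of the order stream at every stage."""
    rng = random.Random(seed)
    # end-customer demand: small fluctuations around 10 units
    demand = [10 + rng.uniform(-1, 1) for _ in range(periods)]
    series = [demand]
    for _ in range(stages):
        d = series[-1]
        orders = [d[0]] + [d[t] + k * (d[t] - d[t - 1]) for t in range(1, len(d))]
        series.append(orders)
    # standard deviation of the order stream at each stage of the chain
    return [statistics.stdev(s) for s in series]

print(bullwhip())  # variability grows from customer to factory
```

Four stages are enough to multiply the standard deviation of the demand signal several times over, even though the end-customer demand barely moves.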

  2. Complex systems most often live in a complex environment, which makes homeostasis an (increasingly) complex feat of change management. Homeostasis describes the process through which a complex system continuously adapts to its changing environment. The characteristic of successful complex systems, in a business context, is the ability to react quickly, with a large range of possible reactions. This applies both at the level of what the system does and what it is capable of doing. This is illustrated by the rise of the word “agility” in the business vocabulary. The law of requisite variety tells us why detailed perception is crucial for a complex system (which is clearly exemplified by recent robots): the system’s representation of the environment should be as detailed/varied as the subset of the outside environment that the homeostasis process needs to react to.
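
Ashby’s law can even be checked by brute force on a toy model (my own construction, not from the literature): disturbances and outcomes live in Z/6Z, the regulator picks one action per disturbance, and no strategy can squeeze the number of distinct outcomes below (number of disturbances) / (number of actions).

```python
from itertools import product

def best_outcome_variety(n, actions):
    """Try every regulator strategy (one action per disturbance,
    outcome = (disturbance + action) mod n) and return the smallest
    achievable number of distinct outcomes."""
    best = n
    for strategy in product(actions, repeat=n):
        outcomes = {(d + a) % n for d, a in enumerate(strategy)}
        best = min(best, len(outcomes))
    return best

for k in (1, 2, 3, 6):
    # Ashby's bound: outcome variety >= disturbance variety / action variety
    print(k, best_outcome_variety(6, range(k)))
```

Only a regulator with as much variety as the disturbances (k = 6) can force a single outcome; with fewer possible responses, residual variety necessarily leaks through.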

  3. Complex systems, because of their non-linear interactions in general, and because their components have both memory and the capability to learn, exhibit statistical behaviors which are quite different from “classical” (Gaussian) distributions. This is one of the most fascinating insights from complex systems theory: fat tails (power laws) are the signature of intelligent behavior (such as learning). In classical physics or statistics, all individual events are (most often) assumed to be independent, which yields the law of large numbers and Gaussian distributions. But when the individual events are caused by actors who can learn or influence each other, this is no longer true. Rather than the obvious reference to Nassim Taleb, the best book I have read on this is The Physics of Wall Street. This works both ways: it warns us that “black swans” should be expected from complex systems, but it also tells us that some form of coordinated behavior is probably at work when we observe a fat tail. There is another interesting consequence: small may be beautiful with complex systems, if adding many similar sub-systems creates unforeseen complexity! Classical statistics is all in favor of large scale and centralization (reduction of variability), whereas complex behavior may be better understood with a de-centralized approach. This is precisely one of the most interesting debates about smart grids: if there is no feedback, learning or user behavior change, the linear nature of electricity consumption favors centralization (and large networks); if the opposite is true, a system of systems approach may be the best one.
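
A tiny simulation shows how imitation alone turns a Gaussian-like world into a fat-tailed one. In this sketch (a simplified Simon/Yule-style model of my own, with made-up parameters), each new item either starts a fresh category or copies the category of a random earlier item — the “rich get richer” effect of actors influencing each other:

```python
import random
from collections import Counter

def category_sizes(n=20000, p_new=0.05, imitate=True, seed=3):
    """Grow n items. With imitate=True, a new item usually copies a
    random earlier item's category (size-biased attachment); with
    imitate=False it picks among the existing categories uniformly."""
    rng = random.Random(seed)
    labels, n_cats = [0], 1
    for _ in range(n - 1):
        if rng.random() < p_new:
            labels.append(n_cats)                 # innovation: a new category
            n_cats += 1
        elif imitate:
            labels.append(rng.choice(labels))     # imitation: rich get richer
        else:
            labels.append(rng.randrange(n_cats))  # independent uniform choice
    return Counter(labels)

for imitate in (False, True):
    counts = sorted(category_sizes(imitate=imitate).values(), reverse=True)
    mean = sum(counts) / len(counts)
    print("imitate =", imitate, "largest =", counts[0], "mean =", round(mean, 1))
```

With independent choices the largest category stays close to the mean; with imitation a few giant categories emerge — the power-law signature discussed above.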

  4. Resilience in complex systems often comes from the distribution of the whole system purpose to each of its subcomponents. This is another great insight from complex system theory: control needs to be not only distributed (to sub-systems) but also declarative, that is, the system’s purpose is distributed and the control (deriving the action from the purpose) is done “locally” (at the sub-system level). This idea of embedding the whole system’s purpose into each component is often referred to as the holographic principle, with a nice hologram metaphor (in each piece of a hologram, there is a “picture” of the whole object). This principle has been proven many times experimentally with information systems’ design: it has produced “policy-based control”, where the goals/SLAs/purposes are distributed in a declarative form (hence the word “policy”) to all sub-components. I gave the example of SlapOS in my IRT talk as a great illustration of this principle. This is also closely related to the need for fast reaction in the homeostasis process: agility requires distribution of control, with a bottom-up / networked organization similar to living organisms (for most critical functions). One of my favorite books that applies this to the world of enterprise organization is “Managing the Evolving Corporation” by Langdon Morris.
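
As a toy illustration (the names and the policy format are hypothetical, not SlapOS’s actual mechanism), here is what policy-based control looks like when the same declarative goal is copied into every component and enforced purely locally:

```python
# The "hologram": the whole system's purpose, present in every part.
POLICY = {"max_queue": 5}

class Component:
    def __init__(self, policy):
        self.policy = policy   # each sub-system carries the full policy
        self.queue = []

    def submit(self, job):
        self.queue.append(job)

    def enforce(self):
        """Derive the action from the purpose locally: shed the newest
        jobs whenever the declared limit is exceeded."""
        shed = []
        while len(self.queue) > self.policy["max_queue"]:
            shed.append(self.queue.pop())
        return shed

nodes = [Component(POLICY) for _ in range(3)]
for job in range(12):
    nodes[0].submit(job)       # overload a single component
rejected = [j for c in nodes for j in c.enforce()]
print([len(c.queue) for c in nodes], len(rejected))
```

No central controller ever inspects the queues; the global behavior (bounded load everywhere) emerges because each part holds the whole system’s purpose.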

  5. Efficiency in a complex system is strongly related to the capability to support information exchange flows. There is a wealth of information about the structure of information networks that best support these flows. Scale-free networks, for instance, occur in many complex systems, ranging from the Web to the molecular interactions in living cells and including social networks. Scale-free networks keep the average distance between nodes small, among other interesting properties, which can be linked to avoiding long paths in communication chains, both for agility and resilience. The challenge that these information flows produce is represented by the product of the interaction richness (the essence of complexity in a complex system) and the high frequency of these interactions (our key #2) – the product of two large numbers being an even larger number. My other blog is dedicated to the idea that managing the information flows is the most critical management challenge for the 21st century (an idea borrowed from “Organizations” by March & Simon). For instance, the necessity to avoid long paths translates into versatility: complexity prevents specialization, because too much specialization generates even more synchronization flows. This communication challenge is not simply about capabilities (“the size of the communication pipes”), it is also about semantics and meaning. A common vocabulary is essential to most “systems of systems”, whether they are industrial systems or companies.
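
The short-paths property is easy to observe numerically. The sketch below (a simplified Barabási–Albert-style construction, with arbitrary sizes of my choosing) grows a scale-free network by preferential attachment and compares its average distance with a ring of the same size:

```python
import random
from collections import deque

def ba_graph(n=400, m=2, seed=5):
    """Preferential attachment: each new node links to m existing
    nodes chosen proportionally to their degree (via the stub list,
    where every node appears once per edge endpoint)."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    stubs = []
    for i in range(m + 1):            # small complete seed graph
        for j in range(i):
            adj[i].add(j); adj[j].add(i)
            stubs += [i, j]
    for v in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(stubs))   # degree-biased choice
        for t in targets:
            adj[v].add(t); adj[t].add(v)
            stubs += [v, t]
    return adj

def ring(n=400):
    return {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}

def avg_distance(adj):
    """Average shortest-path length over all pairs (BFS from each node)."""
    n, total = len(adj), 0
    for s in adj:
        dist, q = {s: 0}, deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(dist.values())
    return total / (n * (n - 1))

print(round(avg_distance(ba_graph()), 2), "vs", round(avg_distance(ring()), 2))
```

On 400 nodes, the preferential-attachment graph needs only a few hops on average between any two nodes, while the ring needs about a hundred — which is why scale-free structures help keep communication chains short.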

  6. Complexity in time is something that is difficult for humans to appreciate. Among the most critical aspects of complex systems are loops, mostly feedback loops. Peter Senge and John Sterman have written famous books about this. Reinforcing and stabilizing loops are what matter the most when trying to describe a complex system, precisely because of their non-linear nature. The combination of loops, memory and delays causes surprises to human observers. John Sterman gives many examples of overshooting, which happens when humans over-react because of the delay. Kevin Kelly gives similar examples related to the management of wildlife ecosystems. The lesson from nature is a lesson of humility: we are not good at understanding delays and their systemic effects in a loop. In the world of business, we find it difficult to understand the long-term consequences of our actions, or simply to visualize long-term equilibriums. Many people think that user market share and sales market share should converge, given enough years, without seeing the bigger picture and the influence of attrition rate (churn). Even simple laws such as Little’s Law may produce counter-intuitive behaviors.

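
The effect of a delay inside a feedback loop can be shown in a few lines. In this sketch (a deliberately crude model with invented parameters, in the spirit of Sterman’s examples), a controller pushes a level toward a target, but its corrections only take effect a few steps later:

```python
def regulate(delay, periods=30, gain=0.6, target=100.0):
    """Close the gap to the target with proportional corrections;
    with delay > 0, each correction lands `delay` steps after it is
    decided, so the controller keeps reacting to stale information."""
    level, history = 0.0, []
    pipeline = [0.0] * delay
    for _ in range(periods):
        if delay:
            level += pipeline.pop(0)                # a past correction arrives
            pipeline.append(gain * (target - level))
        else:
            level += gain * (target - level)
        history.append(level)
    return history

print(max(regulate(0)), max(regulate(4)))
```

Without delay the level converges smoothly and never exceeds the target; with a four-step delay the very same gain produces massive overshoot and growing oscillation — the over-reaction Sterman describes.
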
  7. Efficient control for complex systems is an emergent property. Control strategies must be grown and learned, in a bottom-up approach as opposed to a top-down design. We are back to autonomic computing: top-down or centralized control does not work. It may be seen as another consequence of Ross Ashby’s law of requisite variety: complete control is simply impossible. Adaptive control requires autonomy and learning. This is, in my view, the key insight from Kevin Kelly’s book, Out of Control: “Investing machines with the ability to adapt on their own, to evolve in their own directions, and grow without human oversight is the next great advance in technology. Giving machines freedom is the only way we can have intelligent control.” This insight is closely related to our key #4: autonomy and learning progressively transform distributed policies into emergent control. There is another corollary to this principle: such policies, or rules, should be simple, and the more complex the system, the simpler the rules. One could say that this is nothing more than the old idiom KISS, a battlefield lesson from engineering lore. But there is more to it; there seems to be a systemic law, borne out by business experience: only simple explicit rules provide long-term value to complex systems. Any rule that is complex has to be implicit, that is, constantly challenged and re-learned.
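
To close the loop with key #4, here is a minimal sketch (my own toy example) of simple local rules producing emergent global control: each node on a ring repeatedly averages its load with its right-hand neighbour, and an even global distribution emerges without any node ever knowing the global state:

```python
def balance(loads, rounds=500):
    """One simple rule per node: average your load with your right
    neighbour on the ring. The total is conserved and every node
    converges to the global mean without any central coordinator."""
    loads = [float(x) for x in loads]
    n = len(loads)
    for _ in range(rounds):
        for i in range(n):
            j = (i + 1) % n
            loads[i] = loads[j] = (loads[i] + loads[j]) / 2
    return loads

print([round(x, 3) for x in balance([40, 0, 0, 0, 0, 0, 0, 0])])
```

A single overloaded node ends up shared evenly (the mean, 5 units each here), illustrating how one simple rule, replicated everywhere, yields the kind of intelligent control Kelly describes.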


 