This blog has been dormant in 2010 while I was writing a third book, "Business Processes and Enterprise 2.0", an attempt to capture my past years' involvement with lean management and information flows. Now that the book is finished (I expect it to be published this spring), I am turning my attention back to autonomic/autonomous systems, networks and grids.
Although one could say that the promises of "Autonomic computing" (circa 2003/2004) have not materialized in the world of IT, the premises remain valid. My belief is that it will simply take longer to get effective technology in place in the world of corporate IT. As a research theme (which started much earlier and was very active in the 90s), "autonomous information technologies" (the combination of artificial intelligence, distributed control, adaptive software, quality of service monitoring … to name a few) is still very active.
I predict that significant additional R&D efforts will be deployed in the coming decade, because two related fields are becoming "extremely hot" while requiring the same kind of scientific advances to turn hype into practical innovation:
- Smart Grids, where the ambition captured by the word "smart" mirrors the goals of autonomic computing: self-adaptive, self-organizing and self-healing. There is no need to explain why smart grids are strategic to this century, but it is also easy to recognize the implicit difficulty of the endeavor. The heart of the smart grid principle is to evolve from the centralized management of today's power networks towards a distributed and adaptive design, whose feasibility remains to be proven on a large scale.
- Home Networks, which are growing like mushrooms in our houses and have already reached a complexity level that is unacceptable to most households. "Smart houses" are necessary to fulfill the promises of multiple industries: energy (smart, energy-efficient houses are actually part of the previously mentioned smart grids), content and entertainment (IP content anywhere, any time, on any device), home security and management, healthcare (for instance, out-patient care within the home), etc. Even though the various control/distribution networks may all share the IP protocol, the complexity of provisioning, pairing, routing and interconnecting is rapidly becoming an impossible burden. Here also, "self-provisioning", "self-repair" and "self-discovery" are quickly becoming customer requirements.
It would not be difficult to define an SDT (Smart Distributed Thing) that brings together the challenges of distributed information systems, smart grids and smart home networks. With this first post of the year, I'd like to introduce three ideas that I have been toying with during the past few months and which I intend to explore more seriously in the future.
- There is a lot of wisdom in applying biomimetics, replicating evolution, to produce autonomous systems. This is especially true for smart home automation networks. Rather than designing "a grand scheme" for "the smart house's nervous system", it is much safer to start with simpler subsystems, add a first layer of local control, then a few reflexes (a limited form of autonomy), then a second layer of global control, and end up with a "cortex" of advanced "intelligent" functions. Multi-layered redundant designs such as those produced by evolution are more robust (a key feature for a home automation control network), more stable (a key insight from complex system theory which is worth a post by itself) and more manageable. The need for recursive/fractal architecture is nothing new: I wrote about it with respect to information system architecture many years ago in my first book. I went from a global architecture (which was common when EAI was a catchword) to a loosely coupled collection of subsystems (so-called fractal enterprise architecture), for the same reasons: increase robustness, reduce operational complexity and, most importantly, increase manageability (the rate of evolution is not constant over a large information system). There is much more than fractal design involved here: the hierarchy of cognitive functions, from low-level pulses and reflexes to skills and then "creative" thinking, is equally suited to the design of an SDT.
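This layered design can be sketched in a few lines of code. The example below is purely illustrative (the device names, the "overheat" event and the class names are my own assumptions, not a real home-automation API): each layer only knows the layer below it, and "reflexes" absorb urgent events locally before the global controller ever sees them.

```python
class Device:
    """Lowest layer: a dumb actuator with local state."""
    def __init__(self, name):
        self.name = name
        self.on = False

    def set(self, on):
        self.on = on

class ReflexLayer:
    """Second layer: hard-wired local reactions, with no global knowledge."""
    def __init__(self, device):
        self.device = device

    def handle(self, event):
        # Reflex: cut power immediately on overheat, whatever the
        # global controller might decide later.
        if event == "overheat":
            self.device.set(False)
            return True          # event absorbed locally
        return False             # escalate to the layer above

class GlobalController:
    """Third layer: coordinates subsystems, sees only escalated events."""
    def __init__(self, reflexes):
        self.reflexes = reflexes

    def dispatch(self, event):
        for reflex in self.reflexes:
            if reflex.handle(event):
                return "handled locally"
        # Global policy for events that no reflex absorbed.
        if event == "away_mode":
            for reflex in self.reflexes:
                reflex.device.set(False)
        return "handled globally"

heater = Device("heater")
lamp = Device("lamp")
ctrl = GlobalController([ReflexLayer(heater), ReflexLayer(lamp)])
print(ctrl.dispatch("overheat"))   # absorbed by a reflex
print(ctrl.dispatch("away_mode"))  # needs the global layer
```

The point of the sketch is the escalation path: if the global "cortex" crashes, the reflexes still protect the devices, which is exactly the robustness argument made above.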
- Autonomous systems tend to scare end-users unless they embody the principles of calm computing. Calm computing is derived from the concept of ubiquitous computing (cf. the pioneering work of Mark Weiser at Xerox PARC), and addresses the concerns that emerge when "computers are everywhere (ubiquitous)". Calm computing is very relevant to SDT; I would summarize the three main principles as follows: a smart ubiquitous system must act "in the background" and not "in your face" (it must be discreet), it needs to be adaptive and learn from the interaction with its users (the complexity of the users must be recognized in the overall system) and, most importantly (from the user's perspective), it should be stoppable (you must be able to shut it down easily at any time). This becomes much easier with a fractal/layered design (previous point) and more difficult with a monolithic global design. There is a wealth of ideas in the early papers about calm technology, such as minimizing the consumption of the user's attention.
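The "stoppable" principle is the easiest of the three to make concrete in code. Here is a minimal sketch, under my own assumptions (the `SmartAgent` class is hypothetical, not taken from any product): a background agent that works in short cycles and can be shut down cleanly by the user at any moment.

```python
import threading
import time

class SmartAgent:
    """A toy 'calm' agent: works in the background, always stoppable."""
    def __init__(self):
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self.ticks = 0

    def _run(self):
        # Work in short cycles, checking at each cycle whether the
        # user has asked the system to stop.
        while not self._stop.wait(timeout=0.01):
            self.ticks += 1

    def start(self):
        self._thread.start()

    def stop(self):
        # The user must always be able to shut the system down cleanly.
        self._stop.set()
        self._thread.join()

agent = SmartAgent()
agent.start()
time.sleep(0.1)   # the agent quietly does its work in the background
agent.stop()      # ... and stops immediately when asked
```

The design choice worth noting is the cooperative shutdown: the agent polls a stop flag between small units of work instead of being killed, so it can leave the system in a consistent state — a property one would certainly want from a house's "nervous system".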
- The emergence of software ecosystems most often needs to be guided/shepherded, and rarely occurs in the wild as a random event. This is a key point, since it is widely acknowledged that software ecosystems (such as iPhone applications) are where innovation occurs (and where value is created from the end-user's point of view). In the realm of home networks, I have been advocating open architecture, open standards and (web) service exposition for many years, thinking that open standards for the "Home Service Bus" would attract an ecosystem of service providers. You create the opportunity and evolution/survival of the fittest does the rest (Darwin). The last two years spent thinking about sustainable development (i.e., analyzing complex systems' architectures) and looking at successful software ecosystems have made me reconsider my Darwinian position. I am much more a follower of Lamarck these days: I see a "grand architect" behind the success of many application stores, the iPhone being the obvious example. Open standards and open APIs are not enough; the spectacular failure of major Web Service exposure programs from large telcos is a good example. You need to provide SDKs (more assistance for the developer), a "soul" (a common programming model) and a sense of excitement/challenge/cool (which obviously requires some marketing).
This third point applies precisely to SOA (Service-Oriented Architecture). This is an observation that I have made before: reuse in the world of corporate IT does not occur easily or randomly; it requires serious work. To put it differently, to come up with a catalog of reusable services is not to deploy a service-oriented architecture (with the Darwinian hope that the "fittest" services will survive). To make SOA work, you need to organize (hence the word "architecture"), promote, plan and communicate. There is a need for a "grand architect" and a "common sense of destiny" for SOA to bring its expected benefits of sharing, reuse and cost reduction.
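The difference between a bare catalog of services and an architected one can be illustrated with a short sketch. Everything here is hypothetical (the `ServiceCatalog` class, the metadata fields, the service names are my own illustration): registration is only accepted when a service conforms to a shared contract — a small, concrete stand-in for the "common programming model" that the grand architect enforces.

```python
# Governance rule: every service must declare who owns it, its version,
# and which shared contract it implements (illustrative field names).
REQUIRED_METADATA = {"owner", "version", "contract"}

class ServiceCatalog:
    def __init__(self):
        self._services = {}

    def register(self, name, handler, **metadata):
        # Reject services that ignore the shared conventions, instead
        # of hoping that the "fittest" ones emerge by themselves.
        missing = REQUIRED_METADATA - metadata.keys()
        if missing:
            raise ValueError(f"service '{name}' missing metadata: {sorted(missing)}")
        self._services[name] = (handler, metadata)

    def call(self, name, *args):
        handler, _ = self._services[name]
        return handler(*args)

catalog = ServiceCatalog()
catalog.register("billing.invoice", lambda amount: f"invoiced {amount}",
                 owner="finance", version="1.0", contract="invoice-v1")
print(catalog.call("billing.invoice", 42))  # invoiced 42
```

The enforcement is deliberately placed at registration time rather than call time: that is where an "architecture" differs from a mere directory, and where the organizing, promoting and planning mentioned above get their technical teeth.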