Wednesday, December 21, 2016
I had the privilege last month to give a keynote lecture on “Big Data, Behavioral Change and IOT Architecture” at the Euro-CASE Annual conference on “Big Data – Smarter Products, Better Society”. You may download the slides here. My lecture was divided into three parts: the first was about our NATF report on big data, the second focused on behavioral change and the last part presented some of my views about IOT architecture. I have already covered the first part and the last part in previous blogposts, so today I will talk about behavioral change.
There is an obvious link between Big Data, Internet of Things and Behavioral Change. Many of the “smarter products” leverage IoT technologies and big data to help us change our behavior. This is true for wearables that are intended to help us take better care of our health and well-being, but it is also true of many products for your car or your home. The IoT technology is used to capture data through sensors and provide feedback through screens, speakers, motors, actuators, etc. Big Data methods are applied to extract value from the captured data so that the overall feedback experience is “smart” – hence the “smart product” subtitle for this conference. However, it turns out that changing behavior is hard, and this is not a matter of technology, it is a matter of psychology. There is a fair amount of science that may be leveraged, but there is no silver bullet: designing digital objects or experiences that help you change your behavior is a difficult project. I am neither a behavioral scientist nor a psychology expert, so this post is a short introduction to the topic. I am just trying to make a few cautionary points and to open a few doors.
This post will follow the same outline that I used during the conference. The next section (Section 2) sets the landscape of behavioral change with respect to “smarter products” and IoT. The goal is to move the focus in IoT from data to user-centric design - which was my conclusion at the end of the lecture. Behavioral change requires time, stories and emotional design. Section 3 is a short summary of a NATF working group that spent a year understanding how people react to the exponential rate of change of ICT (information and communication technologies). The key takeaway is that there is no fear of ICT, but there exists adaptive stress. That stress may be relieved if we design digital experiences as learning experiments – quoting from Mary Helen Immordino-Yang, “the goals and the motivations of the digital environment should be readily apparent”. Section 4 draws on a few well-renowned scientists and sources to see how fun and learning may be embedded into digital experiences. The last section applies this to smart objects whose ambition is to coach you to change your behaviors towards a better or healthier lifestyle. The need to weave emotions, fun, self-learning and reflective story-telling leads to systemic serious games. Behavioral change requires a systemic posture, because of the importance of feedback loops, adaptive planning, and chronology. It also requires designing “smarter products” as games with a focus on user emotions, story-telling and pleasure.
When experimenting with a connected wearable or device, most users do not want a dashboard, they want a story. I have already covered this in a previous post; to go further I suggest that you read “Inside Wearables - How the Science of Human Behavior Change Offers the Secret to Long-Term Engagement”. Owners of connected devices quickly become bored with their data dashboards, once the excitement of the first days has faded away. The story of wearables that are offered for Christmas and forgotten a few months later is a perfect illustration. Self-tracking is a good and healthy habit – recommended by psychologists in many situations – but self-tracking without sense does not work because not everyone is a data scientist. This is wider than the field of health improvement and connected wearables: similar observations have been made about smart home connected devices. Remote control and monitoring through your smartphone is not enough value for the connected gadgets that we bring home – often as gifts.
As expressed in the previous post, connected devices must come with a story and a coach. If we look at the numerous behavioral change models, you need a good story to start you moving, and you need a coach to keep going. There are many references about the fact that we are moved, and hence remember better, by stories and not data sets, but I am partial to Nassim Taleb’s wonderful books. I strongly encourage you to read “Fooled by Randomness”. The importance of stories is deeply connected with the importance of emotions in learning, which I will evoke later. Stories trigger emotions that act as anchors in our learning process. One of the dominant behavior change models is the Transtheoretical Model (TTM). While stories are critical in the precontemplation/contemplation phases, the role of the coach is critical in the action/maintenance phases. The coach cannot be reduced to a feedback loop – otherwise dashboards would work. The coach must bring sense to the results that are collected by the connected device. Behavior change is hard; hence the coach role is difficult. The coach needs to provide the proper information at the right time, together with the right emotion, to keep the “why” (motivation) alive while taking care of the “how” (engagement). We will return to how the science of “nudging” (i.e., designing the choice architecture) may help to nurture user engagement.
Behavior change must be approached as a user-centric design challenge. The role of biorhythms and chronology is very important. For instance, attention span has a complex structure with specific rhythms. Transient attention is very short (less than 10 seconds) – this is how magicians and conjurers operate – while focused attention lasts less than 10 minutes. The “coaching content” needs to be delivered at the right moment, for the right duration and in the right “state of mind” from an emotional standpoint. A lot is known about demotivation and habit-formation cycles, but this is not a hard science: little data is available and there are many controversies. Still, it looks like we need two months on average (66 days) to create a new habit, with a “danger zone” three weeks after the start (21 days) when motivation is at its lowest. This is consistent with a rule of thumb of elementary school teachers that says that a new concept must be explained once, then repeated one day later and three weeks later.
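These timing rules of thumb are easy to turn into a coaching feature. Here is a minimal sketch (the milestone days come from the averages quoted above; the messages and function names are my own illustrative choices, not a validated protocol) of a reminder schedule for a new habit:

```python
from datetime import date, timedelta

# Illustrative milestones from the rules of thumb above:
# reinforce at day 1, watch the "danger zone" around day 21,
# and treat day 66 as the average habit-formation horizon.
MILESTONES = {
    1: "repeat the explanation (one day later)",
    21: "danger zone: motivation is lowest, reinforce the 'why'",
    66: "average habit-formation horizon: celebrate and consolidate",
}

def coaching_schedule(start):
    """Return (date, message) pairs for a habit started on `start`."""
    return [(start + timedelta(days=d), msg)
            for d, msg in sorted(MILESTONES.items())]
```

A coaching app would feed these dates to its notification system, adjusting the content to the emotional state discussed above.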
Faced with the behavior change challenges, we need as much help as possible from social sciences, psychology, and neurosciences. Neurosciences have become very relevant in the last decade because we have learned a lot about the way the brain works and learns. Since the best-seller from Antonio Damasio, “Descartes’ Error”, we know that emotions play a critical role in our thinking and learning. I am quoting once again from the great book “Emotions, Learning and the Brain” by Mary Helen Immordino-Yang: “It is literally neurobiologically impossible to build memories, engage complex thoughts, or make meaningful decisions without emotion”. She explains very clearly that “Emotional Learning Shapes Future Behavior”: “The learner’s emotional reaction to the outcome of his efforts consciously or nonconsciously shapes his future behavior, inciting him either to behave in the same way the next time or to be wary of situations that are similar”. The last chapter of the book is entitled “Perspective from Social and Affective Neuroscience on the Design of Digital Learning Experience”. It is very relevant and a great read for anyone trying to help users change their behavior through connected devices and digital experiences. Here is a last quote from this chapter: “Here we turn the tables and suggest that many people may interact with their digital tools as if they were social partners, even when no other humans are involved. Thinking of digital learning as happening through dynamic, supported social interactions between learners and computers changes the way we design and use digital technologies for learning—and could help shed light on why we become so attached to our devices”.
In 2015 the NATF ICT commission conducted a series of interviews related to the effects of ICT usage. We interviewed leading sociologists and psychologists, such as Francis Jauréguiberry, Dominique Cardon or Serge Tisseron, to better understand how digital experiences were accepted and appreciated by the average person. We wanted to better understand the tension, one could even say the paradox, between an ever-growing usage of smartphones, the internet, and new digital services, while at the same time there exist clear and growing “distrust signals”. We started our discussions with “fears”: fear that digital communication was cutting people off from “real communication”, fear that Google was making us stupid, etc. The conclusion from the majority of the interviews is that ICT adoption is indeed fast and widespread, and actually well received by the vast majority of users. Digital usage adds to, but does not replace, real life, and most people value “real life contacts” over digital ones. This is a complex topic that would deserve a separate post. Here I will just point out some of the conclusions or recommendations because they are clearly related to learning and behavior change.
The main common idea from our experts is that the worries that are being expressed about ICT usage are the symptoms of “adaptation stress”. The rate of technology change is faster than the rate at which usage changes, which is itself much faster than the rate at which we understand these technology changes. We live in, and most of us welcome, a “world of accelerated permanent change”, where our products and services are constantly “upgraded” (we hate these “updates” because we do not understand them, they usually come when we do not expect them and they are forced on us). The main worry that is a consequence of this adaptation stress is the fear of not being in charge, the lack of mastery, especially from a time management perspective. Users who are interviewed by sociologists complain that they are no longer in charge of their own time. They see ICT usage as taking too much of their free time, with a great difficulty to reclaim control (e.g., the fear of missing out). The “digital detox” approach is a classical counter-reaction to this feeling.
The main recommendation from this workgroup is, quite logically, to spend more effort on training and explanations related to new digital products and services. The best way to reduce the stress of “losing control” is to give back the sense of “being in charge” with practical training. For instance, “digital life hygiene”, that is, the practice of digital usage control, with both temporal and spatial zones of “digital detox”, deserves to be taught. Digital training works best when it is both practical (in the “learn by doing” philosophy) and rooted in the real world, using devices, real-life environments and situations to embed the conceptual learning into a kinesthetic experience (in the tradition of Maria Montessori). This idea of “inviting the real world back into the virtual one” came in many forms. A great piece of advice for teenagers and adults alike is to read aloud a message that is about to be sent electronically (SMS, chat, email, …) if a complaint is involved. Neuroscience shows that reading aloud forces the facial muscles to express emotions, which are then carried to our “mirror neurons” so that we instantly feel what the effect on the other person may be (and then possibly adjust our message). Another set of recommendations about how to build better-accepted digital experiences was related to emotions: how to adapt the experience to user emotions (emotional design) but also how to leverage emotions as a training tool.
We all know that users don’t read “user manuals” or documentation anymore. The challenge in alleviating the “adaptation stress” is to deliver digital experiences where learning and training are part of the customer journey. This is especially true for connected devices and quantified-self digital experiences, as presented in Section 2. “Digital” means that data analytics is a given: we can analyze user journeys at each step of user experiences and measure both discovery and appropriation. From this, an appropriation maturity model may be built, which can be used as a guideline for an “embedded connected tutorial”. The use of IOT and connected devices gives the additional advantage of the continuous feedback loop. Still, if training is conceived as an additional “online tutorial experience”, it most often fails to deliver the engagement that is needed for behavior change. The real challenge is to design the complete digital experience as a learning journey.
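As an illustration, here is a minimal sketch of how such an appropriation maturity model could be derived from usage analytics. The event names, level names and thresholds are hypothetical placeholders of my own, not taken from any real product:

```python
from collections import Counter

# Hypothetical maturity levels, ordered from novice to expert.
LEVELS = ["discovery", "regular_use", "appropriation"]

def maturity_level(events):
    """Map a list of raw usage events to an appropriation level.

    The thresholds below are illustrative heuristics: an embedded
    tutorial could use the result to decide what to teach next.
    """
    counts = Counter(events)
    if counts["advanced_feature_used"] >= 3:
        return "appropriation"
    if counts["session_opened"] >= 10:
        return "regular_use"
    return "discovery"
```

In a real product these thresholds would themselves be calibrated from the analytics of the whole user base rather than hard-coded.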
Pleasure plays a key role in learning. The diagram shown to the right is borrowed from a lecture on learning in biology given at a complex systems conference that I attended a few years ago. All living beings, from very simple organisms to humans, build their behavior from this simple cycle (among other things). This is well recognized in design. I borrow this great quote from “The A-B-C of Behaviour”: “Fun is the means by which we retrain our brain to learn new patterns of behavior”. Fun and pleasure are introduced into a digital experience through many means, from rewards to surprise. Reward systems are heavily used in coaching or behavior change products. Surprise is a powerful emotion to trigger fun and to facilitate learning. I refer you to Michio Kaku’s explanation about the evolutionary role of emotions in his book “The Future of the Mind”, which I have already mentioned in a previous post. He sees humans as hard-wired to like surprises because they help to constantly tune our planning system, which is an evolutionary advantage. Intelligent beings make plans and predictions about their environment; a surprise occurs when what happens (a joke, a conjurer’s trick) is not what you were expecting. Because evolution has developed this pleasure from surprises, we are wired to explore and to learn. Mary Helen Immordino-Yang expresses a similar idea: “In this sense, emotions are skills—organized patterns of thoughts and behaviors that we actively construct in the moment and across our life spans to adaptively accommodate to various kinds of circumstances, including academic demands”.
Learning is also a social activity, which means we should leverage the power of communities when designing behavior change experiences. This is also well known in the design and digital world. Seth Godin has taught us to build viral experiences, where sharing is not an afterthought that is added to increase the spread of the product, but something that is at the core of the experience: “Virality is the product”. Experimental psychology and neuroscience show that we learn by imitation. The best way to build a tutorial is to show a video of someone else doing the very thing that needs to be learned (a great insight from the workgroup mentioned in the previous section). Dan Ariely, in his best-seller “Predictably Irrational – The Hidden Forces that Shape Our Decisions”, explains the importance of social norms. In many instances, social norms are much more powerful than money to motivate people. More generally, what behavioral sociology tells us about cognitive biases is very relevant to designing engaging learning experiments towards behavior change. For instance, Dan Ariely talks about the planning fallacy, the fact that we consistently underestimate the time it will take us to complete a task. Here a digital feedback loop may prove useful. Another fascinating example is the “high price of ownership”: “Ownership pervades our lives and, in a strange way, shapes many of the things we do”. Once we think that we own a thing, an idea, a goal … we overvalue what it represents. This is why the “emotion of ownership” is proposed as a goal by Don Norman; it is very relevant for digital experiences and explains why customization is such an important feature (make it your own). On the other hand, each choice is painful because of the effort that we make for any decision. Procrastination should never come as a surprise, and choice architectures should factor in the “consequences of non-decision”.
The best way to develop a learning experience that is woven into the overall experience is to “nudge” users towards behavior change. I am referring here to the “choice architecture” concepts popularized by R. Thaler and C. Sunstein’s best-seller “Nudge – Improving Decisions About Health, Wealth, and Happiness”. This book shows very interesting ways of designing choice frameworks that take our cognitive biases (anchoring, over-valuing the present versus the future, the availability bias: over-valuing what we have in front of us or at the top of our mind) into account. For instance, helping people to save more is a great behavioral change challenge. The “Save More Tomorrow” program showed how to use behavioral economics to increase employee savings. The section about “social nudges” is a source of inspiration for introducing priming into digital experiences. Experimental psychology has a lot to tell us about how to nudge and motivate. For instance, to return to the reward topic, science shows that it is better to break rewards down into many small ones than to give a larger one less frequently. This is why the practice of small rewards such as “badges” is so common in the digital world (combining two insights about social dynamics and frequency). The concept of “nudge” and choice architecture is also very relevant to designing “progressive onboarding”, that is, precisely the embedded incremental learning experience built into a digital product. To deliver the proper nudge, the experience designer must build a “usage and learning maturity model”, which is used to transform digital analytics (cf. Section 3) into an estimate of how much has been learned already. From this the user may be “nudged” with the proper “tool tips” (a tool tip is a tiny piece of information that is presented at the right time, according to the usage context).
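A minimal sketch of such a progressive-onboarding nudge, assuming a hypothetical catalog of tool tips, each gated by a minimum maturity level estimated from analytics:

```python
from typing import Optional

# Hypothetical tool-tip catalog: (tip id, minimum maturity level).
# Both the tip names and the levels are illustrative assumptions.
TIPS = [
    ("swipe_to_compare", 1),
    ("set_weekly_goal", 2),
    ("share_with_friends", 3),
]

def next_tool_tip(maturity, seen):
    # type: (int, set) -> Optional[str]
    """Serve at most one tip the user has not seen and is ready for."""
    for tip, min_level in TIPS:
        if maturity >= min_level and tip not in seen:
            return tip
    return None  # nothing to nudge: the user is up to date
```

Serving at most one tip per session is the "small, frequent" reward logic described above: each tip is a tiny increment of the embedded learning experience.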
There exists a wealth of insights that behavioral change can borrow from experimental psychology. In addition to Ariely and Thaler, it is logical to mention Daniel Kahneman and his wonderful book “Thinking, Fast and Slow” (which I have commented on in a previous blog post). This book contains wonderful examples related to “the marvels of priming”. For instance, hearing about other people changes your own ability: “This remarkable priming phenomenon – the influencing of an action by the idea – is known as the ideomotor effect. … The ideomotor link also works in reverse…. Reciprocal priming tends to produce a coherent reaction: if you were primed to think of old age, you would tend to act old, and acting old would reinforce the thought of old age”. Kahneman also illustrates our aversion to loss, which is closely related to the emotion of ownership: “We should not be surprised: losses evoke stronger negative feelings than costs. Choices are not reality-bound because System 1 is not reality-bound”. He explains a number of “fallacies” (in the sense of the “narrative fallacy” of Taleb), such as the availability bias (WYSIATI: What You See Is All There Is). The insights of “the law of small numbers” are very relevant to dashboards and tracking: humans are not good at analyzing small data sets, we tend to see stories and correlations everywhere. This is even true for professional statisticians: “It was evident that the experts paid insufficient attention to sample size”. The list of biases from System 1 (our fast thinking process, cf. “Blink” from Malcolm Gladwell) is summarized on page 105; this list is quite useful to improve the design of behavior change experiences. The combination of framing (using the power of words and emotion to build the choice architecture) and understanding decision weights (we overvalue low-probability events – the table on page 315 is an eye-opener) can be leveraged to “nudge” more efficiently.
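The overweighting of low-probability events can be made concrete with the probability weighting function that Tversky and Kahneman fitted in their 1992 cumulative prospect theory work (with γ ≈ 0.61 for gains); the sketch below computes the decision weight attached to a stated probability:

```python
def decision_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function for gains:
    w(p) = p^g / (p^g + (1-p)^g)^(1/g), with g = 0.61 fitted for gains.
    """
    num = p ** gamma
    den = (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)
    return num / den
```

For example, a 1% probability receives a decision weight of roughly 5.5%, which is why lotteries and rare-event warnings loom so large; conversely, high probabilities are underweighted.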
If we assemble everything together - the need for incremental learning, the necessity of pleasure, the pleasure of learning - the best digital experience that we may propose for behavior change is a “game to learn about yourself”. Technology is definitely available to help: IoT sensors may monitor the user and her environment, digital tools and user interfaces may be used to tell a story, and data science may be used to generate insights that feed self-discovery, learning and surprise (i.e., learning something new). Data Science is very relevant to develop such “serious games”. Machine learning algorithms are known to provide predictive, prescriptive and cognitive knowledge from data. Predictive analysis is very useful for the playful nature of the game. It is what makes a behavior change digital experience dynamic and interactive. Even if the prediction is not always accurate, it creates a surprise element and contributes to making the experience fun. Prescriptive analytics is about providing insights. This is a heavily debated topic because, as we all know, correlation is not causation. Still, experience shows that powerful insights may be drawn from data collected with multiple sources of IoT sensors. Last, cognitive analytics is about helping the user learn about herself. To build such a self-learning experience and to understand the related challenges of behavior change motivation, the book from Samantha Kleinberg, “Why – A Guide to Finding and Using Causes”, is a great source of insights.
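To make the surprise element of predictive analysis concrete, here is a minimal sketch that predicts the next reading from a moving average of past readings and flags a surprise when the actual reading deviates by more than a tolerance. The step-count framing, the 7-day window and the 30% threshold are illustrative assumptions, not a recommendation:

```python
def predict(history, window=7):
    """Naive forecast: the moving average of the last `window` readings."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def is_surprise(history, actual, tolerance=0.3):
    """Flag a surprise when the actual reading (e.g. today's step
    count) deviates from the forecast by more than `tolerance`."""
    expected = predict(history)
    return abs(actual - expected) > tolerance * expected
```

A game experience would use such a flag to trigger a playful message ("you walked far more than expected - what happened today?") rather than a raw dashboard number.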
The book is full of warnings, which are closely related to the biases evoked in the previous section, such as the following: “many cognitive biases lead to us seeing correlations where none exist because we often seek information that confirms our beliefs”; “It’s important to remember that, in addition to mathematical reasons why we may find a false correlation, humans also find false patterns when observing data”; and “Most critically for this book, the approach of interviewing only the winners to learn their secrets tells us nothing about all the people who did the exact same things and didn’t succeed”. This last quote is interesting because it emphasizes the limits of statistics and samples, in contrast with the power of personalized medicine and coaching. Samantha Kleinberg’s book is also very positive because it shows that understanding the “why” is critical for self-motivation in behavior change: “Will drinking coffee help you live longer? Who gave you the flu? What makes a stock’s price increase? Whether you’re making dietary decisions, blaming someone for ruining your weekend, or choosing investments, you constantly need to understand why things happen”. It is also a comprehensive source about the art of explanation, which is closely related to learning.
The game paradigm implies that you learn by doing. This also applies to learning about yourself as a system. The game becomes a search to discover new insights, as with a treasure hunt. The object of the game is not the “static you” but the dynamic version, a system that evolves constantly. Applied to weight loss, it means that the insights are not what your weight should be (a form of medical advice – no fun in that) but rather the behaviors that make you gain unnecessary weight (learning about yourself from your experience). This type of approach is natural for regular “quantified self” practitioners, but as we noticed, they are but a fraction of the overall population. This is a missed opportunity, in a sense, since self-tracking is good for you if you have the discipline for it. There are multiple references in all kinds of disciplines, from mental health and psychology to dieting or quitting smoking, including sports coaching. Quoting from another best-seller, by Gretchen Rubin: “Current research underscores the wisdom of Benjamin Franklin’s chart-keeping approach. People are more likely to make progress on goals that are broken into concrete, measurable actions, with some kind of structured accountability and positive reinforcement.” The challenge that a behavior change game must solve is how to leverage behavioral science to bring the benefits of self-tracking (insights and systemic self-discovery) to people who have neither the mindset nor the inclination for it. The system paradigm means that the user is “in the loop” and that the game must yield actions (and reactions) from which learning may be derived. For instance, experience shows that it is easier to nudge people into fixing approximated data than into entering it in the first place.
Let us conclude with the observation that systemic games for behavior change fit squarely in the field of P4 (Predictive, Preventive, Personalized, and Participatory) medicine. Prevention is the goal of behavioral change experiences; the predictive capabilities from data science are necessary to develop engaging experiences; behavioral change games are personalized by construction, operating under the assumption that we are all different when it comes to behavior change, from motivation to effects. Last, behavior change games are participatory by construction on an individual level (learning comes from acting) and may leverage the power of social and community nudges, provided customer privacy is respected. Personalized medicine is, actually, more concerned with small data than big data.