Future-Proofing Organizations #KMWorld

Session Description:  As our world continues to change at a rapid pace and take unexpected turns, our organizations have to be prepared to deal with what’s coming next even if it is unanticipated. Our popular speaker shares his strategies for future-proofing your organization.

Speaker: Dave Snowden, Director, Cynefin Centre, Bangor University, Wales; Cognitive Edge

[These are my notes from the KMWorld 2016 Conference. I’m publishing them as soon as possible after the end of a session, so they may contain the occasional typographical or grammatical error. Please excuse those. To the extent I’ve made any editorial comments, I’ve shown those in brackets.]


  • What’s the Current State? 
    • We are suffering a modern malaise — too many years of struggling to fit the complexity of life into the simplified, engineering view of the world dictated by systems thinking.
    • We have used tools like Myers-Briggs that contrive to squash and flatten people so they fit into predefined boxes. Snowden ran a controlled experiment at IBM that established that astrology was a more reliable basis for staff identification and team assignment than Myers-Briggs.
    • Techno-fetishism
      • The Nonaka (SECI) model launched thousands of failed KM initiatives.
      • The reduction of an artisan process to a simple methodology. The latest version of this is design thinking. You cannot master artisan processes in a two-day workshop.  It takes 2-3 years for the brain and body to co-evolve to the point that we can drive and talk at the same time. It takes 3-4 years for the brain and body to co-evolve sufficiently to apply expert knowledge. This is why apprenticeship is such an effective approach.
    • The false dichotomy of Order and Chaos. Despots throughout history have created or exploited chaos so that they can appear like heroes who promise (and occasionally deliver) order. We should adopt a more nuanced, less Manichean view of the world.
    • The Cult of Measurement. Six Sigma is a cult — its priests have different colored belts. Black belts do no real work because their job is to impose cult discipline.
      • PROBLEM: When people work for explicit rewards (e.g., hitting measured targets), their intrinsic motivation is destroyed.
    • The Intolerance of Deviance — HR departments create norms of how we should be. However, people are natural deviants. Yet we are forced to adhere to a particular view of how we should be.
    • The Obsession with the Strong Leader. This obsession ignores the fact that we work best with distributed leadership where different people contribute their unique talents and judgment.
    • The Anglo-Saxon Malaise: this is related to our over-emphasis on the individual. Yet we work best in communities.
    • The Tyranny of the Herds. The principle of democracy is that people should make individual decisions and those decisions collectively produce the wisdom of the crowds. However, if you permit opinion polls, then people start gaming the system and produce the tyranny of the herd. (He asserts that opinion polling should be banned during election season.)
      • Crowdsourcing is NOT the wisdom of the crowds.
    • The Naturalistic Fallacy — David Hume teaches that you should never derive an “ought” from an “is.” Just because things are a certain way does not mean they ought to be that way.
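The “wisdom of the crowds” point above rests on a statistical mechanism: independent errors tend to cancel when averaged, while shared (herded) errors do not. A minimal simulation sketching this idea (the noise level, the anchoring weight, and the size of the poll’s bias are illustrative assumptions, not figures from the talk):

```python
import random

random.seed(42)

TRUE_VALUE = 100.0  # the quantity the crowd is trying to estimate
N = 1000            # number of people in the crowd

# Independent guesses: each person's error is private noise,
# so errors tend to cancel and the crowd mean lands near the truth.
independent = [TRUE_VALUE + random.gauss(0, 20) for _ in range(N)]
independent_mean = sum(independent) / N

# Herding: everyone anchors heavily on a single published poll number
# (here, a biased early estimate), so errors are shared and do NOT cancel.
poll = TRUE_VALUE + 30.0  # an illustrative biased signal everyone can see
herded = [0.2 * (TRUE_VALUE + random.gauss(0, 20)) + 0.8 * poll
          for _ in range(N)]
herded_mean = sum(herded) / N

print(f"independent crowd mean: {independent_mean:.1f}")
print(f"herded crowd mean:      {herded_mean:.1f}")
```

Because everyone anchors on the same biased poll, the crowd mean inherits the bias instead of averaging it away; this is the sense in which published opinion polls undermine the independence that makes crowds wise.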
  • When to try novel solutions?
    • Start by asking: Where is the ecosystem? What stage is it at?
      • Snowden maps Geoffrey Moore’s Crossing the Chasm onto S-curve theory.
    • Dominant Predator Theory
      • During a period of dominance of a standard methodology, your best bet is to conform.
      • Once you see that the dominant predator (the standard methodology) is not working so well anymore, you have an opportunity to try something new because the old way is no longer reliable.
        • Six Sigma was developed to wring the last efficiencies out of an aging manufacturing system; its dominance is a signal that the old paradigm is nearing exhaustion, so you should look for new manufacturing methods.
    • Past competency stops us from seeing future novelty.
      • We see only that which we are trained to see.
      • Drew, Vo & Wolfe published a study in 2013 reporting that when 24 radiologists were asked to interpret a scan, 83% of them failed to notice the conspicuously large image of a gorilla inserted into the scan. Even those who looked directly at the gorilla did not register it. They saw only what they were trained to look for.
  • The Issues with Case-Based Evidence.
    • A fundamental obsession with case studies distorts our learning.
    • The Cobra Effect — when the British were in India, they decided there were too many cobras, so they announced a bounty for every cobra head turned in. People then set up cobra farms to guarantee a supply of cobra heads; the incentive made the problem worse rather than better.
    • The Butterfly Effect — a small thing can combine with other small things to create a big effect.
    • The Hawthorne Effect — if you do something new and pay attention to people, it will nearly always work the first time. However, you should not assume you can scale it. Until you really know WHY it worked, you should not replicate WHAT you did.
    • Cases are useful for explaining a situation. However, few cases have any predictive power. (Good science should have predictive power.)
      • If all you have is observations, you cannot scale.
      • You need to be able to explain WHY it happened using reliable science.
  • The Nature of the System Constrains how we can Act in It.
    • Start by understanding the nature of the current system
      • Ordered system — there are effective links in the system
        • checklists work
        • predictable, repeatable behavior
        • the whole = sum of the parts
      • Chaotic system — there are no effective links in the system
        • if you cannot contain the system, you have a crisis
        • if you can contain the system, you have an opportunity for innovation
      • Complex system — not a rigidly defined structure; it is ambiguous
        • variable links, permeable container
        • the whole is not the sum of the parts
        • use real-time feedback to moderate/modulate behaviors
    • The Law of Unintended Consequences — this is the only guaranteed feature of Complexity. If you know unintended consequences are inevitable, then you are ethically responsible for those consequences. Therefore, you should not make large, unmanageable interventions. Instead, make small safe-to-fail interventions in the present situation and then, once you have a body of evidence, announce the existence of these interventions.
      • This is in contrast to the usual corporate approach:  start by announcing a major initiative. In Snowden’s view, this inevitably dooms the initiative to failure.
      • The better approach is to set out on a journey rather than setting goals.
  • Distributed Ethnography.
    • Allow individuals to describe for themselves what is happening, rather than relying on experts. This empowers them and triggers novel solutions to tough problems.
    • Peer-to-peer knowledge flows are more effective than top-down mediated knowledge flows. Therefore, we need to engage people in the sensemaking.
  • New Theory of Change.
    • Discard the systems approach that starts by identifying a future perfect state and then tries to drag everyone into that future state. This appears in KM when we try to create the ideal future: a knowledge-sharing culture.
    • The better approach is to amplify what is working and diminish that which is not working. So, instead of striving for a distant goal, aim for the “adjacent possible.”
    • This translates into “nudging” the system into a better state rather than attempting to drag the system into that better state.