Epiphany and Apophany


Human beings have an astounding ability to see patterns and apply them in new contexts…but how often do we see patterns that don’t truly exist, and what happens when those patterns are misapplied? In a complex domain it’s only in retrospect that we can understand how outcomes emerged, and we don’t get much more complex than human systems.

In this session, we look at our confirmation biases and how they might be preventing us from creating change, or enabling us to move forward in uncertainty. When we don’t know what’s going to work, how can we avoid our tendency towards root cause analysis? How can we find out what might work if we tried it? And how can we ensure it’s safe to fail?


Related to Cynefin.

Our tendency to see things that don’t exist, and then move forward with them => uncertainty.

Epiphany: “oh, I see it!”

Apophany: the same sense of epiphany, first described in schizophrenia patients: “oh, I see it!”, but the pattern is not actually there.

2 different elephant species:

  • African bush elephant
  • African forest elephant (they look very similar)

The ecologists who protect and track elephants know there are 2 different species, but they can’t track them separately, because their processes are stuck in time.

How does it happen that we don’t see the 2 species?

observe data -> filter data -> generate assumptions -> draw conclusions -> build beliefs -> filter more data

=> the more we believe, the more we filter the data
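A toy simulation of that feedback loop (all numbers and thresholds invented for illustration): confirming evidence always gets through, while disconfirming evidence has to pass a filter that tightens as the belief strengthens, so the belief ratchets upward even though the underlying data is 50/50.

```python
# Invented toy model of the belief -> filter -> belief loop.
import random

random.seed(42)
belief = 0.5  # strength of the belief, from 0 (none) to 1 (conviction)

for step in range(12):
    datum_confirms = random.random() < 0.5  # reality: evidence is 50/50
    if datum_confirms:
        belief = min(1.0, belief + 0.05)    # confirming data always lands
    elif random.random() > belief:          # disconfirming data must pass a
        belief = max(0.0, belief - 0.05)    # filter that grows with belief
    print(f"step {step}: belief = {belief:.2f}")
```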

Confirmation Bias

Apophenia: the tendency to see patterns in random data -> the Google effect …

There are 159 cognitive biases listed on Wikipedia => this makes it very difficult to handle uncertainty.

… a fundamental assumption of organizational theory and practice: that a certain level of predictability and order exists in the world.

– Dave Snowden, “A Leader’s Framework for Decision Making”

And this is simply not true!

The Innovation Cycle

  • Differentiators -> Spoilers -> Commodities -> Build-on Differentiators

Cynefin [ku-nev-in] (see the code sketch after this list)

  • Obvious: sense, categorise, respond
  • Complicated: sense, analyse, respond

    requires expertise, still predictable

  • Complex: probe, sense, respond

    outcomes are only predictable in hindsight

  • Chaotic: act (and quickly), sense, respond

    a lot of our desire for predictability is to avoid chaos
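A minimal sketch of the domains above as a lookup table (structure and names are my own, not from Cognitive Edge):

```python
# The Cynefin domains and their sense-making sequences, as listed above.
RESPONSES = {
    "obvious":     ["sense", "categorise", "respond"],
    "complicated": ["sense", "analyse", "respond"],
    "complex":     ["probe", "sense", "respond"],
    "chaotic":     ["act", "sense", "respond"],
}

def approach(domain: str) -> str:
    """Return the response sequence for a Cynefin domain."""
    return " -> ".join(RESPONSES[domain])

print(approach("complex"))  # probe -> sense -> respond
print(approach("chaotic"))  # act -> sense -> respond
```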

Note the difference between safe-to-fail and fail-safe (think of estimating a thing you have never developed before).

Lots of people treat complex as complicated because they want predictability

There is a transition between Obvious and Chaotic: the complacency zone, a cliff you fall over.

Example: Knight Capital: deploying to production was treated as obvious; they forgot to install the new code on 1 of 8 servers and reused an old feature toggle => fell into chaos.
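A hypothetical sketch of the toggle-reuse trap (invented names and logic, not Knight Capital’s actual code): when an old flag is recycled for a new feature, any machine still running the old binary re-activates whatever legacy behaviour that flag used to guard.

```python
# Invented illustration of reusing a feature toggle across binary versions.

FLAG_ENABLED = True  # one flag name, recycled for a brand-new feature

def handle_order_new_binary(order: str) -> str:
    """What the correctly-deployed servers run: flag means the new feature."""
    if FLAG_ENABLED:
        return f"route {order} via new logic"
    return f"route {order} normally"

def handle_order_old_binary(order: str) -> str:
    """What the forgotten server runs: the flag wakes up dead legacy code."""
    if FLAG_ENABLED:
        return f"repeat {order} aggressively (old test behaviour!)"
    return f"route {order} normally"

print(handle_order_new_binary("ORD-1"))  # intended behaviour
print(handle_order_old_binary("ORD-1"))  # chaos: legacy path re-enabled
```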

Estimating Complexity

  1. nobody has ever done it before
  2. someone outside the org has done it before (probably a competitor)
  3. someone in the company has done it before
  4. someone in the team has done it before
  5. we all know how to do it

start with the new stuff, not the stuff you already know -> spike it, prototype it
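A minimal sketch of using that familiarity scale to order a backlog (task names and fields invented); scores follow the list as written here, so 1 = most novel and 5 = most familiar:

```python
# Invented sketch: sort work so the least-familiar items get probed first.

def most_novel_first(backlog: list[dict]) -> list[dict]:
    """Lowest familiarity score (most complex) comes first."""
    return sorted(backlog, key=lambda item: item["familiarity"])

backlog = [
    {"task": "integrate new payment provider", "familiarity": 2},
    {"task": "another CRUD admin screen",      "familiarity": 5},
    {"task": "ML-based fraud scoring",         "familiarity": 1},
]

for item in most_novel_first(backlog):
    print(item["familiarity"], item["task"])  # spike/prototype the 1s and 2s
```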

A Safe-To-Fail Probe has … (sketched in code after this list)

  • a way of knowing it’s succeeding: indicators, not measures (measures get treated as targets!)
  • a way of knowing it’s failing
  • a way of dampening it
  • a way of amplifying it
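All names in this sketch are invented; the probe is represented as a record whose five fields must all be filled in before it runs:

```python
# A minimal sketch (invented names) of a safe-to-fail probe as a record:
# every probe declares up front how we will recognise success, how we
# will recognise failure, and what we would do to amplify or dampen it.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Probe:
    hypothesis: str                        # the coherent reason to try it
    success_indicator: Callable[[], bool]  # how we'd see it succeeding
    failure_indicator: Callable[[], bool]  # how we'd see it failing
    amplify: Callable[[], None]            # what we'd do more of if it works
    dampen: Callable[[], None]             # how we'd wind it down safely

    def review(self) -> None:
        """Check the indicators and respond; dampening wins on failure."""
        if self.failure_indicator():
            self.dampen()
        elif self.success_indicator():
            self.amplify()

# Usage: a probe is only safe to fail once all five fields are filled in.
pairing = Probe(
    hypothesis="pairing on unfamiliar stories may spread knowledge",
    success_indicator=lambda: True,    # e.g. people volunteer to pair
    failure_indicator=lambda: False,   # e.g. cycle time visibly worsens
    amplify=lambda: print("invite more teams to try pairing"),
    dampen=lambda: print("make pairing optional again"),
)
pairing.review()  # prints the amplify action
```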

Coherence

-> The problem with a hypothesis: it is just one example of what might happen. The side effects are equally interesting; if you only focus on the hypothesis, you may lose sight of them. Stay curious: looking at the side effects is equally important.

We jump to root causes so quickly. Within complexity you have a whole ecosystem: there is more than one cause, and there is no single root.

Coherence: a realistic reason for thinking the probe might have a positive impact

In high uncertainty …

… scenarios, estimates, and plans provide coherence, not tests.

A Safe-To-Fail probe is not a way of avoiding failure completely.

Real Options

Book: “Commitment”, a novel about managing project risk, by Chris Matts.

options expire

Closing options: signalling intent. You have to be very careful when communicating an experiment: “I would really like to try this out as an experiment; it might fail.” This avoids it becoming a commitment.

be really careful when communicating options so they stay options

you learn what you need while your options are still open (don’t commit too early)
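A minimal sketch of the expiry idea (all names invented): an option carries an expiry date, staying uncommitted keeps it open, and committing after expiry is an error, which makes “don’t commit too early, don’t let it expire” explicit.

```python
# Invented sketch of a real option: defer commitment while the option
# is open; committing an expired option is an error.

from datetime import date

class Option:
    def __init__(self, name: str, expires: date):
        self.name = name
        self.expires = expires
        self.committed = False

    def is_open(self, today: date) -> bool:
        return not self.committed and today <= self.expires

    def commit(self, today: date) -> None:
        if not self.is_open(today):
            raise ValueError(f"option {self.name!r} is expired or committed")
        self.committed = True

vendor = Option("contract with vendor A", expires=date(2026, 3, 31))
if vendor.is_open(date(2026, 1, 15)):
    pass  # keep learning instead of committing early
vendor.commit(date(2026, 3, 1))  # commit at the last responsible moment
```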

SenseMaker by Cognitive Edge

what you have in complexity is disposition, i.e. some things are more disposed to land than others

Is a football game complicated or complex? Is the outcome predictable?

Find the things that already work, amplify them, and get the people in the teams to amplify them too.

It is more important to amplify the things that are going right than to prevent the things that are going wrong.

When change starts happening, it frees up room for even more change, and you start to see the organisation shift.

The greatest gift to give yourself:

  • ignorance is bliss
  • listen to the stories: the good ones, because you remember the bad ones all too well
  • use existing dispositions: see if you can spot what will likely help change

If people don’t feel safe, they won’t take risks, they won’t run experiments, they won’t do great things.

=> It’s important to improve psychological safety!