Sensemaking is the way in which people understand the world at a large scale: how they decide what kinds of goals are reasonable to pursue and what kinds of strategies are worth trying or using. Sensemaking is related to what the military call situational awareness.

One of the best ways I’ve come across for thinking about sensemaking is the Cynefin framework, developed by Kurtz and Snowden.

They suggest that it’s helpful to think of the world, or systems, as being of five kinds:

  1. Known systems, where cause and effect are clearly understood, and the “rules” are stable and widely shared.
  2. Knowable systems, where causes and effects are separated in time and space, but analysis, systems thinking, scenario planning, and other mental tools can produce some level of understanding.
  3. Complex systems, where causes and effects only make sense with hindsight, and patterns perceived in advance are seductive but misleading.
  4. Chaotic systems, where there is no discoverable relationship between cause and effect.
  5. Disordered systems, where it is not clear which of the other four kinds applies.

Most adversarial situations, and many others, are complex systems. The danger is that they get treated as knowable (complicated) systems.

Kurtz and Snowden give two great examples of the problems caused by confusing the two. A group of West Point graduates was asked to manage a kindergarten. They developed objectives, made plans, and identified backup plans, and the whole thing was a total disaster.

In another case, marines were taken to Wall Street and played against traders using trading simulators. Unsurprisingly, the traders won. Then the traders were taken to Quantico and played war games against the marines. Again, the traders won!

In both cases, the situations are complex, so rational planning is not the way to approach them. A better approach is to look for aspects of the situation that seem positive and find ways to enhance or reward them, and to look for aspects that seem negative and discourage them. The traders did better than the marines because they were able to see and exploit transient opportunities. Experienced kindergarten teachers don’t run their classrooms from a carefully worked-out plan either.

This approach of enhancing the positive and discouraging the negative is characteristic of complex systems, where the Law of Unintended Consequences rules. Such systems can be nudged in a good direction, but only nudged, not pushed wholesale. If a nudge turns out to make things worse, it’s easy to back off, and perhaps to reverse it.

Insurgencies are complex systems, and so are terrorist attacks, and it’s both dangerous and fairly useless to try to defend against them using techniques appropriate to knowable systems.

Knowable-system approaches still tend to creep in. Here’s a quotation from U.S. Army Field Manual FM 3-24, Counterinsurgency (the famous Petraeus manual used in Iraq):

The underlying premise is this: when participants achieve a level of understanding such that the situation no longer appears complex, they can exercise logic and intuition effectively.

(Of course, “complex” here is not being used in the Cynefin sense.) Much of the language in paras 4-2 to 4-13 carries an implicit assumption that counterinsurgency situations can be understood by rational planning, given enough time, effort, and thought.

In insurgencies and terrorism, it’s probably not possible to reach such an understanding, and perhaps foolhardy to try. It seems helpful to be able to distinguish between these two kinds of situations, which look superficially similar but are hugely different underneath.

(There are many other interesting ideas in the Cynefin framework. A good starting place is Kurtz and Snowden’s paper “The New Dynamics of Strategy: Sense-making in a Complex and Complicated World”.)


