Organisations are still jumping to the conclusion that they absolutely need a ‘knowledge repository’ to successfully harness employee know-how. While a database (let’s be honest with ourselves, it’s just a database) can be an important part of a knowledge solution, by itself it’s typically an expensive waste of time. This white paper provides an alternative approach in which content generated by subject matter experts (SMEs) creates new social networks, which in turn can provide useful pointers to content held in the ‘knowledge repository.’ People access the database at points recommended by a subject matter expert in the context of the seeker’s current need. It’s a type of social indexing. While the paper uses a sales force as its application area, the solution is widely applicable.
The beauty of the solution is that it relies on simple and inexpensive software (blogs and RSS aggregators). Feel free to contact me if you would like to know more about how you might implement this solution in your organisation, and in particular how you might motivate SMEs to participate.
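To make the mechanics a little more concrete, here is a minimal Python sketch of the social-indexing idea: an aggregator reads an SME’s blog feed and indexes posts by category, so a seeker can follow the expert’s pointers in the context of their current need. The feed content, URLs and category names here are invented for illustration; a real implementation would fetch live feeds with an RSS aggregator.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# A hypothetical RSS feed from an SME's blog. In practice an
# aggregator would fetch this over HTTP rather than hard-code it.
SME_FEED = """<rss version="2.0"><channel>
  <title>Jane's Sales Blog</title>
  <item>
    <title>Closing enterprise deals</title>
    <link>http://example.com/posts/closing-deals</link>
    <category>negotiation</category>
  </item>
  <item>
    <title>Qualifying leads quickly</title>
    <link>http://example.com/posts/qualifying-leads</link>
    <category>prospecting</category>
  </item>
</channel></rss>"""

def build_social_index(feed_xml):
    """Index SME posts by category, so a seeker can find pointers
    into the repository that match their current need."""
    index = defaultdict(list)
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        title = item.findtext("title")
        link = item.findtext("link")
        for cat in item.findall("category"):
            index[cat.text].append((title, link))
    return dict(index)

index = build_social_index(SME_FEED)
print(index["negotiation"])
```

The point of the sketch is that the index is built from what the SMEs choose to write and tag, not from a top-down taxonomy; the ‘repository’ is only ever reached through an expert’s recommendation.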
Managers can apply complexity science as a metaphor to better understand their organisation. Like all metaphors, it is only a partial description and will always break down. For example, you might describe a colleague as a veritable tiger to illustrate his ferociousness, agility and willingness to attack, but he is unlikely to have a long tail and a stripy fur coat.
When managers apply complexity ideas they invariably encounter the concept of ‘attractors’. Unfortunately there is considerable confusion about what is meant by an ‘attractor’, and therefore its usefulness can be diminished.
The confusion arises from the meaning the term ‘attractor’ has for a complexity scientist and its colloquial meaning. For example, if you ask anyone without a background in complexity science, ‘what is an attractor?’ their likely response is: ‘anything that attracts.’ A complexity scientist, however, might say: “an attractor is the pattern which forms from the interaction of many connected entities.” The attractor for a complexity scientist is the result not the cause.
Cohen and Stewart (1995) provide a useful description that illustrates the complexity science view of attractors. Imagine a beach. At one end is a pier and the other is a rocky point. Two ice cream vendors arrive to sell their wares and decide to locate themselves so they are equidistant from the pier, the point and one another. By pure chance, vendor A gets the first group of customers. So as not to miss out on business, vendor B moves a bit closer to vendor A. Now vendor B has customers, so vendor A decides to move closer to vendor B. Over time they creep toward each other until they are both side by side. The resulting cluster is called the attractor. They are not attracted to a particular grain of sand in the middle of the beach. Rather, their interaction results in the attractor pattern forming.
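The vendors’ creep toward one another can be sketched as a toy simulation. The starting positions and step size below are arbitrary illustrative values; the point is that the clustering pattern emerges from the interaction rule, not from any pull exerted by a spot on the beach.

```python
def toward(x, target, step):
    """Move x one step toward target, without overshooting."""
    return min(x + step, target) if x < target else max(x - step, target)

def simulate(pos_a=30.0, pos_b=70.0, step=1.0, rounds=200):
    """Each round, vendor B edges toward A, then A edges toward B."""
    for _ in range(rounds):
        if pos_a == pos_b:
            break  # side by side: the attractor pattern has formed
        pos_b = toward(pos_b, pos_a, step)
        pos_a = toward(pos_a, pos_b, step)
    return pos_a, pos_b

print(simulate())  # → (50.0, 50.0)
```

The meeting point happens to be the middle of the beach only because the stepping here is symmetric; change the rule and the cluster forms elsewhere. The attractor is the clustering itself, the result of the interaction, not its cause.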
From a management practice perspective both views of an attractor are useful and we should avoid being dogmatic about which is right or wrong. Perhaps a way to explain attractors to those people wishing to use this concept is to describe two types of attractor: those that attract a behaviour, such as people, events, rituals and communities (this is how Cynefin describes attractors); and those that emerge from the behaviours of people interacting.
The key point to remember is that, regardless of how we define attractors, they are simply a metaphor to help us better understand how organisations work. Our next challenge is to understand that other often-quoted complexity concept: strange attractors.
Cohen, Jack, and Ian Stewart. 1995. The Collapse of Chaos: Discovering Simplicity in a Complex World. Penguin.
Technorati tags: attractor
If you want to see the power of anecdotes, just check out this site by Andy Hertzfeld. Andy has collected 117 anecdotes that document the development of the Apple Macintosh and categorises each anecdote into topics like software design, marketing, inspiration and celebrities. It provides an excellent example of how narratives can be organised to tell a bigger story. Imagine having a similar repository of anecdotes for your organisation. It is a great way to capture some of that valuable knowledge before its holders retire.
Thanks to Warwick Holder for pointing me to this gem.
Auckland has been a delight. I’ve spent the last 3 days running a workshop called ‘Succeeding in Complexity’. It seems to have been well received. We spent much of our time getting hands-on experience with the Cynefin techniques and listening to how each participant was coming to grips with the new mindset required to address complexity.
I’m off tomorrow morning to Canberra for the wedding of two dear friends (Kathryn and Larry) and then it’s off to South Africa for a few days.
Blogging will be back to normal on my return to Melbourne.
Hi Michael, thanks for your kind comments about my blog. As you’ve probably gathered, my thoughts on monitoring are developing so I appreciate your questions. Take the following comment you make:
One question that comes to mind immediately is an extension of his base assumption that there’s an optimum level and pace of monitoring given a particular context. This suggests in turn that overmonitoring can be as much of a problem as undermonitoring.
Getting the balance right is tricky, and I guess this is why I titled the post ‘The Art of Monitoring’. I don’t think there is an optimum level, and there would be no way to really tell. In the complex domain you are looking for ‘good enough’.
You suggest, by omission, that monitoring doesn’t make sense in the complex domain: “Monitoring certainly makes sense in the known and knowable domains of the Cynefin model, when an organization’s context and activities are reasonably reduced to linear and causal models of behavior.” I hold the view that monitoring is essential in the complex domain for the simple reason that each intervention only makes sense in hindsight and therefore you have to have a look to see what happened. Of course just by having that look you are changing the system.
This group is now 4 months old. If you are in Melbourne and would like to join our monthly meetings to discuss organisational complexity, just sign up as a member here.
Technorati tags: emergence
CIO magazine recently published this article describing the now well-known argument that organisations will lose significant knowledge as baby boomers retire. There is reference to David DeLong’s book, Lost Knowledge: Confronting the Threat of an Aging Workforce, which might be an interesting read. The piece concludes with a couple of ways IT can be used to retain this knowledge, which I must admit seemed like a pretty lame effort. The key suggestions revolved around conducting email interviews and running web-based surveys; both techniques are extremely limited in understanding or transferring what people know. How about coaching, mentoring, narrative capture, communities of practice? Surely these techniques are more suited to transferring, as Dorothy Leonard would say, an organisation’s deep smarts.
Have a look at some of the reader comments. There is an interesting post suggesting that the aging workforce issue is overstated.
Organisations that operate in complexity need a monitoring regime. Combining the following three approaches to monitoring will improve an organisation’s ability to adapt in uncertain circumstances. The three approaches are:
- monitoring at intervals
- monitoring at events
- creating signposts
Monitoring at intervals
Some things change quickly while others move at glacial speed. Consequently, when monitoring a business environment, it is important to look for indicators which represent the spectrum of change speeds. For example, we might track staff blogs regularly for daily perturbations, while only reviewing competitors’ annual reports each year for slow moving trends. Monitoring at intervals means examining the world around us at set, regular periods. These intervals match the rate of change of the phenomena of interest. A monitoring regime, therefore, might schedule daily, weekly, monthly, quarterly and yearly visits to information sources.
A problem with relying solely on this type of environmental scanning is that it assumes that the world about us is predictable and that a phenomenon moving at, say, a weekly cycle does not rapidly change to a daily cycle. Many of us witnessed this speeding up and slowing down of cycles in the dot com boom and bust. The difficulty is knowing when to speed up or slow down the rate of monitoring.
There are two ways to approach the problem of unpredictability in monitoring: implement interventions and monitor the results (monitoring at events); and devise scenarios in order to identify warning signs that indicate that monitoring should accelerate (creating signposts).
Monitoring at events
When implementing an intervention, it is impossible to predict the outcome in detail. Monitoring to detect the patterns that emerge from the intervention enables corrections to be made. An explicit programme of monitoring reinforces the view that interventions in a complex system can never be set and forgotten and increases the mindfulness of the decision-takers and planners—a key requirement in handling complex and unpredictable environments (Weick & Sutcliffe, 2001).
Scenario planning assists the monitoring programme by helping decision-takers and planners identify warning signs. If an intervention takes the organisation close to one of these signposts, the rate of monitoring can be increased and corrective measures taken. The principal difficulty with scenario planning is that the breadth of possibilities is limited by the imagination of those involved in developing the scenarios. It is important, therefore, that the monitoring-of-events approach also helps identify signposts. While it is impossible to cover the entire space of possibility within a complex environment, a systematic method combining these three approaches considerably enhances an organisation’s ability to adapt.
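As a sketch of how the three approaches might combine in software, the following Python fragment schedules sources at intervals, forces a fresh look at every source when an intervention occurs, and accelerates a source’s schedule when a signpost is approached. The source names, intervals and acceleration factor are illustrative assumptions, not prescriptions.

```python
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    interval_days: int          # monitoring at intervals
    last_checked: int = -10**9  # day index of last check (never, initially)

@dataclass
class Regime:
    sources: list

    def due(self, today):
        """Sources whose monitoring interval has elapsed."""
        return [s for s in self.sources
                if today - s.last_checked >= s.interval_days]

    def on_intervention(self):
        """Monitoring at events: an intervention forces a fresh look
        at every source, regardless of schedule."""
        return list(self.sources)

    def on_signpost(self, source_name, factor=2):
        """Creating signposts: nearing a warning sign accelerates
        the monitoring rate for that source."""
        for s in self.sources:
            if s.name == source_name:
                s.interval_days = max(1, s.interval_days // factor)

regime = Regime([Source("staff blogs", 1),
                 Source("competitor annual reports", 365)])
regime.on_signpost("competitor annual reports")
print([s.interval_days for s in regime.sources])  # → [1, 182]
```

The design choice worth noting is that the signpost handler changes the interval rather than triggering a one-off check: once a warning sign is near, the whole cadence for that phenomenon speeds up until the regime is deliberately relaxed again.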
Weick, K. E. & Sutcliffe, K. M. 2001. Managing the Unexpected. San Francisco: Jossey-Bass.
Acknowledgements: I’ve recently become re-aware of the need for monitoring after talking with Dave Snowden, who pointed out that interval monitoring risks missing catastrophic change. My conversations with Bruce McKenzie helped me understand the role scenarios might play in creating signposts.
The last week has been frantic. I’m developing a new course called ‘Succeeding in Complexity,’ which I’m delivering in Auckland next week, and, as always, it takes longer than you think to get everything together. I have put considerable effort into the course notes so participants walk away with more than just a slide pack and vague memories of what they did. Unfortunately, as a result of my efforts, I haven’t had a chance to blog. The good news, however, is that the course development has created a wealth of new material to blog about, which I will be doing in a few weeks’ time.
Darren makes the following comment:
“The potential danger is this could limit thinking rather than expanding it. If people are constantly reminded of their past point of view, could it not encourage many not to move forward, but to reinforce their thinking of old?”
I guess like any tool, in the wrong hands it can be a dangerous weapon—or at least an ineffective one. Darren’s point is a good one. Past ideas should act as a prompter for new ways of thinking given the current circumstances and only regurgitated if the writer is confident it makes sense in the new context.