One of our newsletter readers asks the following question in relation to our article on expertise location:
“do you have any comments or references on how to go about the evaluation of this expertise?”
Building a group’s capability to assess the validity of advice, ideas and suggestions is an important skill. The first step is awareness. How often do you hear people, especially management gurus, say “my research suggests …” and then never actually cite the research? This happened this week in ActKM, where a poster said:
Research indicates a 2/3 reduction of time from traditional face-to-face classroom approach and even a reduction in time from more traditional self-paced approaches, but with a 2 standard deviation improvement in learning outcomes.
So I asked for the reference. The poster replied that he didn’t have access to the reports but agreed it would be a good idea to cite references, and then still omitted the citations!
I asked for the names of the reports, but received no reply. Another member of the community supported my suggestion of citing references in these cases, but after that, total silence. The community is tacitly giving permission to this behaviour.
This is an example of a community failing to build the capacity to recognise flawed advice. Following are some ideas on skills communities could develop that would help the entire group be more discerning.
I remembered two useful references when I was asked about evaluating expertise: Chris Argyris’ book Flawed Advice, and Bob Sutton’s essay called Management Advice: Which 90% is Crap? But before I remind us of some of Chris and Bob’s suggestions on evaluating expertise let me describe some of the things I’ve learnt from experience (probably just common sense).
Chris Argyris’ suggestion is to listen for advice which is “… illustrated, encourage inquiry, and are easily tested.” On the other hand, be wary of advice “that include little or no illustration, inquiry, or testing” and where defensive reasoning dominates. The problem with this suggestion, as I see it, is that much advice is not easily tested or takes considerable time to test. For example, we have been saying, based simply on our experience, that anecdote circles are an excellent method to elicit stories and that they create a positive and trusting experience. It was not until this year, when we had each participant who attended an anecdote circle provide an evaluation of the experience, that we were able to put this assertion to the test. BTW, we are presenting our results at KM Asia.
I’ve written a post about Bob Sutton’s suggestions to test management advice here. I call his suggestions heuristics for bullshit detection (please excuse the vulgarity of the phrase; it is quite a common term in Australia, and I believe Australians are great bullshit detectors).
Thanks to Nancy White for a conversation that helped me remember some useful ideas.
About Shawn Callahan
Shawn, author of Putting Stories to Work, is one of the world's leading business storytelling consultants. He helps executive teams find and tell the story of their strategy. When he is not working on strategy communication, Shawn is helping leaders find and tell business stories to engage, to influence and to inspire. Shawn works with Global 1000 companies including Shell, IBM, SAP, Bayer, Microsoft & Danone.