Evaluating expertise

Posted by Shawn Callahan — June 9, 2006
Filed in Collaboration

One of our newsletter readers asks the following question in relation to our article on expertise location:

“do you have any comments or references on how to go about the evaluation of this expertise?”

Building a group’s capability to assess the validity of advice, ideas and suggestions is an important skill. The first step is awareness. How often do you hear people, especially management gurus, say “my research suggests …” and then never actually reference the research? This happened this week on ActKM, where a poster said:

Research indicates a 2/3 reduction of time from traditional face-to-face classroom approach and even a reduction in time from more traditional self-paced approaches, but with a 2 standard deviation improvement in learning outcomes.

So I asked for the reference. The poster said he didn’t have access to the reports but thought it would be a good idea to cite references, and then omitted the citations anyway!

I asked for the names of the reports, but received no reply. Another member of the community supported my call to cite references in these cases, but after that, total silence. The community is tacitly giving permission to this behaviour.

This is an example of a community failing to build the capacity to recognise flawed advice. Following are some ideas on the skills communities could develop that would help the entire group be more discerning.

I remembered two useful references when I was asked about evaluating expertise: Chris Argyris’ book Flawed Advice, and Bob Sutton’s essay called Management Advice: Which 90% is Crap? But before I remind us of some of Chris and Bob’s suggestions on evaluating expertise, let me describe some of the things I’ve learnt from experience (probably just common sense).

  • Listen for stories. Without stories, advice and expertise remain abstract and devoid of experience. Become aware of the richness of the stories—how detailed are they? Do they include facts?
  • Are they ever wrong? I’m suspicious of people who purport to have all the answers and have never made a mistake in their life.
  • Can they see what’s missing? Deep expertise is not just the ability to see what’s happening and make suggestions for improvement, but also the ability to see what’s missing and to know what to leave out. This idea was introduced to me in Gary Klein’s book on intuition.
  • Simple, clear language. If you really understand what you are talking about, you should be able to convey your ideas simply, clearly and concisely.
  • Triangulate the expertise with your social networks. Jim tells me that Martha knows her stuff; Anne tells me that Martha is top notch; but Martha doesn’t keep telling me how wonderful she is. My confidence in Martha is high.
  • An expert in one field is not an expert in everything. There is a well-known psychological pattern: if people believe a person is an expert in one field, such as corporate strategy, they are inclined to believe that person is also an expert in similar fields, such as mergers and acquisitions.

Chris Argyris’ suggestion is to listen for advice which is “… illustrated, encourage inquiry, and are easily tested.” On the other hand, be wary of advice “that include little or no illustration, inquiry, or testing” and where defensive reasoning dominates. The problem with this suggestion, as I see it, is that much advice is not easily tested, or takes considerable time to test. For example, based simply on our own experience, we have been saying that anecdote circles are an excellent method to elicit stories and that they create a positive and trusting experience. It was not until this year, when we asked each participant who attended an anecdote circle to evaluate the experience, that we were able to put this assertion to the test. By the way, we are presenting our results at KM Asia.

I’ve written a post about Bob Sutton’s suggestions for testing management advice here. I call his suggestions heuristics for bullshit detection (please excuse the vulgarity of the phrase; it is quite a common term in Australia, and I believe Australians are great bullshit detectors).

Thanks to Nancy White for a conversation that helped me remember some useful ideas.

 


Comments

  1. Are they an Expert? – Evaluating Expertise

    So you’re dealing with someone who’s an expert and they gladly volunteer to talk the talk, but can they walk the walk? In this age of having so many professionals and experts on about every subject it’s extremely important to be able…

  2. I like this list of heuristics very much. Not all “true” experts manage the simple and clear language criterion, but it’s certainly a good indicator of useful/communicable expertise. The “what’s missing?” heuristic strikes me as the hardest one to test – do you have any suggestions for that?
    And can I add one more? “Their learning process is visible” – i.e. they listen, reflect, and test perceptions with their audience in a very transparent way.

  3. Thanks for your comments, Patrick. I think the ‘what’s missing’ heuristic is simply tested when an ‘expert’ (by the way, this is an overblown term) says: you know what’s missing here … If your ‘expert’ can only discuss what’s already ‘on the table’, then I would be wary of the expertise.
    I like your addition. Why do you think that many experts are the exception rather than the rule on this one?

  4. Ahh… got it… so experts are a little bit like good fortune tellers or Sherlock Holmes… they can tell a great deal from very little – i.e. they can make accurate inferences and home in on salient issues very quickly.
    I think there are several reasons why experts might not make their learning visible:
    * they might not think they are experts
    * they might come from a background or culture where experts are supposed to know everything already
    * they might not be experts but want to look like one
    * they might be “high” experts who simply operate on a completely different level and are difficult to interpret or figure out.
    What I liked about your post was the very pragmatic way it gave indicators for expertise that can be useful inside an organisation.

