Imagine that you’re responsible for communicating your company’s new strategy, and just as you’re about to do so, you hear something worrying on the grapevine. A lot of employees believe the strategy was created by the new CEO going home one night, digging up an old strategy from his previous company, then making copies of it and plastering them across the executive suites of your company’s headquarters.
Now, you know this is not how it happened. The executive team went through a well-thought-out process to develop the strategy. So your instinct is to refute these claims. You want to set people right with the facts. This thinking, however, is just leading you into a trap. To simply negate an untrue story only serves to reinforce the misinformation.
A much more effective way of countering a misleading story is with another story. In fact, this is one of the few approaches that really work. We’ll come to this shortly. But let’s start with how researchers discovered this enduring feature of stories.
Correcting misinformation with a better story
In 1994, Hollyn Johnson and Colleen Seifert conducted one of the first experiments to show how stories can have a big impact on correcting misinformation. Their experiment involved a report into a suspicious fire. Two groups were sent a series of messages simulating how a warehouse fire might be reported in real time. The first group learned that the firefighters had traced the fire to a short circuit next to a closet, inside which had been volatile materials such as oil-based paints and pressurised gas bottles. In a follow-up message, the group was then told that there had been a mistake: there had not been any volatile materials in the closet, and the group was instructed to simply ignore that misinformation.
The second group was also told about the short circuit and the closet full of volatile material, and then received a message that this information was incorrect and to just ignore it. But they were also given an alternative explanation for what might have happened. They were told that, rather than paints and gas bottles, the closet had in fact held petrol-soaked rags and empty drums, suggesting that arson might have taken place. This group now had a new story to explain the fire.
Both groups were subsequently questioned about their understanding of what had happened. When the first group was asked ‘Why did the fire spread so quickly?’, their response was that ‘the paints and gas bottles must have exploded and accelerated the fire’. Evidently, despite being instructed to, they hadn’t struck that misinformation from their minds. The second group, however, responded to the question by suggesting arson, having discarded the misinformation and instead embraced the new story about the petrol-soaked rags.
Clearly, the second group, which had heard the plausible story that implied arson, was much less influenced by the original misinformation than the first group, which had simply been told that a mistake had been made. This study demonstrated that it is very difficult, if not impossible, to beat a story with just facts. What you need is a better story.
It’s understandable why stories are so robust and resistant to change. When we hear a story, we take in a complete and internally coherent bundle of information. But if you cut out even one small bit, it’s no longer complete and coherent. Without a replacement story at hand, we revert to the original version – even when, as logical and rational decision makers, we know that part of the story we are relying on to make decisions is untrue.
The power of a plausible story
I’ve felt the effects of clinging onto a story despite knowing an important part of it was false. It happened recently on my first visit to Washington, DC. I was staying at the Willard Hotel, a grand and historic place at the top of Pennsylvania Avenue. My friend and fellow story practitioner Paul Costello swung by to guide me around the monuments of the National Mall, but we started our tour in the lobby of the Willard Hotel.
‘Back in the 1870s’, Paul began, ‘the White House wasn’t the most comfortable place for President Ulysses S. Grant to relax, so he would often unwind with a whiskey and a cigar in the lobby of the Willard Hotel. Word soon got around that the President could often be found in the hotel’s foyer, and people began to come here to get Grant’s ear or seek favours. After a time, these people became known as lobbyists.’
‘Wow’, I said. ‘What a great way for that word to come about.’
Then Paul said, ‘It’s just a myth. The term originated from the gatherings of members and peers in the lobbies of the British Houses of Parliament’.
It’s such a good story, though. I can picture President Grant with his hand around a cut-glass tumbler, smoke billowing from his cigar as he sits in a far corner of the Willard’s lobby with a gaggle of people around him. Whenever I retell it, I have to fight hard to include the fact that it’s a myth.
So we’ve established that when faced with misinformation, you shouldn’t merely deny it. Instead of just saying that it’s not true, you need to tell your own story about what happened.
How to make your alternative story stick
When someone listens to your alternative story, they evaluate its truthfulness from a number of different perspectives, asking themselves:
- Is what you are saying compatible with the other things I know?
- Is your story coherent and plausible? Or does it sound like bullshit?
- Is the source credible?
- Do others believe this?
By addressing each of these questions, you will give your version of events the best chance of survival. Remember that the best story will win.
Start by getting to know your audience and understanding what they already know. Your alternative story will be much stronger if it explains how the misinformation occurred in the first place.
Your story also needs to flow and be relatable. In other words, it not only has to be true, it needs to ring true as well. If your story relies on divine intervention or magical incantations, then you’re in trouble. Credibility will come from who’s telling the story and the characters in it. Your audience will be asking themselves: Can it be verified? Are there witnesses willing to testify to this?
We often take our cue about what to do in an uncertain situation from what others are doing. If we walk into a room and everyone is standing up, we will remain standing. Professor Robert Cialdini, a psychologist who’s an expert on influence, calls this social proof. The same goes for what people believe. The more people who believe your story, the greater the chances that others will too.
And because stories are so resistant to change, the first story told really counts. So if you see misinformation brewing, don’t wait until the anti-story has taken hold. Tell your story first, regardless of how tough that might be.
Finally, remember that people prefer simple explanations over complex ones. Don’t tell an overly involved story to correct misinformation.
We will always be confronted with misinformation, half-truths, even barefaced lies. Sometimes we just need to ignore them. But when we do need to counter misinformation, stories are our most powerful ally.
How would all this apply to the example given at the start of this post, about the misinformation around a new company strategy?
Well, if you found yourself in this situation, you would first gather the members of the executive team and make it clear that they need to tell the story of what actually happened, not just negate what people are saying. You would do this orally at company meetings as well as in smaller gatherings, beginning with an acknowledgement of what people are thinking: ‘You might be thinking that Bob just picked up an old strategy and recycled it. In fact, back in the first week of May, the ELT held the first of four sessions to work out the new strategy. Then we tested it with each of the division heads at the end of May. What you see here is the result of that process’.
Persist, and the actual story will soon stick.
Johnson, H. M. and C. M. Seifert (1994). ‘Sources of the continued influence effect: When misinformation in memory affects later inferences.’ Journal of Experimental Psychology: Learning, Memory, and Cognition, 20(6): 1420–36.
 Wilkes, A. L. and M. Leatherbarrow (1988). ‘Editing episodic memory following the identification of error.’ The Quarterly Journal of Experimental Psychology, ‘Section A: Human Experimental Psychology’, 40(2): 361–87.
 Seifert, C. M. (2002). ‘The continued influence of misinformation in memory: What makes a correction effective?’ Psychology of Learning and Motivation, 41: 265–92.
 Cialdini, R. B. (1993). Influence: The Psychology of Persuasion. New York, Quill Publishers.
 Lombrozo, T. (2007). ‘Simplicity and probability in causal explanation.’ Cognitive Psychology, 55(3): 232–57.
About Shawn Callahan
Shawn, author of Putting Stories to Work, is one of the world's leading business storytelling consultants. He helps executive teams find and tell the story of their strategy. When he is not working on strategy communication, Shawn is helping leaders find and tell business stories to engage, to influence and to inspire. Shawn works with Global 1000 companies including Shell, IBM, SAP, Bayer, Microsoft & Danone.