As Edward R. Tufte published each of his beautiful books on the visual display of complex data – The Visual Display of Quantitative Information (1983)1, Envisioning Information (1990)2 and Visual Explanations (1997)3 – I became more and more fascinated by the question of how to grab the attention of decision-makers and inspire them to act based on the data.
I remember an early success back at the start of the 1990s. I was helping some councillors see the need for a tree-planting program to provide nature corridors across their shire. Before they saw the results of our geographic information systems (GIS) analysis (what we called computerised mapping before Google Maps), the councillors had told themselves the story of how, over the years, they’d invested in planting trees and that’s why their towns had such leafy surrounds. They felt they’d done more than enough. So when we unfurled maps showing tiny and mostly unconnected stands of mature trees amid a vast expanse of paddock, they were more than surprised. After the initial shock and some resistance, they agreed to fund the tree-planting project.
Cognitive psychologist Gary Klein has said that ‘insight is when you unexpectedly come to a better story’.4 These councillors had just had an insight, one that we helped them to have.
Back then, I was unaware of the role stories play in the process of making sense of the data and communicating insights the analyst uncovers. But over the last 15 years, my work in business story techniques and my interest in conveying the results of data analysis have merged, and I now see a strong role for story work beyond just telling the story of the results.
What follows is a framework for how story techniques can help the data analysis process. It is useful for any individual (or group) working with data, whether you’re a scientist, a marketer, an engineer or a policy-maker. The challenge in each case is similar: how do you put yourself in the best position to make sense of a mass of data in order to gain insights, and then inspire people to change based on the discoveries?
We are starting to see things written about the role of stories in data analysis. For example, Brent Dykes over at Forbes5 has written a compelling piece describing the need for data storytelling. He observes that today’s emphasis is on tools and skills for data manipulation and analysis, and predicts a shift towards storytelling as the gap widens between the analysts who discover insights and the decision-makers who need to learn about them in a way that helps them care. Cole Nussbaumer Knaflic’s Storytelling with Data6 also points to the popularity of the idea.
However, Nussbaumer Knaflic’s book is really a guide to data visualisation, with only a single chapter on storytelling. And while Dykes laments the missing link between the analyst and the decision-maker, his article doesn’t take the opportunity to appreciate the wider role stories play in data analysis, beyond just inspiring the decision-maker.
Here, we will explore the three types of story work, the role stories play before, during and after the data analysis, and the various story patterns that could be employed to inspire a decision-maker to take action.
The three types of story work
In addition to storytelling, there are at least two other ways to employ story work. You can elicit stories to find out what is really happening and how people are thinking. I call this story-listening. You can also trigger the telling of a new story by doing something remarkable that others will remark on. I call this story-triggering.
All three forms of story work – storytelling, story-listening and story-triggering – play a role in discovering an insight and influencing a decision-maker to act.
Let’s explore these three types of story work from three perspectives of data analysis: what happens before analysis, what happens during it, and what happens after the insight has been discovered.
BEFORE data analysis
In business, data analysis serves a purpose. The results of an analysis are designed to inform or even inspire decisions – we are not talking about pure research here. And you typically know who’s going to make these decisions. It might be a select number of leaders in a company, or a broader population of people with a specific demographic, such as overweight, 50+ men prone to heart disease. Regardless of the size and shape of the group you plan to influence, it’s useful to get an idea of the stories they already tell, especially the ones they tell themselves.
Long before Jon Snow became famous as the King of the North, a more mild-mannered John Snow, a doctor, lived in London during some of the biggest cholera outbreaks of the mid-19th century.7 Even in the 1850s, London was a massive city, but with one big difference from today: it had no sewage system. London stank. And to make matters worse, although the flushing toilet had just been invented, there was no sewer to flush it into.
So when cholera broke out, it was common practice to blame the spread of the disease on the miasma – the stench – that sat over the city. The thinking was that, as citizens produced more unsanitary smells, cholera would float from one neighbourhood to another. This story went deep. In one survey of that time, only 5% of doctors thought cholera was a waterborne contagion.
The miasma-causes-cholera story was told publicly and pervaded the medical literature in John Snow’s time. But more often than not, the stories that are informing the way people think are hidden. So you need to actively seek out these stories, those that lie under the surface, to know what you are dealing with. I call this story-listening: the active collecting of the stories that are being told in any defined population.
Running anecdote circles is an effective technique for collecting stories. It’s much like running focus groups, but instead of seeking opinions you elicit stories. We wrote a popular guide called The Ultimate Guide to Anecdote Circles,8 which shows you how to prepare and run them.
Getting back to John Snow: to change the minds of the medical profession, and also the policy-makers of Victorian England, he needed to tell a new story with data. His now-famous map showing the cholera deaths in and around Soho’s Broad Street (now called Broadwick Street) did the trick by clearly illustrating the connection between the street’s water pump, the people who drew water from it, and those who perished from cholera.
It would be folly to think that this one map and the associated story Snow told changed the minds of the miasma camp in one fell swoop. The fact is that Snow had been campaigning for years to convince people that cholera was a waterborne disease. That said, his map marked a turning point. It illustrated what can happen when data and story combine, especially when you know the prevailing story you are up against.
DURING data analysis
The act of analysing data involves a constant flow of evolving stories. For example, in your analysis you might discover that in your city, 1000 university graduates got a job within a month of finishing their degrees. But compared with what? What happened to graduates in other cities? You then discover that 2000 people got jobs right out of uni in another city. But what is the relative size of each city? What are the demographics of each population? What’s the employment rate? What industries are employing these graduates?
Each answer fills out the story, either making it stronger or triggering more questions. A good analyst combines the data with imagination, curiosity and experience to conjure new scenarios and see whether the data rules them in or out, or lets the data prompt new possible stories.
Business storytelling specialist Paul Smith, author of Sell with a Story,9 was once a market analyst for Procter & Gamble. The Pampers business called him in to get the data together and run a strategy session for them. Now Paul was familiar with the dominant story in this business: if you want to generate more profit, you need to sell more product.
The data did show this strong correlation, but only up to 1984. After that, there was a marked change in the pattern, with no discernible link between profit and volume. Smith had to start testing a range of alternative stories to explain what had happened. Had the change occurred when competitor Kimberly-Clark launched Huggies? Was it when commodity costs got out of control? Paul chased down each hypothesis and in the end discovered that the change coincided with the market reaching full penetration.
Paul tells the story this way:
“Before we launched disposable diapers in the early ’60s, everyone used cloth diapers. But it’s not like once disposable diapers came out, everybody switched from cloth immediately. It took years for that to happen. In fact, it turns out it took exactly 21 years.”
“By 1983, the market for disposable diapers had essentially reached 100 percent of households with kids who wore diapers, and cloth diapers had almost entirely vanished from the marketplace. Up to that point, everyone making disposable diapers had rapidly growing sales numbers, and the rapidly growing profit numbers to go with them. The cloth diaper makers, of course, were going out of business.”
“What that means is that the disposable diaper business in the United States went from a ‘developing market’ to a ‘mature market’ in 1983. And apparently, we (Procter & Gamble) failed to notice it. We’re still following the same basic ‘sell more’ strategy we’ve been using during the developing market period.”10
Analysis is a battle of stories, in a very Darwinian fashion: the story that best fits the data wins. It’s the analyst’s job to explore the many possible stories that might explain what they are seeing.
The connection between cause and effect, however, doesn’t have to get down to root causes. Marketers have discovered that if you can uncover a reliable correlation, then you can make decisions – I’m sure this sends the scientists nuts, but for a business it can be a practical approach. For example, large retailers collect masses of data around loyalty cards, such as purchases, dates, times, geographies, shopper demographics and so on. An analyst can explore this data for strong correlations, and once they are found, predictions can be drawn.
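To make this concrete, here’s a minimal sketch of correlation-driven prediction. The loyalty-card records and product names are invented for illustration (they merely echo the sort of signals a retailer might use), and the correlation function is written from scratch to stay dependency-free:

```python
# Illustrative toy data only: eight hypothetical loyalty-card records,
# not real retail data. 1 = bought / outcome true, 0 = not.
records = [
    # (bought_lotion, bought_supplements, outcome_of_interest)
    (1, 1, 1), (1, 0, 1), (1, 1, 1), (0, 1, 0),
    (0, 0, 0), (1, 0, 0), (0, 0, 0), (1, 1, 1),
]

def correlation(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

lotion = [r[0] for r in records]
supplements = [r[1] for r in records]
outcome = [r[2] for r in records]

# A strong, stable correlation can be enough to act on,
# even before anyone can tell the causal story behind it.
print(round(correlation(lotion, outcome), 2))       # stronger signal
print(round(correlation(supplements, outcome), 2))  # weaker signal
```

In practice a retailer would combine many such signals in a predictive model, but the principle is the same: find correlations strong enough to act on.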
The following scenario, told by Charles Duhigg in The New York Times,11 is from the retailer Target. A woman aged 23 buys cocoa-butter lotion, a purse large enough to double as a diaper bag, zinc and magnesium supplements, and a bright blue rug. Target can predict she has an 87% chance of being pregnant and will have her baby in five months. With that knowledge, they can then send brochures directly to the woman, encouraging pregnancy-related purchases.
As Target discovers, though, you have to be careful with this knowledge. Their analysis results in a high-school girl receiving advertisements for maternity clothing and nursery furniture. Her father is enraged and complains to a Target manager, who apologises for the obvious mistake. A few days later, the manager calls to apologise again. A little sheepishly, the dad admits he’s had a conversation with his daughter, and she is due in August.
Whether you are telling the story of the root causes during analysis or exploring the possible stories that explain strong correlations, it’s the stories that help the analyst make meaning out of what they discover and see whether the data supports, complicates or refutes the story.
AFTER data analysis
As I’ve said, there’s often a chasm between an analyst’s insight and the decision-makers who need to have this insight. Part of the challenge stems from the analysts’ background disciplines. They are typically steeped in mathematics, statistics and IT, and are more comfortable digging into the data than conversing with decision-makers about what they’ve found. On the flip side, decision-makers often assume that people trained in STEM disciplines are poor communicators. But the more I work with engineers, technologists and the many other flavours of number-cruncher, the more I find that many do just fine communicating their discoveries once they have the basic skills under their belts. And they are keen to learn.
When you add to that the simple fact that we are all storytellers, helping analysts convey their findings in interesting and compelling ways using story techniques becomes a straightforward task.
The first thing you need to have clear in your mind is exactly what we mean when we say ‘story’. This is vital because you won’t get all the benefits of sharing stories unless what you’re sharing is actually a story. I’ve offered a definition in Putting Stories to Work12 and also on our blog here13 and here.14 But in a nutshell, you can tell whether something is a story if it has the following characteristics.
A story begins with a time marker or a place marker. So when you hear someone say, “A couple of days ago…” or “Last year…” or “A while back…”, then it’s likely to be the beginning of a story. It could also start with a place, like, “We were next to the river…”
A story has a series of events connected in a way that infers causality: this happened a couple of days ago, but then that happened, and as a result this happened.
A story has people in it who are doing things. It’s a giveaway when you hear dialogue, as dialogue can only be delivered in a story.
Finally, in a story, something unanticipated happens. When the audience hears it, they are a little surprised. It’s news to them.
With these simple giveaways, you can now spot stories. So what types of stories should you tell?
There are three story types and one story technique you should consider using in data storytelling.
The data story
When you have a time series and the data does something unanticipated, then you can tell a story about it – a data story.
Here’s an example. Through the 1920s and 1930s, deaths from heart disease in Norway steadily rose.15 Then in 1940 they plummeted and stayed low until 1945, after which they quickly began to rise again. Why would that happen?
Well, in 1940 the Nazis occupied Norway and confiscated its livestock, forcing the Norwegians to live on a largely plant-based diet for the duration of World War II. This diet reversed the rise in deaths from heart disease. When the war ended in 1945, livestock returned, meat and dairy were added back into the Norwegian diet, and heart disease came back.
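This is the kind of unanticipated move in a time series that an analyst can surface almost mechanically. As a minimal sketch (with invented figures, not the actual Norwegian mortality data), one might flag any year whose value jumps sharply relative to the year before:

```python
# Hypothetical yearly deaths per 100,000 (illustrative numbers only,
# not the actual Norwegian mortality figures).
series = {
    1936: 290, 1937: 300, 1938: 310, 1939: 315,
    1940: 240, 1941: 200, 1942: 195, 1943: 190,
    1944: 195, 1945: 200, 1946: 260, 1947: 300,
}

def abrupt_shifts(series, threshold=0.15):
    """Flag years where the value changes by more than `threshold`
    (as a fraction) relative to the previous year."""
    years = sorted(series)
    flagged = []
    for prev, curr in zip(years, years[1:]):
        change = (series[curr] - series[prev]) / series[prev]
        if abs(change) > threshold:
            flagged.append((curr, round(change, 2)))
    return flagged

# Each flagged year is a candidate plot point in a data story.
print(abrupt_shifts(series))
```

The flagged years are where the data does something unanticipated; the analyst’s job is then to find the story that explains them.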
The story about Procter & Gamble and its Pampers strategy is also a data story.
The data story typically has this basic structure:
- In the past…
- Then something happened…
- As a result…
One of the ways to present a data story is to share a high-level version of the narrative and then ask the audience what they think is happening. This is like presenting the audience with a mystery to be solved and asking them to be the detectives (we love mystery stories16). When they come to the right answer, you can show them the full data story. Now they own the results – they have had some involvement in working it out.17
A data story is vulnerable in one way: it can be usurped by a better story. For example, in relation to the Norway story, what if scientists discovered that when humans are under extreme stress, we produce a chemical in our blood that reduces the likelihood of heart failure? Then the story could become something like the following.
Before the war, Norwegians were a relatively relaxed population, and the incidence of heart disease increased on a par with other Western cultures. But when the war began, Norwegians’ stress levels went through the roof. Their bodies produced heaps of chemical X, and heart disease almost disappeared. But when the war and its stresses ended, the Norwegians resumed their old stress-free lives, and rates of heart disease climbed again.
OK, it’s perhaps not the most compelling alternative story, but you get my meaning. A key story principle is that you can’t beat a story with facts. You can only beat it with a better story.
Stories of the past are often overtaken by new discoveries. The head of IBM who reputedly predicted the world would only ever need a handful of computers clearly couldn’t have imagined the scale of future technology.
The explanation story
When your analysis is not a time series, your story can explain your insight. John Snow’s cholera map is a good example. On its own, without a story, the map lacks meaning. With one, you could say it’s likely that on 28 August 1854, the Broad Street water pump became contaminated when the cesspool for the block overflowed through broken brickwork. Households on Broad Street fell first, and as people carried water away from the pump, the disease and the resulting deaths fanned out in a radial pattern. Interestingly, there were no deaths at the nearby brewery: everyone there drank beer to hydrate, and the brewery drew water from its own well. The few other unaffected households in the area were discovered to prefer the water from a pump further afield, one untouched by the contamination.
John Snow wrote extensively on how the disease spread in London and explained all the anomalies he found in the data. His story was a compelling account and, as I said earlier, it changed public health policy in England. As Steven Johnson, author of The Ghost Map18 and my source for John Snow’s epic data storytelling, has noted: “It was going to take more than body counts to prove that the pump was the culprit behind the Broad Street epidemic. Snow was going to need footprints too.”
The discovery story
Sometimes you have to explain how a discovery was made for the audience to both appreciate the insight and understand how much work went into having it.
Google recently completed a comprehensive study19 into what makes a team productive.
They concluded there were five factors:
1. Psychological safety – people can and do speak up
2. Dependability – you do what you promise to do
3. Structure and clarity – goals are clear and the process for getting there is known
4. Meaning of work – everyone is here for more than just a paycheque
5. Impact of work – the team can see how their contribution makes a difference.
Those are the facts of the study. But to really appreciate the work that went into it, and the twisting and turning the researchers did to arrive at those five factors, you need to hear the story of how the insight was discovered. The discovery story adds meaning.
New York Times reporter Charles Duhigg does a wonderful job of telling this story. Of course, you wouldn’t relate all the detail in Duhigg’s article20 in an oral telling of the story, but the plot points provide a handy guide to what you might cover. Here are some of the things I would tell before sharing the results of the Google study.
Back in 2012, Google kicked off a study codenamed Project Aristotle to understand what makes a great team. The research team started by reviewing a stack of academic literature on teams, then applied what they found to 180 Google teams, but they couldn’t find any patterns. Also, in the past, Google had thought that putting the best people together would simply allow magic to happen. But the researchers’ initial investigation showed that ‘who’ was on the team wasn’t the determining factor regarding performance.
The researchers then started searching the data for anything on group norms: those things that a group does that denote its habits, its patterns of behaviour, its culture. This avenue of inquiry explained the patterns of performance better than the characteristics of the team members.
Then the team uncovered the idea of psychological safety in the literature, and it was as if everything fell into place. The patterns became clearer, and five factors emerged as having the biggest impact on team performance…
Now you would share the five factors that affect team performance.
This story would then lead to the next obvious question, which is how to get these norms established in your team. Again, you might tell the discovery stories of the trials and errors that led to very practical approaches. Or you could tell the story of how one manager did something so simple, yet it had a significant impact on psychological safety.
People want to hear how you made a discovery. It adds plausibility to your results. If there isn’t an obvious story in the data, then tell the story of how you discovered the insight.
Story-triggering
The last approach is story-triggering: doing something so remarkable that others can’t help but retell it. In the early 1980s, Australian doctor Barry Marshall was researching stomach ulcers. Back then, the dominant story about stomach ulcers was that they were caused by stress. The crazy lives of executives caused more acid to be in their stomachs, which resulted in ulcers. Marshall had discovered through his research that ulcers were actually caused by bacteria. But when he presented his results, no-one believed him. He needed to trigger a new story that was so remarkable, it would replace the dominant story.
Marshall took the radical step of brewing a batch of the ulcer-causing Helicobacter pylori bacteria and infecting himself with it. Over the following days, as he became incredibly ill, Marshall tested himself for ulcers, which were confirmed in abundance. Then he drank his antibiotic antidote and the ulcers disappeared. The media found out and reported his findings under the headline: ‘Guinea-Pig Doctor Discovers New Cure for Ulcers’. Medical opinion changed overnight, and in 2005 Marshall, together with his collaborator Robin Warren, won the Nobel Prize in Physiology or Medicine for this groundbreaking work.
If you find something remarkable in your data which goes against the grain of what’s regarded as deeply true, then test your results by doing something so remarkable that it will trigger a story. That said, please don’t drink a flask of poisonous chemicals!
It’s true that storytelling can help bridge the gap between discovering an insight in the data and influencing decision-makers to use that insight to take action. But we are leaving so much on the table by just thinking of story work in these narrow terms. Story is fundamental to all parts of the data analysis process.
Hopefully, we will give more thought to the role story plays before, during and after data analysis. And when stories are told about the data, we will look at the multitude of ways in which they can be told, and the most effective story patterns to employ to have the greatest impact.
Anecdote offers defined processes to make your strategy stick, training programs to build your story skills and a process to ensure stories are shared regularly. Interested in finding out more? Here are the details on what we do.
1The Visual Display of Quantitative Information http://amzn.to/29Ks9V9
2Envisioning Information http://amzn.to/29W9dWy
3Visual Explanations http://amzn.to/29W81Ts
4Klein, G. A. (2013). Seeing what others don’t: the remarkable ways we gain insights. New York, PublicAffairs.
6Storytelling with Data http://amzn.to/2aorpHu
7The facts presented here about John Snow and the cholera epidemic of 1854 are drawn from Steven Johnson’s terrific 2007 book, The Ghost Map: A Street, an Epidemic and the Hidden Power of Urban Networks (Penguin Books).
8The Ultimate Guide to Anecdote Circles www.anecdote.com/pdfs/papers/Ultimate_Guide_to_ACs_v1.0.pdf
9Paul Smith (2016). Sell with a Story: How to Capture Attention, Build Trust, and Close the Sale, AMACOM http://amzn.to/2aEsSbV
10Paul Smith (2016), Sell with a Story: How to Capture Attention, Build Trust, and Close the Sale, AMACOM
11Charles Duhigg in The New York Times www.nytimes.com/2012/02/19/magazine/shopping-habits.html
12Putting Stories to Work http://amzn.to/29Y4Mrm
15A. Strøm and R. A. Jensen (1951), ‘Mortality from Circulatory Diseases in Norway 1940–1945’, The Lancet, 257(6647): 126–9.
16Mystery stories www.anecdote.com/2008/04/why-should-we-care-about-mystery-stories/
17My friend and fellow story practitioner Paul Smith pointed out this approach to me.
18The Ghost Map http://amzn.to/2aoyg3I
19Comprehensive study https://rework.withgoogle.com/blog/five-keys-to-a-successful-google-team/
About Shawn Callahan
Shawn, author of Putting Stories to Work, is one of the world's leading business storytelling consultants. He helps executive teams find and tell the story of their strategy. When he is not working on strategy communication, Shawn is helping leaders find and tell business stories to engage, to influence and to inspire. Shawn works with Global 1000 companies including Shell, IBM, SAP, Bayer, Microsoft & Danone.