Tom Davenport, the distinguished professor, author and co-founder of the International Institute of Analytics, argues, “Whenever I speak with successful analytics people—and I do that all the time—it’s usually not long before they mention the phrase ‘telling a story with data’.”
Data storytelling is the capability du jour among data discovery and visualization vendors as well. It’s no surprise why: these vendors are seeking greater adoption of tools designed for a non-technical user base.
Davenport explains the effectiveness of telling the stories in your data:
Stories have always been effective tools to transmit human experience; those that involve data and analysis are just relatively recent versions of them. Narrative is the way we simplify and make sense of a complex world. It supplies context, insight, interpretation—all the things that make data meaningful and analytics more relevant and interesting.
Transforming data into something meaningful, relevant and interesting? Sounds great. So why are individuals and organizations so bad at telling stories with data? Why are people still wrangling with spreadsheets, interpreting bar charts and manually annotating PowerPoint presentations?
Where’s the story here?
The confusion lies in the definition of data storytelling, in understanding how it can be operationalized across the enterprise, and in the preconceived barriers that keep this capability from becoming a reality.
Let’s dispel the 4 myths:
1. “Our dashboard does data storytelling!”
Make no mistake—a visualization can be a powerful way to display information, uncovering anomalies, patterns and other insights not easily seen in the data alone. But is a dashboard a story? Return to Davenport’s definition of “supplying context, insight and interpretation.” A dashboard may contain one or two of those elements, but only in narrative form can the user obtain all three.
What about a dashboard with an accompanying story? A story that is dynamic, changing as the user continues to drill down in the visualization, and offering explanations and deeper insights with each iteration. Now that’s a story I want to read.
2. “Generating stories from data? No problem. Our IT team can just build them.”
When some technologists see text being generated from data, they may think, “I could do that.” And by building tools that perform basic data-to-text translation driven by pure business logic, they may be able to, although it would take a significant amount of time and resources. But are snippets of text that populate pre-defined templates really a story?
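To make the distinction concrete, here is a hypothetical sketch of that template-filling approach; every name and threshold below is invented for illustration:

```python
# A hypothetical sketch of template-driven "data to text": business rules
# pick a template and fill in the numbers. All names and thresholds here
# are invented for illustration.

def sales_snippet(region: str, sales: float, target: float) -> str:
    pct = sales / target * 100
    if sales >= target:
        template = "{region} exceeded its target, reaching {pct:.0f}% of goal."
    else:
        template = "{region} fell short of its target at {pct:.0f}% of goal."
    return template.format(region=region, pct=pct)

print(sales_snippet("Northeast", 1_200_000, 1_000_000))
# Each snippet restates a number, but offers no context, comparison,
# or interpretation -- the elements that turn data into a narrative.
```

The snippet is accurate as far as it goes, but it can only say what a rule anticipated in advance, which is exactly why template output falls short of a story.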
Advanced NLG platforms like Quill, however, transform data into narratives driven by the purpose of a particular communication. Quill highlights what is most interesting and important in the data, and does so at tremendous scale, generating countless personalized stories on demand. Powered by artificial intelligence, these stories are indistinguishable from what a human would write.
Leave the true storytelling capabilities to the intelligent system, so you can focus your efforts on the story’s outcomes.
3. “We have too much data. Big Data! Our data analysts need to centralize the data first before we are ready.”
The “data first” argument sounds reasonable, but it is quickly becoming irrelevant: “Big Data” is just becoming, well, data, and the idea that data must first be centralized in order to extract insights is becoming obsolete.
Instead of making further infrastructure investments to manage all of your data, and then asking your data, “What secrets can you tell me?”, you need to return to the business question at hand: “What is driving sales performance?”, “Why are we not meeting inventory goals?”, “What is contributing to the uptick in fraudulent activity?”
After determining the goal of your analysis, you should then pull in the necessary data to answer that question. Sounds simple, but it's truly about the story first and foremost. Everything else comes after.
Our patented methodology starts by understanding the user’s communication goals and the metrics necessary to meet those goals, performs the relevant analysis to highlight what is most interesting and important, and identifies the required data needed to generate the narrative specific to the intended audience.
Notice: data extraction is the last step, not the first.
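That goal-first ordering can be sketched as a minimal outline in Python. This is only an illustrative sketch of the sequence described above, not the actual patented methodology; every function name and placeholder here is invented:

```python
# An invented outline of a goal-first narrative pipeline. This is NOT the
# patented methodology itself -- only a sketch of the ordering described
# above, with placeholder implementations throughout.

def understand_goal(question: str, audience: str) -> dict:
    """1. Start from the communication goal and its audience, not the data."""
    return {"question": question, "audience": audience}

def choose_metrics(goal: dict) -> list:
    """2. Pick the metrics needed to meet that goal (placeholder choice)."""
    return ["sales", "quota attainment"]

def plan_analysis(metrics: list) -> list:
    """3. Decide which analyses will surface what is interesting and important."""
    return [f"trend of {m}" for m in metrics]

def extract_data(analyses: list) -> dict:
    """4. Only now pull the data those analyses require (placeholder fetch)."""
    return {analysis: [] for analysis in analyses}

def generate_narrative(question: str, audience: str) -> str:
    goal = understand_goal(question, audience)
    metrics = choose_metrics(goal)
    analyses = plan_analysis(metrics)
    data = extract_data(analyses)  # data extraction is the last step
    return f"Story for the {goal['audience']}, answering: {goal['question']}"

print(generate_narrative("What is driving sales performance?", "district manager"))
```

The point of the sketch is the call order: the question and audience drive everything, and the data fetch falls out of the analysis plan rather than preceding it.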
4. “It’s too time-consuming for me to tell every story that needs to be told about our business.”
Davenport speaks to the time-consuming nature of manual analysis:
It takes a lot of analysts’ time to think creatively about how to tell a good story with data. In fact, one senior analyst at a pharmaceutical company told me that he (and most members of his analytics group) spend about half their time thinking about how best to communicate their analytical results.
Many analysts will be reluctant to devote that much time to the issue, even if it would make them more effective.
In addition, a story is only powerful if it is relevant to the person reading it, and that requires personalization. But how do you scale personalization?
Let’s return to that sales question above: “What is driving sales performance?” Quill can immediately generate custom narratives for the director of sales, the district manager, and the individual salesperson.
Each story is different, and each highlights what is most relevant to that person. By automatically transforming data into narratives, Quill dramatically reduces the time and energy spent communicating data insights to others.
Now you know the real story behind data storytelling.