I spent the first part of my career consulting for companies that wanted to extract actionable information from their data. As a predictive modeling lead and data analyst, most of my time was dedicated to working in our SQL warehouse, which housed billions of transactional records for call centers. Our customers wanted to know how they could solve and improve various business problems like customer attrition, fraud, and sales conversions. I led a team that created predictive models we thought would meet these analytic needs. We considered sample sizes, p-values, whether to use logistic regression, and so many other things I had learned through statistics and data mining courses in school.
We thought we had solved various problems on the data side: data sanitization and normalization, data modeling and prediction, automating the calculation of probabilities based on our models for each incoming transaction. However, that was only a portion of my role. As a business analyst, I was charged with explaining what these predictive models meant to our customers. Based on the data, how should call center managers coach their representatives to improve sales performance? Which drivers were most indicative of churn? When should they do outreach to mitigate a bad series of customer service experiences?
So it was off to create 20 slides with various slices of data, charts, and sparse bullet points, coupled with a 2-hour meeting to review recommendations for adopting business process changes based on data. What I encountered, time and time again, was that reviewing these slides with customers was a melancholy experience. They didn’t care about the SQL stored procedures we made. Nor did they want to know why this thing called a t-test was important or why it mattered that their data follows a binomial distribution. In reality, I spent 80% of those 2-hour meetings trying to communicate what mattered to them most – how to apply our findings to their daily operations and make better decisions based on our analysis. At the time, I did my best to explain concepts verbally, relying on tables of numbers rather than the business insights themselves as the facts. In time, I learned that the most effective communication is written language along with some visuals – rather than slides and a voice-over.
I picture how different those experiences would have been had I had access to our Artificial Intelligence platform, Quill. With Quill, I could have given these predictive models a better voice. I could have codified language that personalized each call center manager’s coaching tips for the month. I could have sent alerts to representatives notifying them that an incoming customer’s call was “the one” to try to close, and even given them talking points personalized to their caller’s information.
The concept of using Artificial Intelligence to improve the call center customer experience has been steadily gaining adoption. IBM’s Watson was recently trained(1) to answer customer inquiries and provide callers with purchasing decision advice. Likewise, a recent article from Virtual Agent Chat(2) contemplates an experience built around conversational customer service. These applications refocus the problem on the consumers themselves, empowering them to interact directly with the technology to learn what they need to know.
In that same vein, Quill would have helped me bridge the gap between hundreds of thousands of calls and the call center managers who needed to make sense of their data so, in turn, they could provide an excellent experience to their customers. At Narrative Science, one of our goals is to arm analysts with technology that extends how they think about business problems and communicates data-driven insights in a way that provides tangible benefits to the people who rely on them to make smarter decisions. No 20 slides required.

1. http://slate.me/1pBobmV
2. Does Automated Content Technology Hold Promise for Chatbots and Intelligent Assistants? http://bit.ly/1lsM2Dy