Built and tested 124 speech categories to support automated QA
Fast, four-month project delivery
We worked with a client in the US whose existing analytics team required urgent support to build out their speech analytics programme. Specifically, the client needed to design and specify the expected quality assurance (QA) behaviours to be tracked in speech analytics, and to form a blended team of our analysts and theirs to build and test the content.
We also supported the upskilling of the client's analytics and reporting teams, showing them how to use the speech analytics platform to maximum benefit.
We asked the client to provide their QA scorecards and, once we had reviewed them, established the optimum route to covering that content in the speech analytics platform.
Our expert analysts worked alongside the client's analyst team to build and test the content, ensuring that timelines were met.
We worked with the client to specify, build and test 124 speech categories to support the automated QA requirements. Each category was supplied with full documentation detailing the use case, key criteria, accuracy assessments and a detailed score flow, allowing the client's reporting team to adapt the content easily into their reporting suite.
Our team of experts delivered this complex task in just four months.
Lee Mostari
Director of Insight & Analytics
Davies Consulting
T. +44 (0)7985 555 125
E. Lee.Mostari@davies-group.com
Michael Anderson
Vice President (US/Canada)
Davies Consulting
T. +1 647 929 9002
E. michael.anderson@davies-group.com