30th May 2023
In the pursuit of stellar customer experience (CX), accurate measurement is vital. But are you clear about what you’re measuring? There’s a diverse array of survey methodologies, each with its own channels and initiation methods. Grasping the context of your survey is essential to interpret the data accurately.
Companies that use changes in these metrics to steer their CX strategies need to peel back the layers of their data. It’s essential to understand who they’re measuring, the methodology of measurement, and the response rate. Oftentimes, a critical question is overlooked: does the survey response reflect the customer’s experience at the time, or is it a general opinion formed long after the interaction?
Non-statistician managers, who find themselves reacting to fluctuating survey scores, should approach the results with a healthy dose of caution. Emotional skewing is a frequent pitfall that can significantly distort data. Typically, it is only the customers with exceptionally good or bad experiences who take the time to respond. This leaves a substantial void in responses from those who had a merely ‘average’ experience.
The complexity of survey completion plays a vital role in determining the response rate. The simpler it is for customers, the more likely you’ll obtain a broad spectrum of responses. This inclusivity provides a truer reflection of the overall CX, including those ‘average’ experiences that often get lost in the noise.
If the response rate does not provide a statistically significant sample of responses, you can often augment the survey data with analytics: speech analytics is a good tool for analysing sentiment across all of your customers' voice interactions, and similar tools exist for analysing digital interactions.
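To make the idea concrete, here is a minimal sketch of the kind of classification a speech-analytics tool performs. This is purely illustrative: real products transcribe calls and apply far richer language models, and the word lists and transcripts below are hypothetical.

```python
# Toy keyword-based sentiment scorer, standing in for a commercial
# speech-analytics tool. Word lists are hypothetical examples.
POSITIVE = {"great", "helpful", "quick", "easy", "thanks", "resolved"}
NEGATIVE = {"slow", "rude", "confusing", "unresolved", "waiting", "frustrated"}

def sentiment(transcript: str) -> str:
    """Classify a call transcript as positive, negative, or neutral."""
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# Hypothetical call transcripts, tallied into a sentiment summary
calls = [
    "The agent was helpful and resolved my issue quickly.",
    "I was left waiting and the process was confusing.",
    "I called to update my address.",
]
summary = {}
for call in calls:
    label = sentiment(call)
    summary[label] = summary.get(label, 0) + 1
print(summary)  # one count per sentiment label
```

The value of this approach is coverage: unlike a survey, it analyses every interaction, including the 'average' ones that rarely generate survey responses.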
Customers who are loyal to your brand or organisation are more likely to respond to surveys, which can skew your sample towards your most engaged customers. Hence, it's crucial to understand who your survey responders are and their inclination to respond.
Interestingly, the survey channel you choose can also significantly impact the response rate. For instance, according to a paper by ServiceTick, response rates differ vastly depending on the medium:
– IVR (Voice) – 15%-45% response rate
– SMS – 5%-25% response rate
– Email – 5%-10% response rate
– Web – 2%-5% response rate
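These differences compound: a low-response channel may not yield enough completed surveys to draw reliable conclusions. The sketch below shows one common way to gauge this, the margin of error on a surveyed proportion. The send volume and per-channel rates are hypothetical figures chosen from within the ServiceTick ranges above.

```python
import math

def margin_of_error(sample_size: int, confidence_z: float = 1.96) -> float:
    """95% margin of error for a surveyed proportion, worst case p = 0.5."""
    return confidence_z * math.sqrt(0.25 / sample_size)

surveys_sent = 2000  # hypothetical number of surveys offered per channel
# Hypothetical mid-range response rates per channel
channels = {"IVR": 0.30, "SMS": 0.15, "Email": 0.075, "Web": 0.035}

for channel, rate in channels.items():
    responses = int(surveys_sent * rate)
    moe = margin_of_error(responses)
    print(f"{channel}: {responses} responses, ±{moe:.1%} margin of error")
```

With these assumed figures, 2,000 web surveys yield around 70 responses and a margin of error of roughly ±12 percentage points, whereas the same volume over IVR yields around 600 responses and roughly ±4 points. A score that 'drops' month on month by less than the margin of error may be noise, not a change in CX.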
As a customer experience professional, I always want to respond to a survey. I almost feel it is my duty to provide feedback. However, I do struggle with email surveys that direct you to a website. I get a feeling of dread wondering how long it will take, whether I have the time, and how many questions there will be.
As such, I believe the most effective surveys are those that reach the customer promptly after their interaction and are simple to complete. For example, I particularly like SMS text surveys due to their timing and simplicity.
These surveys are typically short and require little effort to complete, making them less intrusive and more appealing to customers. This also means they are more likely to draw on a wide range of experiences, not just those at the extreme ends of satisfaction.
I am sure customer experience professionals are doing a great job and think very deeply about the questions they want to ask and the information they want to gather, but there is an element of irony that CX professionals need to be aware of.
In their eagerness to gather insights, they often overlook the customer’s experience when completing the survey. This can lead to an unintended consequence: lengthy and complex surveys that may deter customers from providing feedback. Therefore, the survey experience should be as thoughtfully designed as any other customer interaction.
In my experience, one common design flaw that can discourage completion (particularly during web-based surveys) is the ‘endless’ survey. These are the surveys that present one question at a time, asking for a score and then an explanation. Then, another question pops up with the exact same format, and you have no indication when it will end.
Personally speaking, I like to know how long it will take, so I can assess if I have time. I need to know how many questions are being asked and the progress I am making through the survey. This repetitive format can feel daunting and never-ending to customers. With no visible end in sight, many customers give up, leaving the survey half-completed.
So, when constructing a customer survey to evaluate the CX, consider the following:
– Timeliness of the survey
– Propensity to respond (influenced by customer emotion and loyalty)
– Customer effort required for survey completion
– Survey design, including number and type of questions
– Survey channel or medium used
By understanding the customer experience when completing the survey, you can truly measure and improve your customer experience.
Blog author
Nigel Medforth
Senior Consultant
nigel.medforth@davies-group.com
linkedin.com/in/nigel-s-medforth