C-Sat: So what?!

Question: Let's say your business engaged in a process that was mildly annoying to your customers and provided absolutely no value. You aren't quite sure how it got started or who in your company owns it, but the process continues simply out of habit. What would you do if you found out about this process?

For many companies that gather customer satisfaction (C-Sat) data, the answer is, "We'd keep doing things the same way." 

Sports talk radio personality Jim Rome has a favorite saying for his callers, "Give me an A or give me an F." If your company gathers C-Sat data, I hereby challenge you to do the same. Either make sure your C-Sat process earns an A, or stop wasting time and annoying customers with a process that yields no value.

How can you tell if your C-Sat process gets an A?

I propose three simple tests to get you started. This is by no means an exhaustive list, but if you answer "Yes" to these three questions, you are probably doing OK.

#1 Do you know why you're asking what you're asking?

Forget your survey questions for a moment and ask, "What do I want to know?" Now, ask, "Why do I need to know that?" If you can't think of a really good reason to ask the question, don't.

I recently received a 36-question survey after getting my car's oil changed (see "Customer Service Survey Mistakes to Avoid" below). Does it really take 36 questions to find out if I was happy with an oil change? Of course not!

Surveys of any sort consume your customers' time. The shorter you make them, the better. As a rule of thumb, if you can't get the information you need in five questions or fewer, you are probably asking the wrong questions.

#2 Do you do anything with the data?

The whole point of capturing C-Sat data shouldn't be getting a good score. It should be using the data to improve actual customer satisfaction. If you aren't acting on the data you receive, you aren't extracting any value from the process.

In most cases, you don't have to be an expert in statistics to find value in your C-Sat data, provided you are asking good questions (see #1, above). For example, a client of mine recently grouped the comments attached to their C-Sat survey and discovered that the majority of negative feedback concerned one particular process. My client used this insight to fix the process and make it more customer-friendly. Customer satisfaction immediately jumped, and many people commented on how pleased they were with the improved process.
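To make that grouping concrete, here is a minimal sketch in Python of the kind of tally my client performed. The comments, sentiment labels, and process names below are all hypothetical; in practice you would tag the comments from your own survey export, by hand or with simple keyword matching.

```python
from collections import Counter

# Hypothetical survey export: (comment text, sentiment, process it mentions).
comments = [
    ("Took three calls to get a return label", "negative", "returns"),
    ("Checkout was quick and easy",            "positive", "checkout"),
    ("Return took two weeks to process",       "negative", "returns"),
    ("Friendly support rep",                   "positive", "support"),
    ("Still waiting on my refund",             "negative", "returns"),
]

# Count negative comments per process to find the biggest pain point.
negatives = Counter(
    process for _, sentiment, process in comments if sentiment == "negative"
)

for process, count in negatives.most_common():
    print(f"{process}: {count} negative comments")
# Here "returns" dominates the negative feedback -- that's the process to fix first.
```

No fancy statistics required: a simple count of complaints by category is often enough to show you where to act.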

#3 Do you close the loop?

C-Sat instruments usually collect individual data points and combine them into an average. That's helpful for an overall score, but what do you know about each individual customer? A good system allows you to follow up with people to either thank them for their business or fix a problem. It can be as easy as asking for their email address or phone number at the end of the survey, but it's essential that you follow up if you request this information.

Here are two examples that highlight the value of closing the loop:

A shipping company recently left a case of wine on my front doorstep. Not only did they fail to get an adult signature, they left the case outside, where the wine might have been ruined if it had been a hot day. They never asked for my opinion, so they never got my feedback. However, the next time I ordered from a winery that used them, I shared the story and asked the winery to use UPS instead.

One of my favorite hotels, the Napa River Inn, sent me a survey after my wife and I stayed there last year. The visit was terrific overall, but there were a few things that weren't up to their usual standards. The General Manager emailed me in response to my survey, thanked me for my feedback, and assured me she would correct the problems I had noted. She also invited me to let her know the next time I visited so she could personally ensure I had a wonderful stay. I took her up on her offer, and on my next visit my wife and I had an absolutely amazing time. And the problems we had noticed on our previous stay had clearly been corrected.

Customer service survey mistakes to avoid

It seems like such a good idea.

Send out a survey to your customers to get some feedback. Your company looks like it cares and you might actually gain some ideas for improving service.

Just beware that any good idea poorly executed can quickly become a bad idea.

Here is a case in point.

I recently took my car to the dealership to get an oil change and complimentary inspection. A day or so later, I received an email from my service advisor giving me a heads-up that I would soon receive a survey asking about my experience. The service advisor referred to the survey as her “personal report card” and urged me to contact her immediately if I was unable to rate my experience as truly exceptional.

The survey arrived via email the next day. I clicked on the link to open it and was astonished to find 36 questions crammed onto one long, rambling page. That’s right – 36 questions about my oil change!

The survey seemed like a hassle. I was also concerned that my answers would reflect poorly on my service advisor if I said I was satisfied with my oil change but didn’t view it as truly exceptional. I decided to email her instead to provide my feedback and share my concerns about the survey process. She didn’t respond.

The intent may have been good, but there are at least three big problems with this survey. Make sure you don’t make these mistakes if you plan to survey your customers.

Wrong Goal
The point of doing a survey should be to find out how satisfied your customers are and learn ways to further improve. The goal of my auto dealer’s survey seemed to be getting a good score. The heads-up email, the framing of the survey as my advisor’s “report card”, and the urging to contact her immediately about any issues all tell me her primary goal was earning a high score. If this wasn’t the case, why not just call or email me to ask about my service without mentioning the survey at all? Why not respond to the feedback that I did email?

If you are going to survey your customers, make sure you are doing it for a good reason.

Too Long
Does it really take 36 questions to accurately assess my satisfaction with an oil change? Really?! My level of satisfaction declined significantly with each survey question after the first five.

Keep survey questions to an absolute minimum and never ask for any information that you don’t specifically plan to use. Be respectful of your customers’ time when asking them to help you improve your business.

Dumb Scale
Most of the survey questions contained a response scale from one to ten with the following points labeled:

1 = Unacceptable
4 = Average
7 = Outstanding
10 = Truly Exceptional

This scale invites problems. If I answer truthfully, I’d give my oil change a 5. Is that bad? Not at all. It’s actually slightly above average. The problem is that the average experience with my dealership’s service department is already pretty good. A few minor points on my recent visit were slightly better than usual, so I’d rate the experience a little better than average.

I could also answer untruthfully if I felt compelled to rate the service as “truly exceptional” so my service advisor would get a good grade on her report card. The problem is that I find it hard to imagine an oil change being truly exceptional. Maybe if they waived the charge, gave me a $100 gift card to my favorite steakhouse, and filled my car up with gas, I’d rate it as truly exceptional. But they didn’t, and I was just fine with their service anyway.

If you are going to ask for feedback, make sure you design a response scale that doesn’t lead to inaccurate or inconsistent responses.