New Report: Contact Centers Fall Short on Surveys

Contact centers struggle to use customer service survey data.

That's the conclusion suggested by a new report from ICMI called Collapse of the Cost Center: Driving Contact Center Profitability. The report, sponsored by Zendesk, focuses on ways that contact centers can add value to their organizations. 

Collecting customer feedback is one way contact centers can add value. This feedback can be used to retain customers, improve customer satisfaction, identify product defects, and increase sales.

So, what's the struggle? Here's a statistic that immediately caught my attention:

63% of contact centers do not have a formal voice of the customer program.

Yikes! It's hard to use your contact center as a strategic listening post if you aren't listening.

Let's take a look at some of the report's findings along with some solutions.

Key Survey Stats

Here are some selected statistics from the report.

First, let's look at the types of surveys used by contact centers that do have a formal voice of the customer (VOC) program:

[Chart: survey types used by contact centers with a formal VOC program. Source: ICMI]

Customer Effort Score (CES) presents an untapped opportunity. 

CES measures customers' perceived effort (see this overview). A good CES program will help companies identify things that annoy customers and create waste. This makes it a great metric for improving efficiency.

Why is efficiency so important in a customer-focused world? Here's another statistic from the ICMI report that explains it:

62% of organizations view their contact center as a cost center.

That means efficiency is one of the most important success indicators for those companies' executives. CES marries cost control and service quality by measuring efficiency from the customer's point of view.

Another revealing statistic shows what's not measured:

44% of contact centers don't measure customer retention.

Keeping customers should be the name of the game for contact centers. If you don't measure this statistic, then customer retention can't be a priority.

 

Challenges With Surveys

The report highlighted challenges contact centers face with survey data. Here are the top five:

Challenge #1: Using survey data to improve service. Survey data is more than just a score. The key is analyzing the data to get actionable insight. That's a skill that many customer service leaders don't have. One resource is this step-by-step guide to analyzing survey data.

Challenge #2: Getting a decent response rate. Response rate is a misleading statistic. There are two things that are far more important. First, does your survey fairly represent your customer base? Second, is your survey yielding actionable data? Your response rate is irrelevant if you can confidently say "Yes" to these questions.

Challenge #3: Analyzing data. See challenge #1. You can't improve service if you don't analyze your data to determine what needs to be improved.

Challenge #4: Designing effective surveys. Survey design is another skill that many customer service leaders don't have. Here's a training video on lynda.com that provides everything you need to get started. You'll need a lynda.com account to take the full course, but you can get a 10-day trial here.

Challenge #5: Taking action to help dissatisfied customers. You'll need a closed loop survey to tackle this challenge. A closed loop survey allows customers to opt in for a follow-up contact. Once you add this, it becomes very easy to initiate a program to follow up with upset customers.

 

Additional Resources

The full report provides a lot more data and advice on leveraging contact centers to improve customer service and profits. It's available for purchase on the ICMI website.

Here are some additional blog posts that can also help:

 

Anatomy of a Lousy Survey

This blog has spent a lot of time on surveys lately. 

There's a post on how to write a great survey with just three questions. There’s another post on five ways to capture VOC data without a survey. You can even read about five signs your survey may be missing the point.

This post focuses on that last topic by giving you a detailed breakdown of a lousy customer service survey from Buffalo Wild Wings.


Tip: You’ll get more survey responses if you make it easy for people to respond.

Here’s the survey invitation. There’s no QR code and the survey site itself isn’t optimized for mobile, so guests are discouraged from completing the survey on their smartphones.


Tip: Don’t bother your customer with questions you should already know the answer to.

The survey asks a lot of questions that could easily be tied to the survey code or a customized survey link. Examples include the store, date visited, and the time of day. Each of these questions is on a separate screen, which makes the survey even more tedious.

So far, we're at 8 screens:


Tip: An annoyingly long survey will remind customers how annoyed they were already.

This survey is tedious! Customers don’t have the opportunity to share any feedback until they reach the 15th screen.


Tip: Cut out extra questions and give customers a comment box instead. 

The survey also assumes it knows what’s driving customer dissatisfaction. The question in screen 15 (above) asks customers to provide an overall rating. The question in the screen below (screen #19 in the survey!) presumes to know what might drive customer satisfaction.

The danger is these questions might be irrelevant to the customer, but they're required to complete the survey.



Tip: Keep questions to a minimum by avoiding repetition.

By now, the questions are starting to get repetitive. Didn’t someone in marketing check this survey before giving it the green light?

Here's the question on Screen 15:

Overall, compared to your expectations of what a restaurant can and should be, how would you rate your experience at Buffalo Wild Wings?

Here's the question on Screen 26:

Compared to your expectations of what a restaurant can and should be, how would you rate the Buffalo Wild Wings you visited on providing you with the "ultimate social experience for sports fans in your community?"

Screen 26


Tip: Surveys should have a single purpose to give them razor-sharp focus.

The questions just keep coming! Buffalo Wild Wings really makes you work for that $5 coupon. Now, they want to gather some demographic data.

It looks like someone in another department said, “Hey! You’re doing a survey? Can I add a few questions?” This is the 35th screen.


The final tally on this survey was a whopping 39 screens!

The effort required to complete it is a big turnoff. Here’s an example of a survey that’s far easier to complete. It’s limited to just five questions and customers are given two options to access the survey.

(Please excuse the blurry picture.)

Note: My personal policy is to share negative feedback privately before naming a company in a blog post. Members of my party (including me) attempted to share our feedback with the store manager but he refused to come to our table.

In this case, I'm grateful for the poor service we received since their survey provides an excellent illustrative example.

5 Signs Your Customer Service Survey is Missing the Point

Note: This post originally appeared on the Salesforce blog. Check out my latest post on the Salesforce blog, "Why Role Playing Doesn't Work for Customer Service Training."

Customers are getting tired of surveys. A 2010 study by Vovici revealed that Americans are inundated with over 7 billion survey requests per year. That’s nearly 23 survey requests for every American. (Ironically, I encountered a pop-up survey request when I went to the US Census Bureau website to track down that statistic.)

Many companies survey their customers, but that doesn’t mean they are doing it right. Here are five signs that a customer service survey program is missing the point.

 

1) Your survey has no purpose

Perhaps someone in customer service decided a survey was a good idea so they wrote a few questions. Then marketing added a few more questions. Sales chimed in with a few questions of their own. Operations got in the act too. The end result is a 100-question survey with no clear purpose.

Thinking of questions to ask your customers is the wrong place to start. Instead, think about what you specifically want to know and then design your survey to achieve that clear purpose. 

Keep in mind that you may have multiple audiences. For instance, a business-to-business software company might have a transaction survey for users contacting technical support and a relationship survey for executives who make buying decisions.

 

2) Your survey is tiresome

The second sign of a pointless survey is it needlessly annoys customers.

For example, the dealership where I get my car serviced routinely sends me a 36 question survey after I get an oil change. That’s obnoxious.

Focus on what you really want to know and limit your questions to as few as possible. You can always use text boxes to capture additional information.

My car dealership could cut their survey from 36 questions down to 3 and still get an amazing amount of useful data:

  1. How satisfied were you with your recent service?
  2. (Comment box): Is there anything we can do better?
  3. Would you like one of our service advisors to follow up with you?

Correlating satisfaction levels with individual comments could tell the dealership what they’re doing well and what can be improved. And, asking customers if they’d like to be contacted allows service advisors to try to fix any problems that customers are venting about.
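That correlation step can be sketched in a few lines of code. This is a minimal illustration using entirely hypothetical survey responses (the scores, comments, and opt-in flags below are invented for the example), showing how comments can be grouped by satisfaction level and a follow-up queue built from the opt-in question:

```python
from collections import defaultdict

# Hypothetical responses to the three-question survey:
# a 1-5 satisfaction score, a free-text comment, and a follow-up opt-in.
responses = [
    {"score": 5, "comment": "Fast oil change, friendly staff", "follow_up": False},
    {"score": 2, "comment": "Waited two hours for an oil change", "follow_up": True},
    {"score": 4, "comment": "", "follow_up": False},
    {"score": 1, "comment": "Nobody explained the extra charges", "follow_up": True},
]

# Group comments by satisfaction level so low-score comments can be read together.
comments_by_score = defaultdict(list)
for r in responses:
    if r["comment"]:
        comments_by_score[r["score"]].append(r["comment"])

# Build the callback queue for service advisors from the opt-in question.
follow_up_queue = [r for r in responses if r["follow_up"]]

print("Low-score comments:", comments_by_score[1] + comments_by_score[2])
print("Customers to call back:", len(follow_up_queue))
```

Even this simple pairing shows what a dashboard of averages can't: the specific complaints behind the low scores, plus a ready-made list of customers to call back.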

 

3) You’re focused on the score, not the feedback

The third sign a survey is missing the point is focusing on the score and not the feedback. 

The service department at my local dealership provides a great example. All of their post-transaction follow-up focuses on cajoling me into giving them a good score on their survey. My actual feedback is irrelevant.

  1. A sign by the cash register reminds customers they’ll be getting a survey.
  2. Someone from the dealership calls the next day with a reminder about the survey.
  3. The service advisor sends an email reminder a day later.

 Each point of contact encourages customers to provide a top box score on the survey. At no time are customers asked about the quality of service they’ve received.

Surveys should be designed to give you feedback that you can use to improve service. Focusing on a score versus the feedback itself defeats that purpose.

 

4) You only look at aggregate data

The fourth sign of a pointless survey is that the data isn’t analyzed; only total scores are viewed.

Knowing what percentage of your customers are satisfied is a relatively useless statistic. There’s not much you can do with that.

It’s necessary to dig a little to make customer service survey data truly useful. For example, let’s say you have a 75 percent customer satisfaction rating. It takes a little bit of analysis to reveal actionable information:

  • Is service quality consistent amongst all employees?
  • What factors make it more likely for a customer to be satisfied?
  • What factors make it more likely for a customer to be dissatisfied?

 

5) You Don’t Take Any Action

The last sign of a pointless survey is that the company doesn’t do anything with the data it collects.

Chip Bell, author of 9 1/2 Principles of Innovative Service, shared this startling statistic with me:

 95 percent of companies survey their customers but only 10 percent actually use the feedback to take action. 

It’s a waste of your customers’ time to ask them for feedback and then do nothing with it. It’s a waste of your own time, too.

Smart companies use surveys as part of their continuous improvement cycle. They analyze their survey data to look for trends and pinpoint problems. This analysis leads to solutions that are implemented to improve service. Creating surveys that generate actionable results is the key to creating a company that is constantly evolving and improving.

 

Want to know more?

Here are links to recordings of two of my recent webinars on making the most of customer service surveys:

How to Analyze and Act on CSAT Data


You probably survey your customers.

But, do you learn anything from those surveys? More important, do you use that insight to improve service?

Chip Bell, author of 9 1/2 Principles of Innovative Service, recently shared this startling stat from a Gartner Group study:

95 percent of companies survey their customers, but only 10 percent actually use the feedback to take action.

Yikes!

Today, I hosted a webinar on analyzing and acting on customer satisfaction survey data. This post is a re-cap of the key lessons from the webinar along with some bonus information. 

You can watch the webinar here.

 

How to Quickly Analyze Your Data

Let’s say your overall customer satisfaction is 85 percent. 

By itself, the number doesn’t tell you much. The key to analyzing this data is to dig one level deeper. 

For example, you could look at the distribution of survey scores by employee:

[Chart: distribution of survey scores by employee]

Suddenly, you see that Leo may need some extra help.

You could also look at the distribution of survey scores by the type of service request:

[Chart: distribution of survey scores by type of service request]

This reveals that product inquiries are a strength, but technical support leaves something to be desired. 

If Leo gets a technical support call, you’re doomed.
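The two breakdowns above are the same operation applied to different columns: segment the responses, then compute percent satisfied per segment. Here's a minimal sketch with invented data (the employees, request types, and scores are hypothetical, chosen to mirror the charts described above):

```python
from collections import Counter

# Hypothetical survey records: (employee, request_type, satisfied?)
surveys = [
    ("Leo", "technical support", False),
    ("Leo", "product inquiry", True),
    ("Mia", "technical support", True),
    ("Mia", "product inquiry", True),
    ("Leo", "technical support", False),
    ("Mia", "technical support", False),
]

def satisfaction_by(column):
    """Return {segment: percent satisfied} for the chosen column (0=employee, 1=request type)."""
    totals, satisfied = Counter(), Counter()
    for row in surveys:
        key = row[column]
        totals[key] += 1
        if row[2]:
            satisfied[key] += 1
    return {k: round(100 * satisfied[k] / totals[k]) for k in totals}

print(satisfaction_by(0))  # distribution by employee
print(satisfaction_by(1))  # distribution by request type
```

Running the same function over each column is what "digging one level deeper" looks like in practice: the overall number stays the same, but the segments reveal where the problem lives.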

 

Identifying Pain Points

Before you fire Leo or stop offering technical support, you may want to dig deeper still. The goal should be finding the true root cause of the problem.

One way to do this is to home in on surveys where customers gave technical support an unsatisfactory survey score. Can you spot some themes among their comments?

Here’s an example:

  • It took way too long to get a simple issue resolved!
  • The guy didn’t seem to know what he was doing.
  • I got transferred twice before someone could help me.
  • Problem not resolved! I’ve had to call back three times.
  • The lady seemed confused and overwhelmed. 

A theme or two emerges. The comments suggest that customers give low scores for technical support when they get the runaround or the support rep doesn’t appear to be highly competent.
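With a larger pile of comments, this theme-spotting can be roughed out in code. The sketch below uses the five comments above; the theme keywords are a hypothetical starting point you'd refine by reading a sample of real comments, not a definitive taxonomy:

```python
from collections import Counter

comments = [
    "It took way too long to get a simple issue resolved!",
    "The guy didn't seem to know what he was doing.",
    "I got transferred twice before someone could help me.",
    "Problem not resolved! I've had to call back three times.",
    "The lady seemed confused and overwhelmed.",
]

# Hypothetical keyword lists for each theme; build these from reading comments.
themes = {
    "runaround": ["transferred", "call back", "too long"],
    "competence": ["know what", "confused", "overwhelmed"],
}

# Count how many comments touch each theme.
counts = Counter()
for comment in comments:
    text = comment.lower()
    for theme, keywords in themes.items():
        if any(k in text for k in keywords):
            counts[theme] += 1

print(counts)
```

Keyword matching is crude, but for a few hundred comments it's often enough to confirm whether a suspected theme is real before you invest in deeper root-cause work.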

 

Turning Insight into Action

Knowing the specific issues that annoy your customers is a good start. Now, you need to investigate to find the root cause of the problem.

In the technical support example, the best way to do this would be to spend some time with those employees. Share with them the problem you’re trying to solve. Ask a few questions. Watch them do their work.

The root cause of the problems often becomes obvious after just a few minutes of observation.

Four things might jump out if you spent some time with this technical support team:

  1. They feel pressured to solve problems quickly to meet the department standard for Average Handle Time (AHT).
  2. This pressure causes them to take shortcuts to maintain the AHT standard. 
  3. These shortcuts frequently lead them to transfer a call too quickly or misunderstand a customer's needs because they aren’t listening carefully.
  4. The most inexperienced rep, Leo, has a particularly hard time with this.

Now that you know the root cause, you can take action.

Perhaps you could emphasize first contact resolution over AHT with your team. Focusing on resolving the problem makes customers happier. It also reduces callbacks, which in turn reduces overall call volume. 

And, focusing on first contact resolution doesn't necessarily cause a spike in AHT. 

You may also want to check in with Leo to make sure he doesn’t get lost on the learning curve.

 

Revisit Your Surveys

The final step in the process is to revisit your survey. You want to see if the actions you’ve taken have actually increased customer satisfaction.

You’ll realize the true value of a customer service survey when you follow these steps on a regular basis. Service will improve and your customers will be happier.

Why You Should Stop Trying to Improve Your Survey Scores

The score shouldn't be the goal.


There’s a lot of pressure these days to improve customer service survey scores. 

Executives review the scores on a regular basis. Business units are compared to one another. Employees are held accountable for their scores. Rewards are given for outstanding results. Low scores bring about consequences.

If this is your company’s focus, you’re wasting your time.

Getting fixated on a number can bring about all sorts of unwelcome behaviors. I recently overheard a coffee shop barista pleading with a customer to give them a great score. “We are sooooooooo graded on this!”

Never mind that the barista’s groveling turned an otherwise pleasant interaction into an awkward moment for the customer and everyone else in the store.

Sometimes, enterprising store managers take it upon themselves to have their employees nudge customers in the right direction. I recently received a survey invitation while shopping at Sports Authority. The cashier stamped the expected answer on the invitation.

[Photo: survey invitation stamped "Highly Satisfied"]

Gee, thanks.

Sadly, this isn’t an isolated incident. Auto dealerships are notorious for this type of obnoxious behavior where they relentlessly pester customers to give them a great survey score. All the while, any actual feedback is ignored.

What if you don’t engage in manipulative tactics? 

Focusing on the score is still missing the point. The score, by itself, doesn’t tell you anything. It certainly doesn’t help you do anything differently.

A score is nothing but an average. It’s an aggregate representation of many individual experiences. Focusing on the score might even hide service failures so long as the average looks good.

Think of it this way: your 85 percent satisfaction score won’t help the individual customer who is currently receiving poor service. 

 

What you should focus on instead

Companies that truly care about customer service focus on continuous improvement.

The customer service survey can be a valuable tool in this quest. Used correctly, it provides valuable insight into what your customers want you to do better. This insight should lead to action that will improve service.

Better service, not a better score, is the obsession.

Making sense of survey data can be tough. Companies can have a hard time culling insights from reams of data and turning those insights into action. That’s why I’m hosting a webinar on Wednesday, March 5 from 10-11am (Pacific) called How to Analyze and Act on Customer Service Survey Data.

The goal is to show you how to quickly use this data to continuously improve service. I hope you can attend.

Developing Effective Customer Satisfaction Surveys

Is your survey effective?


This post is a short re-cap of the Designing Effective Customer Satisfaction Surveys webinar.

It was designed to be an overview of the basics. There are also links to additional resources at the bottom of this post so you can take a deeper dive into the subject.

You can view the webinar here.

The webinar focused on three key points:

  • Impactful survey design
  • Fast survey creation
  • Response rate strategies

This was a very interactive webinar and a lot of participants contributed some terrific ideas. One of my favorites was our brief discussion on why we don’t complete surveys ourselves. You can learn a lot about why your customers won’t do something if you examine why you yourself won’t do it.

The webinar runtime is approximately 48 minutes and is definitely worth viewing. 

 

Don’t Miss Part 2!

Surveying your customers is just the first step. Next, you must use that data to take action! I’m hosting a follow-up webinar on Wednesday, March 5 at 10am PST. It's called How to Analyze and Act on Customer Satisfaction Data.

You can use the form below to register.

Three Roadblocks to Effective Customer Satisfaction Surveys

How would you rate your customer service survey?


Customer service surveys are everywhere. 

A 2010 study by Vovici revealed that Americans are inundated with over 7 billion survey requests per year. That’s nearly 23 survey requests for every American. 

The intent of these surveys is to capture Voice of the Customer (VOC) information that can be used to improve service. Unfortunately, that intent is rarely realized. Some surveys are poorly designed. Others have low response rates. The worst problem is not doing anything with the data collected.

This is a common theme among the small and medium-sized businesses I work with. Most have some sort of survey program in place but there’s a nagging feeling that it's not very useful.

Perhaps your company is considering a customer service survey. Or, you have one already but now you’d like to make the most of it. Here are three common roadblocks you’ll need to avoid:

 

#1 Inertia

Many customer service professionals believe that a robust VOC program is important but they just don’t know how to get started. Inertia sets in. Do any of these excuses sound familiar? 

  • I don’t have time right now.
  • It’s too expensive.
  • Our customers are tired of surveys.
  • It’s just a number that senior management wants to see.
  • Surveys don’t really apply to us.

Nothing changes without action. Companies that delay implementing a VOC program could be missing out on a gold mine of information. Even worse, they might continue to run an ineffective program that wastes everyone’s time, including their customers'.

 

#2 Poor design

Customer service surveys are often little more than a pile of questions that reveal little or no insight. The surveys become longer and longer as each stakeholder thinks of things they’d like to ask. A simple transaction survey soon becomes 100 questions long. 

The net result is the survey annoys the customer while the company is left with piles of data they don’t know how to use.

 

#3 Low response rates

Poorly designed and executed survey programs often yield low response rates. It can be disheartening to go through the trouble of creating a survey and then have hardly anyone respond. Continuing the program can be difficult to justify if customers aren’t responding.

 

Solutions

I’m offering a complimentary webinar to help you avoid these roadblocks.

Designing Effective Customer Satisfaction Surveys

  • Date: Thursday, February 13
  • Time: 10 am - 11 am (PST)

You’ll learn: 

  • Simple ways to quickly create surveys on a tight budget
  • Proven techniques for writing impactful survey questions
  • Three secrets to improving your response rates

This is the first of a two-part webinar series that covers the basics of customer service surveys. The second webinar, called How to Analyze and Act on Customer Satisfaction Data, will be held on March 5.