How to Find Trends in Your Survey Comments

Updated: June 12, 2023

The Director of Customer Experience was proud of her company’s new customer service survey. She had been a strong advocate for collecting voice of customer feedback and now they were finally getting it.

"That's great!" I said. "What are you doing with the data?"

There was an awkward silence. Finally, she replied, "Uh, we report the numbers in our regular executive meeting."

That was it. The survey was doing nothing to generate insights that could help improve customer experience, increase customer loyalty, or prevent customer churn.

One problem was the survey had no comment section. Customers could rate their experience, but there was no place for them to explain why they gave that rating.

Comments are a critical element that tell you what your customers are thinking and what you need to do to improve. But having a comment section isn't enough.

You need to know how to analyze those comments. This guide can help you.

Why Survey Comments Matter

Comments provide context behind numerical ratings. They help explain what makes customers happy, what frustrates them, and what you can do better.

Let’s look at the Google reviews for a Discount Tire Store in San Diego. As of June 12, 2023, the store has a 4.5 star rating on Google from 482 reviews.

[Image: Google rating summary for the Discount Tire store]

That's great news, but two big questions remain if you’re the store manager:

  • How did your store earn that rating? (You want to sustain it!)

  • What's preventing even better results? (You want to improve.)

The rating alone doesn't tell you very much. You need to look at the comments people write when they give those ratings to learn more.

The challenge is that the comments are freeform. You'll need a way to quickly spot trends.

 

Analyze Survey Comments for Trends

The good news is you can do this by hand. It took me less than 30 minutes to do the analysis I'm going to show you.

Start with a check sheet. You can do this on a piece of paper or open a new document and create a table like the one below. Create a separate column for each possible rating on the survey.

[Image: blank check sheet with a column for each star rating]

Next, read each survey comment and try to spot any themes that stand out as the reason the customer gave that rating. Record those themes on your check sheet in the column that matches the star rating for that review.

For example, what themes do you see in this five star review?

[Image: a five star review of the tire store]

I recorded the following themes on my check sheet:

[Image: check sheet with themes recorded in the five star column]

Now repeat this for all of the reviews. (If you have a lot of reviews, you can stick to a specific time frame, such as the past three months.) Look for similar words or phrases that mean the same thing and put a check or star next to each theme that's repeated.

Once you've completed all of the reviews, tally up the themes that received the most mentions. Here are the top reasons people give a 5 star rating for this Discount Tire store:

  • Fast service: 72%

  • Good prices: 35%

  • Friendly employees: 23%
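If your reviews live in a spreadsheet or export file rather than on paper, the same check-sheet tally can be scripted. Here's a minimal Python sketch; the review data below is made up for illustration, not the store's actual reviews:

```python
from collections import Counter

# Each review is a (star_rating, themes) pair, where the themes were
# identified by hand while reading the comment.
reviews = [
    (5, ["fast service", "good prices"]),
    (5, ["fast service", "friendly employees"]),
    (5, ["fast service"]),
    (1, ["long wait"]),
]

# Build the check sheet: one Counter (column) per star rating.
check_sheet = {}
for rating, themes in reviews:
    column = check_sheet.setdefault(rating, Counter())
    column.update(themes)

# Report each 5-star theme as a percentage of 5-star reviews.
five_star_total = sum(1 for rating, _ in reviews if rating == 5)
for theme, count in check_sheet[5].most_common():
    print(f"{theme}: {count / five_star_total:.0%}")
```

Because a theme can appear in more than one review, the percentages can sum to more than 100 percent, just as they do in the Discount Tire tally above.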

There weren't many bad reviews. The few that had comments mentioned a long wait time, a lack of trustworthiness, or some damage done to the customer's vehicle.

You'll see a larger theme emerge if you look across all the reviews.

Some aggravation usually accompanies a trip to the tire store. Maybe you got a flat tire. Perhaps you're trying to squeeze in car service before work. There's a good chance you're dreading the cost.

When Discount Tire customers are happy, their comments tend to describe some sort of relief. For instance, more than one customer mentioned arriving just before closing and feeling relieved to get great service from helpful and friendly employees.

 

Take Action!

The purpose of this exercise is to take action!

If I managed that Discount Tire store, I'd make sure employees understood they are in the relief business. (Perhaps they do, since their rating is so high!) Relief is one of the top emotions in customer support.

I'd also respond to negative reviews, like this one:

[Image: a negative review of the tire store]

For private surveys, you'll need a non-anonymous survey or a contact opt-in feature to do this.

Many public rating platforms like Google My Business, Yelp, and TripAdvisor allow you to respond publicly to customer reviews. A polite and helpful response can signal other customers that you care about service quality.

And you might save that customer, too. One Discount Tire customer changed his 1 star review to a 5 star review after speaking with the manager who apologized and fixed the issue!

You can watch me do another check sheet in this short video on LinkedIn Learning.

What is a Good Survey Response Rate?

It's the most common question I get about surveys.

Customer service leaders are understandably concerned about getting a lot of voice of customer feedback. So my clients want to know, "What is a good response rate for our customer service survey?" 

The answer may surprise you—there's no standard number. 

In some situations, an 80 percent response rate might be bad; in others, a 5 percent response rate might be phenomenal.

In fact, I'm not overly concerned with the percentage of people who respond. My advice to clients is to use a different set of criteria for judging their survey responses.

Here's how to evaluate your own survey response rate the same way I do.

Three Response Rate Criteria

There are three criteria that you can use to determine if you're getting a good response to a customer service survey:

  • Usefulness

  • Representation

  • Reliability

Usefulness is the most important consideration.

Any response rate that provides useful customer feedback is good. That's not to say you can't do even better than your current rate, but the whole purpose of a customer service survey should be to yield useful data.

For example, let's say you implement a contact opt-in feature that allows you to follow up with customers who leave negative feedback. That survey could become tremendously useful if it allows you to contact angry customers, fix problems, and reduce churn.

Representation is another important way to gauge your response rate.

You want your survey to represent all of the customers you are trying to get feedback from. Imagine you implement a new self-help feature on your website. A representative survey in this case would ask for feedback from customers who successfully used self-help as well as customers who weren't successful and had to try another channel.

Sometimes you need to augment your survey with other data sources to make it more representative. The authors of The Effortless Experience discuss the self-help scenario in their book and suggest having live agents ask customers if they first tried using self-help.

This question can help identify people who didn't realize self-help was available and therefore wouldn't complete a survey on its effectiveness. It could also capture feedback from people who tried self-help, were unsuccessful, and didn't notice a survey invitation because their priority was contacting a live agent to solve the problem.

My final criterion is reliability.

This means the survey can be relied upon to provide consistently accurate results. Here's a summary of considerations from a recent post on five characteristics of a powerful survey.

  1. Purpose. Have a clear reason for offering your survey.

  2. Format. Choose a format (CSAT, NPS, etc.) that matches your purpose.

  3. Questions. Avoid misleading questions.

Many surveys have problems in one or more of these areas. For instance, a 2016 study by Interaction Metrics discovered that 92 percent of surveys offered by the largest U.S. retailers asked leading questions that nudged customers to give a more positive answer.

For example, Ace Hardware had this question on its survey:

How satisfied were you with the speed of your checkout?

The problem with a question like this is it assumes the customer was satisfied. This assumptive wording makes a positive answer more likely.

A more neutral question might ask, "How would you rate the speed of your checkout?"

 

Resources

A survey response rate is good if it generates useful data, is representative of the customer base you want feedback from, and is reliable.

That doesn't mean you shouldn't strive to continuously improve your survey. Here are some resources to help you:

A Simple Way to Double Your B2C Survey Responses

Everyone wants a better survey response rate. The Center For Client Retention (TCFCR) recently shared some data about business-to-consumer (B2C) surveys that revealed an easy way to improve results.

TCFCR helps businesses conduct customer satisfaction research. The company's client focus is primarily Fortune 500 companies in business-to-business (B2B) and B2C segments.

There's a big need for this type of service given that a recent study from Interaction Metrics found 68 percent of surveys offered by America's largest retailers were "total garbage."

I provide similar services to small and mid-sized businesses, so I was curious to see what TCFCR's survey data might reveal.

One quick look and I immediately saw a way for businesses to double the response rate on their B2C surveys.

The Response Rate Secret

TCFCR pulled aggregate data from thousands of surveys across all of their clients for a 12-month period. The company compared response rates for "in the moment" surveys versus follow-up surveys sent via email. 

Here are the results:

Follow-up surveys had more than twice the average response rate!

An in the moment survey is offered at the time of service. It could be a link in an email response from a customer service rep, an after-call transfer to an automated survey, or a link in a chat dialog box.

A follow-up email survey is sent after the customer service interaction is complete.

TCFCR also found that sending a reminder email after the initial survey invitation typically generated an additional 5-point increase in response rates!

Some companies do follow-up surveys via telephone instead of email. TCFCR's data shows that those surveys get an average response rate of 12-15 percent, which is on par with in the moment surveys.

One thing to keep in mind is that this data is for B2C surveys only. TCFCR found that B2B surveys typically get a response rate that's half of what you'd expect from a B2C survey.

 

Increase Response Rates Even More

There are a few more things you can do to stack the deck in your favor.

One is to keep your surveys short. A 2011 study from SurveyMonkey found that survey completion rates drop 5-20 percent once a survey takes 7+ minutes to complete. The same study discovered that's usually around 10 questions.

Most surveys will gather adequate data with just three short questions.

Another way to improve response rates is through rules-based offering. A lot of customer service software platforms, such as Zendesk, have a built-in survey feature that allows you to adjust which customers receive a survey and when.

For instance, you might only send a follow-up survey once a support ticket is closed rather than after every single interaction. Or if you offer a subscription-based service, you might survey all customers when they reach the six month mark in their annual subscription, regardless of whether they've contacted your company for support.
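The two rules above boil down to simple conditions. Here's a hedged Python sketch; the `should_send_survey` function and its parameters are hypothetical, since each platform (Zendesk included) exposes this logic through its own trigger and automation settings rather than code:

```python
from datetime import date
from typing import Optional

def should_send_survey(ticket_status: str,
                       subscription_start: Optional[date],
                       today: date) -> bool:
    """Offer a survey when a support ticket closes, or when an annual
    subscriber reaches roughly the six-month mark."""
    # Rule 1: survey once the ticket is closed, not after every interaction.
    if ticket_status == "closed":
        return True
    # Rule 2: survey subscribers at about six months, support contact or not.
    if subscription_start is not None:
        months_elapsed = (today - subscription_start).days // 30
        if months_elapsed == 6:
            return True
    return False
```

The point isn't this exact code; it's that whatever rules you choose should be explicit, so you can explain why any given customer did (or didn't) receive a survey.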

You can learn more about response rates and other survey-related topics here.

The Powerful Survey Feature That Drives Customer Loyalty

Improving loyalty is a big reason companies survey customers.

The challenge is finding ways to actually accomplish that goal. Customer service leaders tell me confidentially that analyzing survey data is a struggle. Getting leaders to take meaningful action is another tough task.

There's one survey feature that can immediately improve your results. Seriously, you could implement it today and start reducing customer defections.

What is it? 

It's the contact opt-in. Here's a run-down on what it is, why it's essential, and how to implement it immediately.

What is a Contact Opt-In?

A contact opt-in is a feature at the end of your customer service survey that allows customers to opt in to a follow-up contact.

The opt-in does three important things:

  • It allows you to follow up with an upset customer and save their business.

  • The survey itself remains anonymous, which is important to some customers.

  • The opt-in doesn't promise a contact; it just gives you the option.

Best of all, it's really simple. Here's a sample opt-in:

May we contact you if we have additional questions?

Just make sure you add fields to capture a customer's name and contact information if they say yes!

 

Why are Follow-ups Essential?

There's a widely held perception among customers that surveys are meaningless.

That's because we're inundated with survey requests, but we rarely see any meaningful changes as a result of our feedback. Many customers are convinced their feedback is routinely ignored. (Spoiler alert: they're right.)

A follow-up tells customers you're listening. It demonstrates caring and empathy. Some customers have told me they were surprised and amazed to get a follow-up contact!

Now here's the best part: you might even be able to solve the problem and save the customer!

Data provided by the customer feedback analysis company, Thematic, shows that customers who give a "0" rating on Net Promoter Surveys have a lot more to say in the comment section than customers who give other ratings:

Data source: Thematic

“Detractors across dozens of companies we’ve worked with complain about the inability to contact the company about an issue they have, lack of communication, or difficulty finding information on how to fix an issue themselves”, says Alyona Medelyan, CEO at Thematic. “We have also observed that many customers leave their full name, phone number or reference number in a free-text comment. Detractors are three times more likely to leave contact details than others.”

This presents customer service leaders with two choices:

You can ignore all that anger and wait for the customer to tell family, friends, and colleagues or you can contact the customer and try to iron things out.

 

How to Implement a Contact Opt-In

The process is very straightforward.

  1. Add a contact opt-in to the end of your survey.

  2. Review your survey for opt-ins (I recommend daily).

  3. Contact as many customers as possible, especially angry ones.

Through trial and error, I've found that a phone call often works better than email or other channels for following up. It's easier to have a dialogue if you catch the customer on the phone, and a surprising number of customers will call back if you leave a message with a phone number where they can reach you directly.

Here are a few other tips:

  • Empower your follow-up person (or team) to resolve as many issues as possible.

  • Use customer conversations to learn more about their situation.

  • Summarize feedback from customer follow-ups to identify broad trends.

 

Conclusion

Some leaders worry about the time required. If that's your focus, your head's probably not in the right place.

Here are three compelling reasons why you definitely have the time:

  1. Follow-up is optional. You don't have to contact every single customer.

  2. Saving customers can directly generate revenue and reduce servicing costs.

  3. Fixing chronic problems leads to fewer customer complaints in the long run.

Here are some additional resources to help you turn your survey into a feedback-generating, customer-saving, money-making machine:

Five Characteristics of a Powerful Customer Survey

Customers are constantly getting pummeled with survey requests.

We get them via email. They pop up when we visit a website. The auto mechanic pulls us aside after an oil change and begs us for a 10.

A 2016 study from Interaction Metrics found that more than 80 percent of America's top retailers offered a customer survey on purchase receipts. The study also found that most surveys were total garbage.

Most customer service leaders I know are concerned about their surveys. They recognize customers get too many. Leaders also aren't certain what to do with the data they're collecting.

This post aims to solve that problem. 

Below are five characteristics of a powerful customer survey. Use them to put your existing survey to the test. And, if you want more help, I'm willing to do an evaluation of your existing survey at no cost or obligation (details at the end of the post).

#1 Purpose

Always start with why. Understand why you want to survey your customers. Whenever possible, be specific.

Customer service leaders typically respond by saying, "We want to collect feedback." That's not enough. It doesn't provide clear direction because there's no action involved.

Here's a better reason I recently heard from a customer service leader:

Customer retention is a key driver of our company's success. We want to use our survey to help pinpoint the causes of customer churn.

See the difference? A clear purpose will help you use the survey to drive action.

 

#2 Choose the Correct Format

There's a lot of debate around which type of survey is best. Here are the three most popular:

  • Customer Satisfaction (CSAT): measures customer satisfaction with a product, service, or transaction.

  • Net Promoter Score (NPS): measures a customer's likelihood to recommend your product or service.

  • Customer Effort Score (CES): measures how easy it was for a customer to resolve their issue.

So here's a secret: there's no single survey type that's best!

Choosing the wrong survey type can yield less helpful data, so it's important to choose the correct survey type to match your goal.

A municipal utility probably shouldn't use an NPS survey because they have a monopoly on their service so generating positive word-of-mouth isn't the goal. The utility would be better off using a CES survey to find ways to serve their customers more efficiently.

Here's a primer that can help you decide which survey is best for your situation.

 

#3 Ask the Right Questions

A survey is only as useful as the questions it contains.

Most surveys contain too many questions. Those questions are frequently poorly designed and do little to reveal useful information.

You can ask better questions if you keep a few things in mind:

  • What's your purpose for doing the survey? (See #1 above)

  • What type of survey are you using? (See #2 above)

  • What will you do with the data?

If you don't know what you will do with the answer to a question, there's no need to ask it. In fact, I challenge my clients to use just three questions whenever possible:

  1. How would you rate (product, service, experience)?

  2. Why did you give that rating? (open text response)

  3. May we follow up with you if we have additional questions?

This short explainer reveals the rationale behind each of these questions (and why you usually don't need any more).

 

#4 Make Your Survey Easy

Offering a survey is really asking a customer to do you a favor.

The easier you make your survey, the more likely your customer is to do you that favor and to feel okay doing it. This means your surveys should follow a few simple principles:

  • Easy to access

  • Offered in a timely manner

  • Easy (and quick) to complete

A 2011 study from SurveyMonkey found that survey completion rates drop 5-20 percent once a survey takes 7+ minutes to complete. The same study discovered that's usually around 10 questions.

 

#5 Take Action

The number one survey gripe I hear from customers is the survey doesn't matter. 

Truthfully, they're usually right. Studies consistently show the vast majority of survey feedback is never acted upon.

You need to use surveys to drive improvement if you want to avoid wasting your customers' time. That means analyzing the data for trends and identifying opportunities for improvement.

Your survey serves no purpose if you're not doing that.

 

Resources

Here are a few more resources to help you improve your existing customer survey or implement a new one.

Training Video: Using Customer Surveys to Improve Service

If you don't have a subscription to either source, you can get a 30-day Lynda.com trial account by dropping my name.

You might also want to check out my customer service survey resource page.

Finally, here's my offer to review your survey:

Send your survey as a link or PDF file to jeff [at] toistersolutions [dot] com by June 30, 2017. In your email, answer these three questions:

  1. What is your objective for this survey?

  2. How are you offering the survey? (Ex: via email to customers who contact you)

  3. What are you doing with the survey data?

I'll respond with notes about your survey's strengths and some suggestions for improvement.

Study: Surveys On Store Receipts Are "Total Garbage"

We've all gotten a survey invitation on a store receipt.

A 2016 study from Interaction Metrics found that 41 of the 51 largest U.S. retailers included a survey invitation on the standard receipt. The surveys were evaluated to see how useful and engaging they were.

Not a single one was fully engaging and scientific.

The study also found that 68 percent of the surveys were "total garbage," meaning the surveys were so flawed they weren't worth the time required to complete them.

You can view the entire study here. Below is a summary of the results along with some action items and resources to help improve your organization's customer satisfaction survey.

How the Study Worked

The study assessed surveys based on four criteria. Each criterion was weighted to reflect its relative importance:

  • Access: Ease of locating and beginning the survey (5%)

  • Branding: Style reflecting the brand, correct spelling and grammar (10%)

  • Engaging: Keep customers engaged throughout the process (35%)

  • Accuracy: Survey design that yielded accurate data (50%)

The surveys were all obtained by making purchases from the retailer, either in store or online.

 

Accuracy Flaws Uncovered

Inaccurate data can prevent companies from taking the right action to improve service. 

Or worse, a survey might be gamed to yield high scores that disguise the fact that service needs to be improved.

Asking leading questions was one of the most prevalent flaws, showing up in 92 percent of the surveys examined. These are questions that are worded in a way that naturally leads customers to a particular answer. 

For example, Ace Hardware had this question on its survey:

How satisfied were you with the speed of your checkout?

The problem with a question like this is it assumes the customer was satisfied. This assumptive wording makes a positive answer more likely.

A more neutral question might ask, "How would you rate the speed of your checkout?"

Another issue was the use of overly positive wording that can bias a customer's response. The study found that 82 percent of surveys contained at least one question with overly-positive wording.

Here's an example from GAP:

The look and feel of the store was very appealing.

This question also suffers from vague wording. Does "look and feel" refer to branding such as signage, displays, and decor? Or does it refer to cleanliness and organization? Perhaps it means the store's layout?

Here's an example from the now-defunct Sports Authority, where a cashier biased the survey in another way. He stamped the expected response right on the invitation:

[Image: receipt with "Highly Satisfied" stamped next to the survey invitation]

Engagement Flaws Revealed

Surveys reflect on your company's brand.

They're part of the customer journey. Many retailers have made their surveys so needlessly long or aggravating that the survey itself reflects poorly on the brand, like this egregious example from Buffalo Wild Wings that required customers to navigate through 39 different screens!

The average retailer's survey had 23 questions.

That's a tedious number of questions to expect customers to answer. Nordstrom advertised its survey took just 2 minutes, but it contained 25 questions. The survey actually took 4 minutes to complete.

The study found that 13 percent of surveys were difficult to access. Walmart required not one but two receipt codes to be entered. Rite Aid, Ross, and Walgreens all had broken links.

The best surveys are short and easy to complete. In many cases, you can capture troves of useful data with just three questions.

 

Resources

There are many resources to help you develop, implement, and refine your customer service survey while avoiding these mistakes. Here are just a few:

9 Underhanded Ways to Boost Your Survey Scores

Updated: January 15, 2024

I'll never forget shoplifting class.

It was a workshop for associates at the retail chain I worked for in high school. The idea was to help us prevent shoplifting by showing us how shoplifters operated.

The class was amazing.

We learned advanced techniques used by professionals, such as how to defeat alarm sensors, conceal piles of merchandise, or confuse clueless salespeople.

Quite a few thefts were prevented as a result of the class.

I write this blog post in the spirit of that training. Many customer service professionals are willing to stoop to underhanded means to artificially boost survey scores.

This post will help you catch them.

Technique #1: Manipulate Your Sample

You can't survey everyone, so companies survey a small portion of their customers, called the sample.

Ideally, your sample represents the thoughts and opinions of all your customers. However, you can make a few tweaks to increase the likelihood that only happy customers are surveyed.

For example, you could survey customers who complete a transaction using self-service. You'll likely get high scores since self-service transactions are typically simple and you are only surveying people who succeeded. Customers who get frustrated and switch to another channel for live help won't be counted in this survey.

There are other ways to get higher scores by being selective about your sample.

  • Limit your survey to channels with simpler transactions, like chat.

  • Limit your survey to people who have contacted you just once.

  • Limit your survey to people who contact you for certain types of transactions.

 

Technique #2: Manually Select Respondents

Some employees can manually select survey respondents. This enables them to target happy customers while leaving out the grumpy ones.

The survey invitation at the bottom of a register receipt is an excellent example.

If a customer is obviously happy, the employee can circle the invitation, write down his or her name, and politely ask the customer to complete the survey.

What if the customer is grumpy? It's pretty easy to tear off the receipt above the survey invitation so the customer never sees it.

Look for any situation where employees have some manual control over who gets a survey. There's the potential for an employee to be choosy about who gets surveyed and who doesn't.

 

Technique #3: Survey Begging

This occurs when an employee asks a customer to give a positive score on a survey by explaining how it will directly benefit the customer, the employee, or both.

I've written about this scourge before, but it's worth mentioning again here. Employees beg, plead, and even offer incentives to customers in exchange for a good score.

In one example, a retail store manager offered a 20 percent discount in exchange for a perfect 10 on a customer service survey.

Technique #4: Prime Customers

Survey invitations can nudge customers to give a certain score. An invitation might use language like, “Tell us about our great service,” to get customers thinking more positively.

Other examples are more overt, like this vacation rental company:

The cashier at a sporting goods store primed customers by stamping register receipts with the requested survey response:

Image credit: Jeff Toister

Technique #5: Fake Surveys

Anonymous survey systems are easy to game. These include pen and paper surveys or electronic surveys that aren't tied to a specific transaction or customer record.

Unscrupulous employees have been known to enter fake surveys, complete with top ratings and glowing comments. They enlist their friends and family members to do the same.

 

Technique #6: Write Positive Survey Questions

Survey questions can easily be slanted to elicit more positive responses. Consider these two examples.

This question is positively worded. Notice that the threshold for giving a top rating of "Strongly Agree" is pretty low; the customer merely has to be satisfied with the service they received.

This question is neutral. It's more likely to get a lower overall rating even though the feedback may be more accurate.

 

Technique #7: Use an Even Scale

There's a long-running argument over whether customer surveys should have an odd- or even-numbered point scale.

An odd-numbered scale, such as 1 - 5, provides customers with the option to provide a neutral rating.

An even numbered scale, such as 1 - 4, forces customers to choose a positive or negative overall rating. 

More often than not, you'll tip customers into a positive rating by eliminating the mid-point.

 

Technique #8: Change Your Scoring Process

The exact meaning of a "customer satisfaction rate" is up for interpretation. You can interpret this loosely to increase your score.

Let's say you survey 100 customers using a scale of 1 - 5 with the following scale points and responses:

  • 1 = Highly Dissatisfied (2 responses)

  • 2 = Dissatisfied (6 responses)

  • 3 = Neutral (7 responses)

  • 4 = Satisfied (45 responses)

  • 5 = Highly Satisfied (40 responses)

You could report the score as a weighted average and call it 4.15 or 83 percent. Or, you could simply add the satisfied (45) and highly satisfied (40) customers and give yourself an 85 percent rating.

Even better, combine this technique with an even-numbered scale. Those same 100 customers might respond this way:

  • 1 = Highly Dissatisfied (2 responses)

  • 2 = Dissatisfied (8 responses)

  • 3 = Satisfied (50 responses)

  • 4 = Highly Satisfied (40 responses)

Suddenly, you can boast of a 90 percent customer satisfaction rating from the same group! This little bit of trickery just boosted the score by 7 percentage points.
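The arithmetic behind those numbers is easy to verify. This Python snippet reproduces both scoring methods from the odd-scale response counts above:

```python
# Responses from 100 customers on a 1-5 scale (from the example above).
responses = {1: 2, 2: 6, 3: 7, 4: 45, 5: 40}
total = sum(responses.values())  # 100 customers

# Method 1: weighted average of all ratings.
weighted = sum(score * count for score, count in responses.items()) / total
print(weighted)        # 4.15 out of 5
print(weighted / 5)    # 0.83, i.e. 83 percent

# Method 2: "top-two-box" — count only Satisfied (4) and Highly Satisfied (5).
top_two = (responses[4] + responses[5]) / total
print(top_two)         # 0.85, i.e. 85 percent
```

Same customers, same answers, and the reported "satisfaction rate" moves by two points just from the choice of formula. That's why it pays to ask exactly how a score was calculated before comparing it to anything.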

 

Technique #9: Adopt a Generous Error Procedure

Some people advocate rejecting surveys with an obvious error.

For example, let's say a customer rates your service as a 1, the lowest score possible, and then writes:

"Hands-down the best service ever. If I could give a higher score I would. I absolutely love their service!!"

You can reasonably conclude this customer meant to give a 5, not a 1. Using that logic, the survey could be removed. Some unscrupulous people might correct the score to a 5 (the highest score possible).

Because this technique manipulates scores directly, unethical service leaders might adopt a deliberately generous error procedure.

For example, a neutral score of a 3 combined with a mildly positive comment might be kicked out as an error or even adjusted to a 4 based on the comment. These adjustments can really add up and significantly impact an overall average!

 

Conclusion

I want to be clear that I don't advocate the use of any of these techniques.

The real purpose of a customer service survey is to gain actionable insight from your customers that allows you to improve service. You can't do that if you use these methods to artificially inflate your scores.

You can learn more about sound methods for implementing a customer service survey via this training video.

You'll need a LinkedIn Learning account to watch the entire course, but you can get a 30-day trial.

How to Use Surveys to Save Angry Customers

Companies that use customer service surveys fall into three groups.

The first is the majority. These companies just report the numbers. They don't really understand why they're surveying their customers, they just know that higher numbers are good.

Unfortunately, you really haven't learned anything if all you know is your Customer Satisfaction (CSAT) score is 85 percent one month and 86 percent the next. 

The second group uses their survey data to identify actionable insight. This group knows why CSAT moved from 85 percent to 86 percent. They also have a clear idea of how to get it to 87 percent next month.

The final group uses their survey to identify actionable insight, but they also use it to connect with individual customers.

This group knows that if 85 percent of customers were satisfied, then 15 percent were not. They want to find that 15 percent and help them before they take their business to a competitor.

This post explains how you can be a part of that third group too.

Why Yelp is (Almost) the Perfect Survey System

Take a moment to consider the beauty of Yelp.

Yes, it has some flaws that businesses don't like. The reviews are public (scary!), some of the reviews are fake (true story), and most people leave negative reviews (patently false).

Yelp also has a simple design that can give you a lot of feedback.

First, it asks customers to give a single rating. There's no convoluted mess of 36 different dimensions that will never be read or analyzed. Just one rating. One to five stars, with five being best.

Do you think people would write Yelp reviews if they had to answer 36 questions? Not a chance.

Next, Yelp asks customers to explain their rating in the comment section. The beauty of this is you can do some basic text analysis to understand why someone would give you a five star rating versus a three star rating.
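That basic text analysis can be as simple as comparing word frequencies across rating tiers. Here's a minimal sketch; the reviews and stop-word list are made up for illustration, and in practice you'd pull the comments from your own survey or review export.

```python
from collections import Counter
import re

# Hypothetical reviews as (rating, comment) pairs.
reviews = [
    (5, "Fast, friendly service and fair prices"),
    (5, "Friendly staff fixed my flat fast"),
    (3, "Decent work but a long wait"),
    (3, "Long wait and nobody explained the delay"),
]

STOP_WORDS = {"a", "and", "the", "my", "but"}

def word_counts(comments):
    """Count non-stop-words across a collection of comments."""
    words = re.findall(r"[a-z']+", " ".join(comments).lower())
    return Counter(word for word in words if word not in STOP_WORDS)

five_star = word_counts(c for rating, c in reviews if rating == 5)
three_star = word_counts(c for rating, c in reviews if rating == 3)

print(five_star.most_common(3))   # "fast" and "friendly" top the list
print(three_star.most_common(3))  # "long" and "wait" top the list
```

Even at this crude level, the contrast between the two lists tells you what earns five stars (speed, friendliness) and what drags ratings down to three (waiting).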

Best of all, Yelp allows you to close the loop with your customers.

You can follow up with the customer in private to (hopefully) resolve their issue. You can also respond to their review publicly so other customers know you're listening.

In many ways, Yelp emulates the ultimate three-question survey. In fact, the biggest problem with Yelp, as I see it, is that most businesses don't get enough reviews.

 

Creating Your Own Better Yelp Model

You can easily create a survey that includes Yelp's best features.

Unlike with Yelp, you will likely get a lot more responses, and the results will remain private unless you choose to release your data to the world.

Here's a sample survey:

A survey like this can yield lots of useful data without burdening your customers with unnecessary questions. You just need to know how to analyze it. 

Fortunately, you can use this handy guide.

Notice the third question allows customers to opt in for follow-up contact. This is the linchpin that can allow you to identify and follow up with angry customers.

For example, you can set a rule that any customer who gives a rating of three or lower gets a follow-up contact. (Provided, of course, that the customer opted in.)
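A rule like that is straightforward to automate. Here's a minimal sketch; the field names and sample responses are assumptions, not part of any particular survey tool.

```python
# Flag any opted-in customer who rated the experience 3 or lower for follow-up.
FOLLOW_UP_THRESHOLD = 3

# Hypothetical survey responses.
responses = [
    {"customer": "Ana",  "rating": 2, "opted_in": True},
    {"customer": "Ben",  "rating": 5, "opted_in": True},
    {"customer": "Cruz", "rating": 1, "opted_in": False},  # no permission to contact
]

def needs_follow_up(response):
    """True when the rating is low enough AND the customer agreed to be contacted."""
    return response["rating"] <= FOLLOW_UP_THRESHOLD and response["opted_in"]

follow_up_queue = [r["customer"] for r in responses if needs_follow_up(r)]
print(follow_up_queue)  # ['Ana']
```

Note that Cruz gave the lowest possible rating but stays off the queue because the opt-in is missing; contacting customers without permission would undermine the trust the survey is trying to build.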

This follow-up can yield all sorts of great things:

  • You might fix the problem.

  • You might save the customer.

  • You might gain additional insight.

There's also a bonus.

One data analyst at a large company confided in me that customers who received a follow-up contact generally gave top scores on their next survey. So, closing the loop with angry customers can be really, really good for your overall survey score.

Let's not forget that our executives really do care about that score.

 

Resource

You can learn more about creating customer service surveys by watching this training video on LinkedIn Learning (subscription required).

How to Stop Employees From Survey Begging

We've all experienced survey begging.

Sometimes, employees offer an incentive. My nephew was recently offered free food in exchange for giving a fast food restaurant all 10s on their survey.

Other times, employees try to pull on our heartstrings. They tell customers they'll get in trouble if they don't receive a good score.

My friend Halelly recounted a recent experience taking her car into the dealership for service. "The (service advisor) coached me in person when I got the car serviced and has now sent me this email too."

The email warned Halelly that she would be getting a survey from the dealership and possibly the manufacturer. The advisor wrote:

We would greatly appreciate your time to complete the surveys. Anything other than all 10's is considered a fail.

This post explains why employees engage in survey begging. It also explains how you can stop them from this annoying habit.

Survey Begging Defined

Here's my definition of survey begging:

Asking a customer to give a positive score on a survey by explaining how it will directly benefit the customer, the employee, or both.

Here are a few examples:

  • Offering customers discounts in exchange for a good score

  • Telling customers a bad survey will get you fired

  • Displaying "We strive for five" or similar signs

  • Directly asking customers for a positive survey score

  • Ignoring actual feedback that's not attached to a positive score

Side note: this definition is a first draft, so I welcome your feedback!

 

Why It's a Problem

Survey begging causes two problems.

First, it's annoying. Customers don't like being begged and cajoled into giving a survey score. This practice reinforces the perception that companies aren't really using voice of customer data to improve service.

The second problem is survey begging can cover up real service issues by artificially inflating scores. Customers might start spending less or stop doing business with a company entirely, without the company ever understanding what's causing the problem.

In other words, survey begging defeats the purpose of using a survey.

 

Why Employees Survey Beg

It's all about incentives.

Employees engage in survey begging because they have a clear incentive to achieve a high score or a strong incentive to avoid getting a low score.

Some employees have bonuses tied to their average survey score. This incentivizes them to ask customers for good scores because those positive surveys are literally adding to their paycheck. A slightly negative, but truthful survey might prevent an employee from earning their bonus.

Other employees can face disciplinary action if they receive too many low scores. One automotive service advisor told me he only pushes the survey to customers he thinks are happy because he could lose his job if he gets too many low scores.

Survey begging happens in many industries, but it's a particularly big problem in the automotive sector. Here's a great article on Edmunds.com that explains why.

The bottom line is if you want to stop the begging, you need to remove the begging incentive.

 

Getting Rid of Incentives

Many customer service managers are reluctant to get rid of survey incentives.

They operate under the false assumption that employees need these incentives to be motivated. There's a mountain of evidence that shows this isn't true. In fact, the number one motivator for customer service employees is being able to help their customers.

I wrote about a great example of this in my book, Service Failure. The Westin Portland was achieving consistently high guest service scores. Then-General Manager Chris Lorino explained that part of their success came from a resistance to implementing survey score incentives.

Instead, the hotel made guest service a core part of each associate's job. Here's an excerpt from the book:

"Associates coach and encourage each other to deliver high levels of service that will help them achieve their (guest satisfaction) goals. The hotel's leadership team regularly discusses guest feedback with the associates and encourages people to share ideas that will improve service even further."

Other managers are concerned that eliminating incentives makes it difficult to monitor employee performance through survey scores.

The problem is survey begging artificially inflates survey scores, so you end up rewarding employees who are best at begging, not best at service. 

A better approach is to use survey feedback to manage behaviors. For example, if an employee frequently gets surveys saying they are a little abrupt, you can coach them on ways to create a better impression.

How Rating Your Customers Can Change Service Perceptions

Surveying your customers can bring some interesting benefits.

You can gain valuable insight that allows you to improve service. And, as I noted in a recent blog post, just asking for feedback might increase loyalty and spending.

There's another trend that's worth watching. Survey scores appear to rise when customers are also rated.

This post explores how this might be happening.

Who Is Rating Customers?

There are at least a few companies doing it now, including Uber, Lyft, and Airbnb. eBay offered this feature until discontinuing it in 2008.

The idea behind rating consumers is to encourage better behavior. The Uber website explains:

"The rating system works to make sure that the most respectful riders and drivers are using Uber."

Uber also posted an explanation on their blog that indicated passengers with low ratings might not be able to continue using their service.

The concept appears to work to a certain extent. A recent New York Times article explored several examples where passengers made a point to be more polite when they were using Uber.

 

How These Ratings Change Perceptions

There may be a downside to rating consumers.

A Boston University study compared ratings for vacation rental properties that are evaluated on both Airbnb and TripAdvisor. Airbnb allows hosts to rate their guests, while TripAdvisor does not.

The results? Ratings on Airbnb averaged 14.4 percent higher than the same properties on TripAdvisor. 

The gap suggests that knowing they too will be rated affects these guests' ratings, but it's not clear exactly why. There seem to be a few likely explanations:

  • Airbnb reviewers are naturally more lenient than TripAdvisor reviewers.

  • Airbnb reviewers rate higher because they know they'll be rated.

  • TripAdvisor reviewers give harsher ratings because they don't face any consequences.

 

Conclusions

This could be a trend to watch. I'm a big proponent of civility. It's important that we try to be kind and respectful to the hardworking people who serve us. 

If rating customers helps this, I'm all for it. On the other hand, I'm wary of any move that artificially manipulates survey scores and prevents problems from being solved.

Where do you come out on this?