Why all customer service surveys really measure just one thing

There's a good chance you and your colleagues have had a tortured conversation about customer service surveys.

  • What type of survey is best?

  • How many questions should it include?

  • Are the scores even fair?

That last one is a doozy. Executives worry whether a customer upset about a defective product will “unfairly” give the customer service team a low score on its post-transaction survey. As if the survey is somehow about assigning credit rather than getting unvarnished, actionable feedback from customers.

Often lost in these discussions is the one and only true purpose of surveying customers. Customer service surveys are good at measuring just one thing: sentiment.

Here's an overview of what your survey should not try to measure, and why sentiment is all you really need anyway.

What customer service surveys should not measure

Would you use a hammer to change a light bulb? Probably not—the result would be disastrous. A hammer is a good tool, but it's the wrong tool for changing a light bulb.

Surveys are often misused in the same way. Here are a few common survey errors.

Error #1: Fact-finding. Surveys shouldn't ask customers for facts, such as how long a customer had to wait to be served. The simple explanation is that customers aren't good at remembering facts, so these questions quickly skew your results.

Error #2: Isolation. Some surveys try to isolate customer service issues from other problems, such as defective products, late shipments, or service failures. It's a very company-centric approach that attempts to assign credit (or blame) to various departments without recognizing that all of these functions play a role in the customer experience.

Error #3: Granularity. Many surveys are loaded with 30 or more questions in an attempt to dig deep into the customer's journey. The problem here is two-fold. One, the survey itself becomes a bad experience for the customer. Two, those 30 exhausting questions don't necessarily get at what the customer truly cares about. That’s what the open comments field is for.

Try as you might, surveys just don't measure these things well. There are lots of other tools and techniques to gather essential customer experience insights that surveys don't capture. For example, what customers tell your employees directly is a potential goldmine.

What surveys do measure is how a customer feels. Just like a hammer is great at pounding nails, surveys are a good tool for gauging sentiment.

Why sentiment is an essential insight

Sentiment is a measure of how customers feel about your product, service, or company. Those emotions influence how customers interact with brands in a number of ways, from purchase decisions to word-of-mouth advertising.

A good customer service survey identifies customer sentiment and then helps uncover what's driving those feelings.

  • Why do customers love us?

  • Why do customers loathe us?

In most cases, you don't need more than two simple questions to uncover how your customer feels.

  1. A rating question.

  2. An open-comment question so customers can explain their ratings.

That's the simple design of most online review platforms. You give a star rating and then write about why you gave that rating. It's up to you as the customer to explain why you feel the way you feel in your own words.

This design also works no matter what type of survey you use. (Here’s a video guide to survey types.)

Compare the two-question design to the typical, bloated survey that constrains customers with 30 nonsense questions about wait times, employee greetings, and whether you would hire the person who served you that day. The two-question survey is easier for customers to complete and easier for you to analyze.

What you can do with customer sentiment

Capturing customer sentiment allows you to do two big things. The first is finding the pebble in your customers' shoes, so you can remove it. The second is discovering what makes your customers truly fall in love with your brand, product, or service, so you can deliver that more consistently.

One client used their two-question survey to identify a process that truly annoyed their customers. It was the one thing that was consistently mentioned in negative surveys. So the client investigated the issue and improved the process.

Complaints quickly decreased. Even better, the new process was far more efficient, saving my client valuable time.

The same survey revealed that the happiest customers knew an employee by name. They had made a personal connection with someone, and felt that employee was their advocate. My client leveraged that strength and encouraged all employees to spend just a little extra time connecting with customers on a personal level.

Customer satisfaction rose again. That extra time connecting with customers also dramatically reduced time-consuming complaints and escalations.

Analyzing your surveys doesn't require advanced math, sophisticated software, or hours of time. Here's a guide to quickly analyzing your survey results.

Conclusion

Customer service surveys get much shorter and far more useful when we remove all the nonsense. Focus on learning how your customers feel and why they feel that way, and you'll have incredibly useful information.

I've put together a resource page to help you learn more.

LinkedIn Learning subscribers can also access my course, Using Surveys to Improve Customer Service. A 30-day trial is available if you're not yet a subscriber.

5 types of misleading data that hurt customer service

How long does it take to create a merchandise display?

This was an important question for a retailer with thousands of stores. Each week, stores received instructions to create new merchandise displays along with the estimated time they took to build.

Staffing decisions were centralized. The corporate office estimated how much labor each store needed for the week and created a staffing plan. The estimated time to create new displays was included in the plan.

The estimates were almost always wrong.

In reality, the displays took much longer to build than planned. Managers weren't allowed to deviate from the staffing plan, so they pulled employees away from other tasks like helping customers.

The retailer was plagued by many examples like this, where misleading data led to poor decisions. It eventually went bankrupt.

You can avoid a similar fate by identifying misleading data. Here are five types to watch out for.


#1 Anecdotes

Stories can be powerful ways to communicate ideas, but they can also be misleading.

One CEO expressed confidence that his business was customer-focused because he had recently received several compliments from friends. These stories were reassuring, and the CEO rebuffed attempts to find more data about service quality.

Had the CEO dug deeper, he would have discovered a growing number of unhappy clients. There were critical gaps in hiring, training, and customer service that needed to be addressed.

Without data to support this, the CEO refused to invest in necessary improvements. The company was soon blindsided by a wave of lost business that it should have seen coming.

Lesson: Always look for data to prove or refute anecdotal evidence before making critical decisions.

#2 Generalizations

Data is often generalized. This means it is accurate in many cases, but might not be accurate in others.

Years ago, many customer service teams shifted from one computer monitor to two. The rationale was that "studies show" two monitors improve productivity.

This was true in many cases, but not always.

Those studies were largely commissioned by companies that made monitors. The authors had a vested interest in downplaying another discovery: one monitor is sometimes better than two.

This meant that some customer service teams hurt productivity when they added a second monitor.

Lesson: Make sure data is applicable to your situation before using it.

#3 Contextless

Data gains new meaning when you add some context around it.

One CEO balked at a plan to raise wages for customer service employees from $12 to $14 per hour. What the CEO saw was a 17 percent increase at a time when the company was already paying 50 percent more than the minimum wage ($8 per hour).

Graph showing that a proposed wage increase for customer service employees from $12 to $14 per hour would be a 17% jump.

The problem with this analysis is it didn’t consider the broader job market. Prospective employees might be considering job offers from other companies.

See what happens when you put the proposed increase in context with the overall job market.

Graph showing the market rates for similar jobs. The current rate of $12 is at the very low end of the pay scale. The proposed $14 rate would be slightly below average.

The company was paying well below the market average for comparable jobs. Raising the starting wage to $14 per hour would still be below market, but it would give the company access to more talent.

There was one more piece of context that was helpful.

The CEO's chief concern was converting customer service inquiries into sales. A quick calculation showed that a 6 percent improvement in the conversion rate would pay for the $2 per hour wage increase.

Armed with more context, the CEO agreed to raise wages. It quickly became easier to hire good employees. Thanks to better hires, the sales conversion rate increased 36 percent.
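If you want to run that kind of sanity check on your own numbers, a quick back-of-the-envelope calculation is all it takes. Here's a rough sketch; the client's actual inquiry volume and margins weren't published, so the inputs below are placeholder assumptions.

```python
# Back-of-envelope check on the wage decision. The client's real numbers
# aren't in the article, so inquiries_per_hour and gross_profit_per_sale
# are placeholder assumptions.
raise_per_hour = 14.00 - 12.00            # proposed $2/hour increase
raise_pct = raise_per_hour / 12.00        # ~0.17, the 17% jump the CEO focused on

inquiries_per_hour = 4                    # service inquiries an agent handles per hour (assumed)
gross_profit_per_sale = 10.00             # profit when an inquiry converts to a sale (assumed)

# Extra profit per hour from each additional percentage point of conversion rate
profit_per_point = inquiries_per_hour * gross_profit_per_sale * 0.01

# Conversion improvement (in points) needed to cover the raise
break_even_points = raise_per_hour / profit_per_point
print(f"Raise: {raise_pct:.0%}, break-even conversion lift: {break_even_points:.1f} points")
```

With those assumed inputs, a five-point lift in conversion covers the raise; plug in your own volume and margin figures to see where your break-even sits.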

Lesson: Add context to give data more meaning.

#4 Manipulated

Data is often manipulated by unscrupulous employees who have a direct incentive to make it look better.

Some cashiers emphasize a survey to happy customers, but fail to mention it when a customer is grouchy. This manipulation is one of at least nine ways unscrupulous employees can boost survey scores without providing better service.

The end result is voice of customer data that makes customer service look far better than it really is. Meanwhile, complaints go unnoticed or unreported.

Why do employees do this?

Because they either get an incentive for good survey scores or they get in trouble if too many customers complain. Sometimes, it's both.

Lesson: Avoid giving employees incentives that might cause them to manipulate data.

#5 Perfect

Some executives won't use data to make a decision until they're convinced the data is perfect. Unfortunately, perfect data does not exist.

First contact resolution (FCR) is a great example.

The idea is to solve customer issues on the first contact so the customer doesn't have to repeatedly contact a company. Repeated contacts frustrate customers and cost the company extra money.

The challenge is that FCR is notoriously difficult to measure.

  • What counts as a repeat contact?

  • Can those contacts be easily tracked?

  • How can you tell a repeat contact from a contact about a new issue?

  • How do you determine whether an issue is fully resolved?

  • What if a customer uses multiple channels to contact a company?

The trouble with FCR is that you can't craft a perfect measure, but you can still work toward improving it by taking a few specific actions.

Lesson: Don't wait for data to be perfect because you'll end up doing nothing.

Conclusion

Executives are easily deceived when they don’t question the data they use to make decisions. When in doubt, ask a few more questions.

For example, think about the time it takes to make retail store displays. A few questions could have fixed the company's data problem:

  • Are we accurately estimating how long it takes to build the displays?

  • What's the impact of an inaccurate estimate?

  • How can we improve the accuracy of our estimates?

You can imagine a different outcome if someone had asked those questions.

What questions should you ask on a customer survey?

Updated: June 12, 2023

Customer service surveys are too long.

Some have 10, 20, or even 30 questions. I've seen one with over 100. It takes customers a long time to answer that many questions.

This causes a few problems:

  • Customers get annoyed.

  • Many people abandon the survey.

  • You get a lot of data that isn't useful.

There is a solution. 

I'm going to show you how to dramatically shorten your customer service survey. Shorter surveys are easier for customers to complete and far less annoying. 

You'll also get better, more useful data.


What is the purpose of a customer service survey?

A survey should help you identify actionable customer feedback. It should help you spot problems so you can fix them. The survey should also let you know what's working, so you can keep doing those things well.

Long surveys often lack a clear purpose. The survey gets bloated with irrelevant questions that someone thinks might be somehow useful.

Here are a few discussion questions that will help you understand your survey’s purpose:

  • Why do you want to survey your customers?

  • What do you hope to learn from them?

  • What will you do with this data?

Knowing the answers to these questions can help you focus your survey and make it shorter. Here's a short video that can help you.

What can you learn without a survey?

You can often get data about your customer’s experience without relying on a survey. Getting this data from other sources allows you to eliminate survey questions.

Restaurants and retail stores typically include a survey invitation at the bottom of the receipt. Many of those surveys ask you to identify information that’s already known:

  • Store location

  • Time of day

  • Items purchased

Those questions can be eliminated if you tie that data to the survey on the backend.

This has an added benefit—customers have notoriously faulty memories and often make mistakes when answering these questions.
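One common way to do this is to pass the details you already know along with the survey link itself. Here's a minimal sketch of that idea; the URL and parameter names are hypothetical, though most survey platforms support custom fields or URL variables like this.

```python
# Minimal sketch of passing known transaction data to a survey on the backend,
# so the survey never has to ask for it. The base URL and parameter names are
# hypothetical placeholders.
from urllib.parse import urlencode

def build_survey_link(base_url, store_id, transaction_id, purchased_at):
    params = {
        "store": store_id,        # known from the point-of-sale system
        "txn": transaction_id,    # ties the response back to the receipt
        "ts": purchased_at,       # time of day, no need to ask the customer
    }
    return f"{base_url}?{urlencode(params)}"

print(build_survey_link("https://survey.example.com/feedback", "0417", "A98321", "2023-06-12T14:05"))
```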

There's another source of data you might be overlooking if you survey customers after they contact your customer service department: your customers' own words.

Customers give direct feedback when they call, email, chat, tweet, or use any other channel to complain or get help.

Check out this interview with customer experience expert Nate Brown, where he shares a simple way to collect and analyze this feedback.

What are the best questions to ask on a customer service survey?

You can get plenty of actionable data from your customers with just two questions. A two-question survey is easy on your customers and makes analyzing the data a breeze.

Here are the two must-have questions:

  1. Rating scale

  2. Free text explanation

The rating scale can come from any survey type; functionally, they're very similar. (Read more on different survey types here.)

This Net Promoter Score (NPS) survey from Suunto is a great example. The survey was sent six months after I registered a new watch, which gave me enough time to really experience using it on a daily basis.

Net Promoter Survey

The survey has a clear goal to identify what causes people to spread positive or negative word-of-mouth about Suunto and its products.

For example, I answered 9 to the first question (which means I'm a promoter), but I also used the free text question to describe a small issue I had with the battery life indicator on my watch.

You can use the same two question approach with most common survey types:

  • Customer Satisfaction (CSAT)

  • Customer Effort Score (CES)

  • Net Promoter Score (NPS)

The rating scale tells you if the customer is happy, neutral, or upset. You can use the comments to learn more about individual customers, or search the text for trends.

Here's an example of a great NPS survey from Ecobee. It starts with a rating question:

Ecobee NPS survey.

The rating you give then triggers a comment box that asks you to provide more detail:

Feedback box on an NPS survey.

Ecobee's Senior Director of Customer Experience and Operations, Andrew Gaichuk, told me he and his team analyze survey comments to identify trends.

"We define trends through key words such as Customer Service, Installation, Wifi, etc. to help narrow down what key issues customers are experiencing so we can action it for future improvements. For example if we see any detractor for 'Customer Service' we can investigate the interaction, determine the issue and provide one on one coaching/feedback with the CSR."

Analyzing survey comments like this is surprisingly easy.
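Here's a rough sketch of what that kind of keyword tagging can look like. The themes and sample responses are invented for illustration, not Ecobee's actual categories.

```python
# Sketch of keyword-based trend analysis: tag each survey comment with simple
# themes, then count which themes show up most among detractors.
# Themes and comments below are made up for illustration.
from collections import Counter

THEMES = {
    "customer service": ["support", "agent", "customer service", "phone"],
    "installation": ["install", "wiring", "setup"],
    "wifi": ["wifi", "wi-fi", "connection", "connect"],
}

def tag_comment(comment):
    text = comment.lower()
    return [theme for theme, words in THEMES.items() if any(w in text for w in words)]

responses = [
    (3, "Setup was painful and the install instructions were confusing"),
    (9, "Support agent was great, solved my issue fast"),
    (4, "Device keeps dropping the wifi connection"),
]

detractor_themes = Counter()
for score, comment in responses:
    if score <= 6:  # NPS detractors
        detractor_themes.update(tag_comment(comment))

print(detractor_themes.most_common())  # e.g. [('installation', 1), ('wifi', 1)]
```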

Suunto and Ecobee can identify the specific customer giving the feedback because the survey is triggered when a customer registers a new product. This allows them to follow up with customers if there's a problem, or to ask more questions if they want additional information.

Companies don't always have access to each customer's contact information. You can add an optional third question if that's your situation. The third question allows customers to opt-in to a follow-up contact.

Here’s a sample CSAT survey that contains the third question:

Three question survey.

Can you have more than three survey questions? 

The short answer is yes, but think carefully before making your survey any longer. Here are a few things to consider:

  • Do you have a clear purpose for asking this question?

  • Is this the only way to get the answer?

  • Do you have a plan to use the data you collect?

If you answered "no" to any of the above, you probably don't need the question on your survey.

One survey mistake that adds extra questions is to make assumptions about what’s important to your customers. For example, a restaurant might ask questions like these:

  • “Were you greeted promptly?”

  • “Was your order correct?”

  • “How would you rate the food quality?”

Those may or may not be the issues your customers truly care about. You can focus on what truly matters to customers by analyzing the comments in your two-question survey.

Most online review sites use the same two-question format.

For example, I was able to review the Yelp comments for a popular San Diego restaurant and quickly learned that reservations were the number one service issue.

Take Action

Asking a customer to take a survey is like asking them to do you a favor. It's a good idea to make that favor as easy to grant as possible.

Here are some additional resources to help you improve your customer surveys:

How Surveys Can Make Service Failures Worse

My local car dealership struggles with service.

On multiple occasions, I've arrived for an appointment only to learn a needed part didn't arrive as expected. That meant I had to drive home and come back another day.

The mechanic once badly scratched my car's front fender and didn't say anything—I noticed the damage just as I was getting in the car. The dealer fixed it, but my car was in the body shop for a few days.

A recent experience was the last straw.

I called to make a service appointment and asked how long it would take. The employee informed me it would be two hours, but when I arrived, the service advisor told me it would take four hours. 

That was time I didn't have.

You'd think the dealership would be interested in learning from mistakes and finding a way to keep my business.

In reality, what matters most to the dealer is my survey score. An employee has directly asked me to give a good rating on their survey after every one of these service failures.

Unfortunately, they aren't alone. Here's how surveys can make service failures worse. 


Does your survey focus on the wrong thing?

Surveys should focus on the experience itself, not just the customer service employees who are there to help when things go wrong.

For example, I recently bought an inflight internet pass to use while I was flying cross country. The internet was spotty the entire trip and my connection repeatedly dropped, so I emailed customer service to ask for a refund.

The customer service rep responded quickly and offered a credit for a future flight, which I accepted as a fair compromise.

I received a survey the next day. It asked me to evaluate the support employee, but not my experience using the company's product.

From a customer perspective, I had already shared all the feedback the company needs to know:

  • The service failure itself

  • My satisfaction with the resolution

We had ended the poor experience on a high note. Now the survey reminded me of the bad experience all over again. As Shep Hyken recently wrote, the survey shouldn’t be the last thing the customer remembers about you.

The day after the latest service failure at the dealership, I received this text from my service advisor:

Text message sent from service advisor at car dealership.

The message was clearly automated, but it still comes across as completely oblivious.

  • It’s an example of survey begging.

  • I already shared my feedback with the service advisor directly.

  • He knows it wasn't an exceptional service experience.

You can prevent this problem by establishing a clear purpose before creating a survey. Understand who you want to survey, why you want to survey them, and what you plan to do with that information.

This short video explains how to set a survey goal.

Should you even send a survey?

There are situations when a survey is a bad idea. For example, some companies send a survey after each customer service interaction. That could really infuriate a customer who has to contact support multiple times to resolve the same issue.

That text from my service advisor was another poor example. It rehashed the memory of the service failure and made it even fresher in my mind.

Our ensuing text exchange tells me he still doesn't get it.

Text conversation with service advisor at car dealership.

The last thing I said to him when we spoke in person was, "I'm tired of you wasting my time. I'm taking my business to another dealer."

He still hasn't apologized. Now he's inviting me to come back in like nothing happened?!

The good news is many customer service survey platforms can be configured with rules that determine when to send a survey and when not to. For example, you can:

  • Limit the number of surveys a customer receives in a given time period

  • Avoid sending multiple surveys for the same issue

  • Prevent surveys from being sent when they're not warranted

This last one is tricky.

Some companies have found that unscrupulous employees will prevent surveys from going out just to keep their average higher. I never received the promised survey from the dealership, which leads me to believe that’s what’s happening here. The service advisor anticipates a low score and might have prevented it from going out.
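If you're building your own survey triggers rather than relying on a platform, rules like the three above boil down to a simple check before each send. Here's a sketch; the field names and thresholds are assumptions, not any vendor's actual settings.

```python
# A rough sketch of survey throttling rules. Field names and thresholds are
# assumptions, not any particular platform's API.
from datetime import datetime, timedelta

def should_send_survey(customer_id, case, sent_log, now=None):
    now = now or datetime.now()
    history = sent_log.get(customer_id, [])

    # Rule 1: limit surveys per customer in a rolling 90-day window
    recent = [s for s in history if now - s["sent_at"] < timedelta(days=90)]
    if len(recent) >= 2:
        return False

    # Rule 2: avoid sending multiple surveys for the same issue
    if any(s["case_id"] == case["id"] for s in history):
        return False

    # Rule 3: skip surveys that aren't warranted, e.g. the issue is still open
    return case["status"] == "resolved"

sent_log = {"cust-42": [{"case_id": "C-100", "sent_at": datetime(2024, 5, 1)}]}
print(should_send_survey("cust-42", {"id": "C-101", "status": "resolved"}, sent_log))
```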

Does your survey inspire action?

It's frustrating for customers to give the same feedback over and over again. You need to use your surveys to identify issues and take action to fix them. Otherwise, you're just wasting your customers' time.

You probably see a lot of survey invitations at the bottom of receipts. A 2016 study from Interaction Metrics found that 68 percent of those surveys “are total garbage.” The questions are so manipulative and the surveys so badly designed that they yield little useful information.

I once spoke with an executive who proudly announced her company had implemented a new survey. "What are you doing with the data?" I asked.

She explained that the survey scores were shared in a monthly executive meeting. There was a long pause as I waited for her to continue. No. That was it.

The survey was a waste of time in a number of ways:

  • It didn't have a comment field so customers could explain their ratings.

  • The company wasn't analyzing survey data to identify trends.

  • Nobody was taking any action to improve service.

My local dealership has experienced the same issue.

I’ve directly shared my concerns about service quality multiple times. The service advisor knows about it. Several of his colleagues do, too. I’ve talked with at least two of his bosses.

And yet, after every poor experience, someone awkwardly approaches me and asks me to be nice on the survey. Meanwhile, nothing gets better.

The sad part is the issues are fixable.

I called another dealership to book an appointment for the service my car still needed. The employee was careful to advise me that the appointment would take approximately four hours, and he gave me the option to wait, get a loaner car, or have Uber take me somewhere.

Take Action

Now is a good time to take a hard look at the surveys you offer. Ask yourself:

  • Why are we surveying our customers?

  • How are we using this data to improve the experience?

  • What aspects of these surveys could be annoying?

You can learn more about customer service surveys and get tools to create a great one on this resource page.

How to Increase Survey Response Rates by 370%

Andrew Gilliam, ITS Service Desk Consultant


Small changes can often lead to big results.

Andrew Gilliam is an ITS Service Desk Consultant at Western Kentucky University. He improved the response rate to customer service surveys by 370 percent simply by changing the wording of the survey invitation email.

I interviewed Gilliam to learn how he did it. He offers a lot of helpful, actionable advice in this short, 20-minute interview.

Topics we cover include:

  • Why you should survey both internal and external customers

  • What constitutes a "good" response rate

  • How to improve your survey invitation email

  • What types of customers typically complete surveys

  • Why you need feedback from angry, happy, and even neutral customers 

You can watch the full interview here. Follow Gilliam on Twitter at @ndytg or contact him via his website.

I Took Every Survey For a Week. The Results Weren't Good.

Customers are inundated with surveys.

We get them on receipts, via email, and in the mail. Shop somewhere and you're asked to take a survey. Don't shop somewhere, and a survey still appears. Visit a website and, ping, you get asked to take a survey.

I decided to take a week and do a small experiment. During that week, I would take every single survey I was asked to complete. The idea was to test three things:

  1. How many surveys would I be offered?

  2. Were any of the surveys well-designed?

  3. What was the experience like?

I was asked to complete 10 surveys during the week. That pencils out to over 500 surveys per year! No wonder customers experience survey fatigue.

Only one of the 10 surveys was well-designed. Every other survey had at least one glaring flaw, and most had multiple failures. More on that in a moment.

And what was my experience like? Most of the surveys backfired. The experience was so poor it made me like the company even less.


Surveys Are Too Difficult

When you ask a customer to take a survey, you're really asking the customer to do you a favor. A lot of the surveys I took made that favor really difficult.

Just accessing the surveys was a big challenge. 

My first survey request was on a receipt from the post office. The receipt had a QR code that I was able to quickly scan with my phone, but then the survey site itself was not optimized for mobile phones.

A survey from Dropbox wanted me to first read and acknowledge a confidentiality agreement before completing its survey.

Confidentiality agreement required to take the Dropbox survey.

The super odd thing was that the confidentiality agreement had its own survey! This extra bit of aggravation got even more annoying when the survey required me to fill out the comments box to explain my rating of the confidentiality agreement.

Survey requiring a comment.

Back to the first Dropbox survey: I was 11 minutes in when I hit an infinite loop. None of the answers to a required question applied to me, and there was no “Not Applicable” option. I felt I had put in enough time at that point and just gave up.

The survey invitation from Vons, my local grocery store, was a real piece of work. It was a receipt invitation, but there was no QR code, so I had to manually enter the web address. Then I had to enter a string of numbers along with my email address!

Vons survey invitation page, which requires an email address.

I couldn't complete two surveys due to errors. An email invitation from Chewy linked to a web page that I couldn't get to load. The Human Resources Certification Institute sent me a survey on May 24 that closed on May 23. Completing that survey is pretty low on the list of things I would do if I had access to a time machine.

Poor Survey Design

Beyond being difficult, just one of the ten surveys was designed well enough to provide useful, actionable, and unbiased information.

Many surveys were too long, which often triggers low completion rates. The Dropbox survey advertised it would take 15 minutes. (Who has that kind of time?!) These companies' surveys could easily be redesigned to get better data and higher completion rates from just three questions.

Many were full of leading questions designed to boost scores. This AutoZone survey arranged the rating scale with the positive response first, which is a subtle way to boost ratings. Like many of the surveys I took, it offered no option to leave comments and explain why I gave the ratings I did.

AutoZone customer service survey.

The survey from Vons was an odd choose-your-own-adventure exercise, where I got to decide which topic(s) I wanted to be surveyed on.

Screenshot of multi-part customer service survey from Vons.

This created unnecessary friction and generated a little confusion since my biggest gripe on that particular visit was the large number of aisles blocked off by people stocking shelves. Is that a store issue, an employee issue, or a product issue? It’s a great example of where asking a customer to simply give a rating and then explain the rating would quickly get to the core of my dissatisfaction.

The One Good Example

The best survey was a Net Promoter Score (NPS) survey from Suunto. 

I received this survey invitation about six months after I registered a new watch on the Suunto website. NPS surveys measure a customer's intent to recommend, so giving me six months to use the watch before asking if I'd recommend it allows enough time for me to know what I like and don't like about the product.

Another positive was it asked just two questions: a rating and a comment. 

Suunto NPS survey.

Short surveys tend to have much higher completion rates than longer ones. Counterintuitively, you can almost always get more useful data from a short survey than a long and tedious survey. (More on that here.)

My question about the Suunto survey was whether it was linked to my contact information. This is necessary so someone from Suunto can follow up with unhappy customers to learn more about the issues they're experiencing. (More on that here.)

Resources to Create Better Surveys

Here are some resources to help you avoid these mistakes and create better surveys.

You can also get step-by-step instructions for creating a great survey program by taking my customer service survey course on LinkedIn Learning.

Report: Why Retail Customer Service is Dropping

A new report from the American Customer Satisfaction Index shows a drop in retail customer satisfaction. From department stores like Nordstrom to specialty stores like Bed Bath & Beyond, customers are less happy than they were a year ago.

How can this be possible in an era where customers are bombarded with survey requests and access to big data is at an all-time high?

The answers have to do with people. How people are staffed and managed, and the duties they are asked to perform, all have an impact on customer satisfaction.

You can access the full report or read below to see the highlights and analysis. To kick things off, the chart below shows a comparison in overall satisfaction between 2017 and 2018 on a 100-point scale:

Retail customer satisfaction declined from 2017 to 2018.


Trend #1: Courtesy and Helpfulness of Staff

This one is down across the board.

Courtesy and helpfulness from retail employees has declined.


Staffing levels have a big impact on this category. Retailers routinely understaff stores in an effort to save money, but this leaves the few available employees running ragged trying to serve multiple customers and complete tasks like restocking and merchandising.

Another issue is the surveys that seemingly appear on every retail receipt. These should help retailers detect problems like unfriendly employees. But the dirty secret is many retailers don't actually use those surveys to improve. And many even manipulate the surveys to make the scores look better than they really are.

A 2016 report from Interaction Metrics found that 68 percent of retail customer satisfaction surveys were "total garbage."


Trend #2: Layout and Cleanliness of Store

There's a slight dip in this area.

Stores need to improve the cleanliness and layout.


Part of the challenge is staffing (see Trend #1). Stores struggle to stay clean and organized when there aren't enough employees to do the work.

Another is command structure. Many retail chains make store layout decisions at the corporate level, and don't do enough field testing to ensure the designs actually make sense. Last year, I did a comparison of my local Walgreens, Rite Aid, and CVS and noted important differences in the layout of each store.


Trend #3: Speed of Checkout Process

The checkout process was another area where satisfaction dropped across the board.

Checking out is too slow at retail stores.


Here again staffing plays a role. We've probably all wasted time wandering around a department store, searching for someone to ring us up. And that's precisely why so many people would rather shop online—it's much easier.

Customer satisfaction with speed isn't just about the actual amount of time it takes. People are heavily influenced by perception. So a pleasant experience with a friendly cashier that takes five minutes will feel like a breeze, while an unpleasant experience that also takes five minutes will feel like an eternity.

Retailers could help themselves by studying these factors that influence wait time perception.

Take Action

There are three easy ways retailers can check these trends in their own stores.

Talk to employees. I have no idea why managers don't spend more time doing this. Employees will almost always be forthcoming about the challenges they face if you ask them sincerely.

Walk your stores. Spend time walking through your stores like a customer. You'll often discover unexpected problems that your customers encounter every day.

Use surveys wisely. Customer feedback surveys can be valuable tools, but you should use them wisely or not use them at all. This short video will help you decide why you want to run a survey program.

Why You Need to Analyze Survey Comments

I'm putting the finishing touches on the second edition of my book, Getting Service Right. The book was originally called Service Failure, and I've now updated both the title and some of the research.

The cover is one of the most important sales tools for a book, so I worked with Anne Likes Red to come up with a few designs. I then launched a survey to ask readers for their feedback on three cover options. The survey was up for just a few days and 135 people responded.

Here were the results:

Option A (28%)


Option B (52%)

Option C (20%)

Picking cover option B should be a no-brainer, right? After all, more than half of all survey respondents picked that option.

Without qualitative information, I might have made that mistake. Fortunately, I also included a comment field in the survey. When you analyze the comments to learn why someone chose a particular option, a new pattern emerges.


Searching for Themes

I recently hosted a webinar with Alyona Medelyan, CEO of the customer insight firm Thematic. Medelyan brought actual client data to reveal some interesting insights that a survey score alone wouldn’t show:

  • A cable company found customers with modem issues were impacting overall NPS by -2 points.

  • Another company discovered one variable that caused customers to spend $140 more per year.

  • An airline learned passengers were 4x angrier about missed connections than delayed flights.

The point Medelyan made is we usually get deeper, more actionable insights when we analyze the comments and not just the scores. So I applied this concept to my book cover survey and found two significant themes contained in the comments.

The first was that quite a few people chose option B because they liked the way the subtitle appeared below the title better than how it was shown in options A and C. So it wasn't just the color that drove people to option B.

The second theme was that quite a few people who selected option B liked its title arrangement but preferred the color of option A. There were even a handful who picked B but said they liked the color of option C best.

Suddenly option B isn't such a clear and convincing winner. Here's what happened when I revised the survey results to account for color choice alone:

Option A (40%)

Option B (39%)

Option C (21%)

Now I have two insights:

  • People prefer the blue cover shown in option A

  • People like the title arrangement in option B

Keep in mind I only made adjustments where respondents were explicit in their survey comments. If someone didn't explain why they chose B, they may have done it for the title arrangement, the color, or pure whimsy.
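If you'd like to try this kind of re-tally on your own survey, the logic is simple enough to sketch out. The responses below are invented; the idea is to move a color vote only when a comment explicitly names a different preferred color.

```python
# Sketch of the re-tally: each response keeps its original pick, but when a
# comment explicitly names a different preferred color, the color vote moves
# to that option. Responses are invented for illustration.
from collections import Counter

responses = [
    {"pick": "B", "color_pref": "A"},   # liked B's title layout, but A's blue
    {"pick": "B", "color_pref": None},  # no explanation; color vote stays with B
    {"pick": "A", "color_pref": None},
    {"pick": "C", "color_pref": None},
]

color_votes = Counter(r["color_pref"] or r["pick"] for r in responses)
print(color_votes.most_common())  # A=2, B=1, C=1 with these sample responses
```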

Making a Final Decision

I did a similar survey with my last two book covers, and both times I ended up choosing elements from different options. I did the same thing this time.

Going with option B's title arrangement was a pretty easy decision. There were numerous comments naming option B's layout as the preference, and none supporting the layouts of options A and C.

I ultimately chose the blue color from option A. 

Several survey comments mentioned color theory, and my friend Jim even shared this helpful resource from Quick Sprout. According to the guide, the color blue symbolizes tranquility and peace and has more positive associations across various cultures than purple and green.

The kicker is that the blue is my personal preference. I really like it, and it's important for an author to really like the cover of their book! Here's the final cover:

It was also important to consider how the cover will look when shown together with my other books on Amazon, in a bookstore, or at a trade show. Here's how it will look displayed next to my other books:

Take Action

You can gain so much more from a survey if you combine the fixed choices (ex: option A, B, or C) with comments. Try analyzing one of your own surveys to see what hidden insight is revealed.

You’ll find a lot of simple analysis techniques in the webinar with Alyona Medelyan from Thematic.

You can also get more help with your survey on this survey resource page.

How to Get Customer Feedback Without a Survey

Updated: December 15, 2023

I frequently use subscriber feedback to improve my Customer Service Tip of the Week email newsletter. Yet I've never used a survey.

Customers are inundated with surveys, so think carefully before rolling out yet another one. You can get a lot of useful voice of customer feedback from several alternative sources.

Here are five ways to collect and use customer feedback without a survey.


Issue Alerts

Create a process to alert you to issues in real-time.

My weekly email will occasionally have a small issue such as a typo or a broken hyperlink. I try to proofread each email and test all the links, but problems occasionally do happen.

Typos are my kryptonite.

Thankfully, I can count on subscribers to let me know when there is an error. It's usually just a handful of people who email me about the problem, but that's all the feedback I need. Keep in mind most customers won't bother to tell you about small issues, but that doesn't mean they don't notice!

I have a process in place where I can flag a problem and fix it the next time I send out the same tip. In some cases, such as a broken hyperlink, I may re-send the email with the correction, although I try not to do this very often because I don't like swamping people's inboxes with extra emails.

Discussion question: What process do you have in place to allow your frontline agents to resolve or report problems?

 

Investigate Icebergs

A customer service iceberg is an issue that seems small and isolated on the surface, but is actually a sign of a much larger and more dangerous problem that's hidden from view.

Someone recently emailed me to let me know she had tried to sign-up for the Customer Service Tip of the Week email, but never received a confirmation. This was a classic iceberg because it was easy to dismiss the problem as a one-off where maybe she just missed the email or the confirmation wound up in a spam folder. 

I was tempted to just manually subscribe her to my list, but I decided to investigate. 

My research led me to a helpful exchange with a support agent at MailChimp, the company that powers my newsletter. With his help, I identified a technical setting in my account that would make my emails more recognizable to corporate email servers.

Here comes the kicker—my weekly subscription rate instantly doubled!

Some of those extra subscribers undoubtedly came from a marketing campaign I was running. But some of that huge increase was certainly due to this technical issue. I never would have found it if I hadn't investigated the iceberg that came from just one email.

Discussion question: What do frontline employees do when they encounter a strange or unusual problem? Are they trained to search for and identify icebergs?

 

Invite Conversation

There are a few books that have absolutely changed the game for me. One was Kevin Kruse's book, Unlimited Clients.

A key piece of advice in the book was to invite conversation with your customers. The first version of the book had Kevin's phone number and email address right on the cover, and I can tell you from experience he actually responded!

So I took Kevin's advice and added a special invitation to the welcome email I sent to new subscribers. 

Excerpt from Customer Service Tip of the Week welcome email.

Subscribers have always been able to reply to any email and send a message directly to my personal email address. However, this invitation substantially increased the number of people who actually emailed me.

It's not everyone. (Thankfully—I don't know if I could keep up!) But a couple times a day I get an email from a new subscriber who tells me a little about themselves.

It helps me learn more about them and I often try to share something helpful in response. I've also learned those subscribers are more likely to share their feedback as they begin to receive the weekly tips.

Discussion Question: How can you invite individual customers to engage in a one-on-one conversation?

 

Catalog Unstructured Data

Something really amazing happens when you take all those individual conversations you have with customers and categorize them.

I went through hundreds of emails from subscribers and categorized the customer service challenges they shared with me. When I decided to put my weekly tips in a book, I put the top ten challenges in a chart and identified tips that could help with each one.

Going through several hundred emails may seem like a lot of work, but it really doesn't take that much time. I probably spent an hour. 

It goes even faster if you catalog feedback from individual customers as it comes in. A lot of customer service software platforms have a tagging feature that allows agents to do this on the fly. If your technology won't do it, you can have agents use a spreadsheet or even a piece of paper.
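Once feedback is tagged, even a tiny script can handle the counting. Here's a rough sketch with placeholder categories and emails.

```python
# Minimal sketch of cataloging unstructured feedback: tag each email with one
# or more challenge categories, then count the most common ones.
# Categories and emails below are placeholders.
from collections import Counter

tagged_emails = [
    ["angry customers", "escalations"],
    ["angry customers"],
    ["getting buy-in from my boss"],
    ["angry customers", "burnout"],
]

challenge_counts = Counter(tag for tags in tagged_emails for tag in tags)
for challenge, count in challenge_counts.most_common(10):
    print(f"{challenge}: {count}")
```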

Here are some resources for capturing unstructured data:

Discussion Question: How can you capture and analyze unstructured data?

 

Be a Customer

I learn a lot by subscribing to my own email.

This was a trick I learned from working in the catalog industry. Catalog companies would mail themselves a copy of each catalog so they could time how long it took to arrive and could verify each catalog arrived in good condition.

Subscribing to my own email allows me to do something similar.

For example, the Customer Service Tip of the Week goes out each Monday at 8:45 am Pacific time. One week, the email didn't arrive as expected. I double-checked the system and discovered I had set that particular email for 8:45 pm.

Oops! Fortunately, I was able to quickly change the send time and the email went out only a few minutes later than normal.

Discussion Question: What can you learn from being your own customer?

 

Take Action

Here are all the discussion questions in one spot:

  1. What process do you have in place to allow your frontline agents to resolve or report problems?

  2. What do frontline employees do when they encounter a strange or unusual problem?

  3. How can you invite individual customers to engage in a one-on-one conversation?

  4. How can you capture and analyze unstructured data?

  5. What can you learn from being your own customer?

All of these questions can yield terrific customer feedback without ever resorting to a survey! Best of all, the feedback you get from these sources can often be quickly used to make improvements.

You can get five more survey alternatives from this old post.

And, if you really want to use a survey, my course on LinkedIn Learning can guide you. Here's a short preview.

Why You Should Stop Surveying Your Customers

What if you discovered your business was doing something that more than 25 percent of your customers disliked?

That should get your attention, though some businesses engage in unfriendly practices that bring in significant revenue. Think of airline baggage fees, hotel resort fees, and cable equipment rental fees. 

Okay, but what if you learned an activity that more than 25 percent of your customers disliked delivered absolutely no value to your business?

You'd probably stop it immediately.

The customer service survey falls into that category for many companies. Customers don't like it and it delivers absolutely no value. Smart customer service leaders should either fix their broken surveys or stop doing them altogether. 

Read on to learn which path you should take.


Customer Service Survey Drawbacks

A 2017 study from Customer Thermometer asked 1,000 customers to give their opinions on surveys by, you guessed it, surveying them.

  • 25 percent dislike being surveyed after a purchase

  • 47 percent dislike being prompted for feedback on a website

  • 43 percent dislike being surveyed in exchange for a contest entry

The caveat is an inherent bias in the results. The chances of you filling out a survey about surveys when you really don't like surveys are pretty low. So we could reasonably expect the positive results to be inflated.

In fact, 45 percent of respondents reported they routinely ignored survey requests.

Okay, so far the data shows that surveys annoy a lot of customers and nearly half of customers don't complete surveys, which means survey results aren't representative of your customer population.

It gets worse.

A 2016 study from Interaction Metrics concluded that 68 percent of surveys from leading retailers were "total garbage," meaning the surveys yielded no useful information.

The kicker is a 2017 study from Capgemini Consulting revealed that companies that improperly used Net Promoter Score (NPS) surveys saw no difference in customer perception compared to companies that did not track NPS or customer experience data.

The big question is whether it's worth the risk of annoying so many customers if your business is getting zero value out of your surveys.

 

How to Tell if Your Survey Generates Value

Think about the intention behind a customer service survey. This is what a survey plan should look like:

  • Generate meaningful insights

  • Use those insights to take action

  • Measurably improve the business through those actions

So you can assess the value by starting at the beginning. Does your survey generate any meaningful insights?

Here are just a few questions it might answer:

  • What makes your customers happy or unhappy?

  • What products, services, or locations are performing the best or worst?

  • What generates the most complaints?

Insight alone isn't enough. You'll need to actually take action. Examples include:

  • Fixing customer pain points

  • Reducing customer service waste (ex: repeat complaints)

  • Strengthening areas where customers are happy

Finally, you'll need to make sure those actions are generating measurable business results in some way. For instance:

  • Can you improve customer retention?

  • Can you serve customers more efficiently?

  • Can you grow revenue through more word-of-mouth advertising?

These are all examples and by no means an exhaustive list. The bottom line is your survey needs to be a conduit to improving the business or else it's a waste of time.

 

Take Action

I've assembled a customer service survey resource page to help you learn more about what makes a great survey. You'll find blog posts and helpful videos.

Take time to evaluate your survey. If it's not driving value, you'll have a big decision to make. Should you scrap it or fix it?