One of the most common questions we get from our customers is: What type of survey response rate can I expect?

When doing some research, you’ll see a wide range of numbers on what a “good” or “average” survey response rate is. The response rates are usually qualified by a specific distribution channel or survey type:

  • 33% as the average response rate for all survey channels, including in-person and digital (SurveyAnyplace, 2018)
  • >20% being a good survey response rate for NPS surveys (Genroe, 2019)
  • A realistic response rate range of 5% to 30%

What is a survey response rate and how is it calculated?

Your survey response rate is the percentage of people surveyed who actually respond: divide the number of responses by the number of surveys sent, then multiply by 100. So, say you survey 1,000 people and 150 people respond – your survey response rate would be 15%.
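If it helps to see that math as code, here's a minimal sketch of the calculation in Python (the function name and figures are just for illustration):

```python
def response_rate(responses: int, surveys_sent: int) -> float:
    """Survey response rate as a percentage of surveys sent."""
    if surveys_sent == 0:
        raise ValueError("surveys_sent must be greater than zero")
    return responses / surveys_sent * 100

# The example above: 1,000 people surveyed, 150 responses
print(response_rate(150, 1000))  # 15.0
```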

What is a good survey response rate? Benchmarks from our 2020 data

You may be thinking, “Hang on, wouldn’t COVID-19 have impacted 2020 response rates?” When we compared our 2020 data against 2019, there actually wasn’t much difference in response rates year over year.

From what we’ve seen, businesses that had to pause or slow down during the pandemic also limited the scope of their feedback programs. Customers who were receiving services still responded to feedback surveys at a similar rate.

Because of that, we decided 2020 data would be the most helpful to share. Many of the factors that influenced 2020 are also still going to be in play for 2021.

Here are the response rate averages of Delighted users across our various survey distribution channels.

Delighted users' survey response rates for 2020

For Delighted users, depending on the channel used, the average customer survey response rate ranged from 6% to 18% in 2020.

Now, here are the response rates we get when we send surveys out to our own customers to find out how much they like using Delighted (also from 2020).

Delighted survey response rates for 2020

For comparison: our survey response rate numbers vs. the numbers reported for the general population

We’ve done quite a bit of work on figuring out how best to reach our audience, and it has paid off. (Check out best practices to follow in this guide on increasing response rates.)

You can see that we use multiple survey channels – Email, Web, and iOS SDK – to cast as wide a net as possible. The response rate varies substantially from 6% to 22% depending on the channel. Before we started using iOS SDK surveys, our internal response rate for email surveys was 21%, while for web surveys it was 10%. 

Your response rates will fluctuate as you test the best distribution channel for your audience. Since we’re a B2B business with an app, it makes total sense that the iOS SDK channel would be a great way for us to gather NPS feedback. 

At the same time, we still want to continue using email and web surveys to capture feedback from stakeholders who aren’t always logged into our platform.

On the upper end, some of our customers have had response rates as high as 85% for email employee NPS surveys, 33% for email NPS surveys, and 22% for SMS text surveys.

Why do response rate numbers vary so much?

A wide variety of factors can impact how high your response rate will be, the most important being how engaged your audience is with your brand. Survey response rates can also vary by:

  • Industry: B2B vs B2C, with B2B audiences typically more likely to respond than B2C audiences
  • Your audience demographics: younger audiences may be more likely to respond than those over 65, especially if you’re using digital survey tools
  • Internal employee surveys vs external customer surveys, with response rates for employee surveys trending higher
  • The type of feedback you’re gathering: transactional surveys (CSAT or CES) tend to have higher response rates than relational NPS surveys – for example, our post-support CES survey response rate via an email signature link is 20%
  • Whether an incentive is offered to complete the survey
  • Survey distribution channel and timing

Timing can play a much larger role in your survey response rate than you might expect.

“Recency from an event, specifically for transactional NPS or event-based surveys, is a major driver of response rates.

For example, sending a post-support survey 24 hours after an interaction versus 7 days after an interaction is likely to produce better results, since customers know why you’re reaching out, and have top-of-mind feedback to share.”

– Sean Mancillas, Head of the Delighted Customer Concierge Team

To move the discussion away from the actual response rate number for a minute, let’s go bigger picture: what impact does your response rate have on your CX program?

Just how important is survey response rate?

The question of survey response rate is tied to statistical best practices for survey sample size – just how much feedback do you need to have an accurate understanding of your audience? The more pieces of feedback you have, the better, right?

Not quite. Just because you’re able to increase the number of people who respond to your survey doesn’t mean the feedback you’re getting is more representative of the customers who matter most to you.

Here’s what research scientists in the space have to say:

“Concerns about response rates have solid theoretical grounding. You can’t make inferences about a larger group if people in the group won’t talk to you. Most of our statistics and margins of error assume having information from every member of a random sample.

But the reality for most research is that response rates are not high. And yet our findings are still accurate and there is evidence that sometimes lower rates give us more accurate findings. The reason? What matters is not how many people respond to a survey, but how representative they are of the groups to which they belong.”

Do response rates really matter?, Versta Research

“What counts most, of course, is high response rates from your core or target customers – those who are most profitable and whom you would most like to become promoters. Retail banks, for example, find it helpful to survey their customers by segment, so that the responses of their most profitable clients aren’t drowned out by those who are only marginally profitable.”

Net Promoter System: Creating a reliable metric, Bain

Though your business situation may vary, one concept holds true: segmenting your customers and ensuring you’re hearing the voices that matter most is the key to success.
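As a rough illustration of that idea, here's a minimal sketch (in Python, with made-up segment names and figures) of computing response rates per customer segment rather than one blended number:

```python
from collections import defaultdict

# Hypothetical survey sends and responses, tagged by customer segment
sends = [("enterprise", 400), ("smb", 1200), ("trial", 2400)]
responses = [("enterprise", 96), ("smb", 180), ("trial", 120)]

def rates_by_segment(sends, responses):
    """Return the response rate (as a %) for each segment."""
    sent = defaultdict(int)
    responded = defaultdict(int)
    for segment, count in sends:
        sent[segment] += count
    for segment, count in responses:
        responded[segment] += count
    return {seg: responded[seg] / sent[seg] * 100 for seg in sent}

print(rates_by_segment(sends, responses))
# {'enterprise': 24.0, 'smb': 15.0, 'trial': 5.0}
```

Broken out this way, a strong rate from your core customers stands out even when the blended rate across all segments looks low.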

None of this is to discount the importance of having a strong, consistent flow of customer feedback. If you’re getting little to no feedback, try taking these steps to strategically improve your survey response rates.

What role does response rate play when you’re thinking of how you’ll set up your CX program?

Many of our customers are setting up a customer experience program for the first time and have questions about how response rate factors into program optimization.

Sean Mancillas, the Head of the Delighted Customer Concierge Team, shares his guidance:

“Our role is definitely to help ramp up the volume of feedback by strategizing around best practices and sampling, but also to manage and report on the feedback in a way that can tell an effective story. If you’re getting a 20-25% response rate, but it’s scattered across totally disjointed and unrelated subsets of customers, the feedback is going to be all over the place and will produce super confusing net results – potentially worse than just a lower response rate.

When we’re helping on response rates, we’re also working with you to scope your CX program a bit more thoughtfully to specific customer segments (e.g. folks that are more active, with a recent purchase, from X location) – which naturally leads to better engagement and a quality of feedback that can really drive effective reporting and a strong customer narrative.”

For more specifics on building your program with statistical significance in mind, stay tuned for our next post on survey sample size.

If you’re itching to see how much customer feedback you can gather, sign up for a free Delighted trial. Your surveys will be out to customers in minutes.