Could you please tell us your title and a brief description of what you do at Research Square?
Sure. I am VP of Customer Service at Research Square, and my job is running the operations of our customer service team as well as feeding customer insights and feedback from my team through the rest of the company to help inform decisions and judge our success.
Would you also tell us a bit about what Research Square does?
We are a company of about 260 employees that provides services for academic researchers around the world. We do this within two branches of the company. The first is focused on providing services for academic publishers to increase the speed of publication for academic journals. And the second part of the company, which is the part that I’m mainly involved with, is the American Journal Experts brand.
Basically, we serve individual researchers who are having problems getting their research published. Most of those problems revolve around language. These authors, say, in China, are trying to publish their research in English, but the language is not sophisticated enough for most English-language academic journals to accept. They come to us mainly for editing services, so we help them overcome that language barrier and get their research published.
Do you consider yourself B2C or B2B?
I consider us a B2C company. We do have some B2B clients, as well as B2B2C. But, the vast majority of our business is B2C.
What is the role of customer experience and feedback at Research Square? What’s the desired experience that you’re aiming at for your customers?
Because our customers are all over the world, we rely a lot on our website to connect with them. We want to make sure our site experience is really helpful and communicates our services and our value proposition. We want to make sure it’s really easy to use, especially because of language differences. We also pride ourselves on great customer service and helping customers with all parts of publication, not just the things that we provide services for.
We want them to feel like it’s easy and worthwhile to contact us and ask questions and that we will be there for them throughout their career, not just when they have a paper they need to publish.
How does customer feedback play a role in that experience quality? How do the two come together?
We rely on the Net Promoter Score (NPS) survey as our main type of customer feedback. We monitor scores and check comments every week, and we use those as touchpoints with customers. If someone complains on the NPS survey, we follow up with them as quickly as possible. We also create a positive touchpoint when we get great feedback.
It really helps us understand where we’re doing things really well, where we might have made a recent change that greatly affected customers, or, in general, the health of our operational systems. If we see our NPS going down, is it because we’re having problems finding great editors? We also use a lot of our metadata to see if we can parse out where specific problems might be.
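For readers unfamiliar with the metric, the weekly score tracking described here rests on the standard NPS calculation: the percentage of promoters (scores of 9 or 10) minus the percentage of detractors (0 through 6). A minimal Python sketch, assuming raw 0–10 survey scores as input:

```python
def nps(scores: list[int]) -> float:
    """Compute a Net Promoter Score from 0-10 survey responses.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    toward the total but toward neither group.
    """
    if not scores:
        raise ValueError("no responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Example: one week of responses
weekly = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(round(nps(weekly), 1))  # 50% promoters - 20% detractors = 30.0
```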
And about how many customers are taking NPS surveys each week?
That’s a good question. I’m just going to quickly look. It’s really easy to pull up the Delighted dashboard, so I can get to those numbers very quickly. It looks like around 600 surveys are sent each week and we end up getting about 150 responses, so roughly a 25% response rate.
That’s great. Can you tell us how important this feedback is to the C-suite and the executive level? What’s their connection to it?
NPS is actually one of our company-level metrics. Our executive teams look at it regularly and it’s one of the major ways they judge our company’s success. If there is a big change in NPS, they want to know why.
So customer feedback comes mainly via the NPS survey, basically once a week. How do you share this feedback with other parts of the organization? How do you make sure the information gets acted on?
We mostly use Excel scorecards to track it. And we import the data into Domo, our data platform, so that other people can do analyses with it and break it down into the things that they have the most control over. We also have a channel set up in Slack so that the good responses get posted there for people to see.
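Delighted offers a native Slack integration for exactly this routing. Purely as an illustration of the idea, here is a sketch that forwards promoter responses to a channel through a Slack incoming webhook; the webhook URL and the score cutoff are placeholder assumptions, not details from the interview:

```python
import requests

# Placeholder incoming-webhook URL; create a real one in Slack's app settings.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def post_good_response(score: int, comment: str) -> None:
    """Post a promoter response (assumed cutoff: score >= 9) to a Slack channel."""
    if score < 9:
        return
    text = f"New promoter response ({score}/10): {comment}"
    resp = requests.post(SLACK_WEBHOOK_URL, json={"text": text}, timeout=10)
    resp.raise_for_status()  # surface any webhook failure

post_good_response(10, "The editing was excellent and fast!")
```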
How are the scores actually analyzed? Is it trended or is it compared to benchmarks or just to what your previous NPS scores were?
We look at trends over time. We’ve been using NPS for five years or so now, so we have a good idea of what our benchmark is. We don’t know what the rest of the industry’s scores are, which would be really interesting, because we would like to be the company in the industry with the best customer experience and customer service. But without knowing those benchmarks, it’s hard to compare ourselves.
But it’s mostly trends over time, and we set targets for improvement, because we would like to continuously improve our NPS score.
What about the verbatim part of the NPS survey? How is that taken into consideration?
It’s reviewed simply by people reading the comments at least on a week-to-week basis. For instance, I’m very interested in the customers that comment on price. So I have a trend set up to pick up on key terms, and I get an email every time a comment matches that trend, so I see all of the comments that have price as a topic. What’s more, we contact customers based on their comments, so we can resolve complaints, address confusion, and things like that.
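The keyword trend described here is a built-in Delighted feature. As a rough sketch of the underlying idea, the snippet below filters responses whose comment mentions a price-related term; the term list and the response shape are illustrative assumptions, not Delighted’s API:

```python
import re

# Hypothetical keyword list; Delighted's trend feature does this matching server-side.
PRICE_TERMS = re.compile(r"\b(price|pricing|cost|expensive|fee)\b", re.IGNORECASE)

def price_comments(responses):
    """Yield responses whose free-text comment mentions a price-related term."""
    for r in responses:
        comment = r.get("comment") or ""
        if PRICE_TERMS.search(comment):
            yield r

responses = [
    {"score": 9, "comment": "Fast turnaround, great editor."},
    {"score": 6, "comment": "Good edits but the price is too high."},
]
for r in price_comments(responses):
    print(r["score"], "-", r["comment"])  # prints only the price-related response
```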
Interesting. And is there a specific customer feedback group within the company?
We have one person who is focused on our entire voice of the customer program. A lot of this falls under her. And she has a number of people that are primarily part of our customer service fulfillment team who regularly answer customer questions. One of their other responsibilities is to do the NPS survey follow-ups. So, I pretty much have one person.
What were some of the challenges you faced when launching, managing, or growing the program?
I would say the original problems that we had were related to technology and people. When we started measuring NPS, we were not doing it in Delighted, but in a different system. That was a survey we sent out using just a normal, multipurpose survey platform. The survey was not embedded in an email; there was just an image and a link.
So, customer experience-wise, it wasn’t great, because you had to click a link to go somewhere to take the survey. And if we wanted to do any kind of analysis or follow-up, it was all manual. It was all downloading survey responses and creating cases all manually. When we implemented Delighted, we saw a huge increase in responses, because the survey can be taken right in the email. It’s just a better customer experience.
And it supported all of the languages we were trying to serve much better than we had been doing in the past. And because of its integrations, it automatically did a lot of the things we had been doing manually. So, that has really allowed us to scale our NPS feedback a lot.
Do you think you’ll continue or grow the program in some way?
For the other parts of our customer feedback program that we already have in place, we do have customer satisfaction surveys that go out via Zendesk. And just this year we implemented a customer relationship survey. It’s a much longer, in-depth survey that we’re going to administer on a much longer timescale. So, those are really our main pieces right now.
Just out of curiosity, what do you consider NPS to measure, then? Customer Loyalty? Brand awareness? Satisfaction?
I think of NPS as the total experience on a transaction—that’s our transactional survey. We send it out every time a customer makes a purchase and gets their edited paper back. That’s telling us, on that individual piece of their experience, how we are doing.
Our relationship survey is much bigger. Over the course of all the interactions that this customer has had in the past, how do they feel? What is their feedback? And then the customer satisfaction survey is specific for the customer service team—on that particular interaction, how are we doing?
NPS sits in the middle, covering the whole collection of events that resulted in that transaction. And the relationship survey is the highest level, covering the entire loyalty experience that this customer has had and whether they’re going to continue coming to us for these services.
Is there one place in which all of this feedback data comes together, where you look at it as one big thing, or do you keep them all separate?
Right now, they’re pretty separate. I think it would be nice if we could bring them together better. But they’re in different systems, they’re in different stages of maturity. I think we’re getting closer to bringing our NPS data and our feedback data together in Zendesk. We actually just implemented Zendesk last month.
So we’re still figuring that out a little bit. But it provides us with a little bit more information in one place than we had before. And the relationship survey is also going in there. But, again, it’s new, and we haven’t conducted one since we got that system up and running.
I understand it’s still early days and you’re just getting all of this going, but what advice might you have for anybody who’s looking at doing this kind of customer feedback for their company? What have you learned in the time you’ve been working this through?
The things that we have learned are that the actual survey makes a huge difference, both in terms of the number of responses and the quality of responses. If the survey is easy to take, you’ll get a much broader representation of your customer base than if it’s difficult. In fact, if it’s difficult, you’ll end up with a lot more of the dissatisfied people willing to fight through the difficulty. The people who are happy or, you know, more neutral, will give up if it’s not super easy to take. So having a really great customer experience really changed the game for us.
And definitely look at trends over time, including who is responding and what their backgrounds are. We see a huge difference in responses from certain countries based on cultural differences. And that’s something that we have to consider with our NPS score. Do we have more customers from Japan responding this week that drove our score down? Or more people from Brazil responding driving our score up?
One of the things we’re considering as we grow and get more customers from parts of the world that respond differently is whether we want to have different definitions of promoters for these different groups. Customers in some countries are less likely to give a nine or a ten, so for them a seven or eight could be considered a promoter. It would be nice to do it that way and then have a composite score that’s a little more dependable than one skewed by differences in where people are coming from.
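A composite score along these lines could be computed by applying per-country promoter and detractor cutoffs, then weighting each country’s NPS by its response count. The sketch below is purely illustrative: the thresholds and the country codes are assumptions, not cutoffs Research Square has adopted:

```python
from collections import defaultdict

# Hypothetical per-country cutoffs; the default follows the standard NPS
# definition (promoter >= 9, detractor <= 6).
THRESHOLDS = {
    "default": {"promoter": 9, "detractor": 6},
    "JP": {"promoter": 7, "detractor": 4},  # assumed adjustment for stricter raters
}

def composite_nps(responses):
    """Compute per-country NPS with adjusted cutoffs, then a response-weighted composite."""
    by_country = defaultdict(list)
    for country, score in responses:
        by_country[country].append(score)

    total, weighted = 0, 0.0
    for country, scores in by_country.items():
        t = THRESHOLDS.get(country, THRESHOLDS["default"])
        promoters = sum(1 for s in scores if s >= t["promoter"])
        detractors = sum(1 for s in scores if s <= t["detractor"])
        nps = 100.0 * (promoters - detractors) / len(scores)
        total += len(scores)
        weighted += nps * len(scores)
    return weighted / total

responses = [("JP", 8), ("JP", 7), ("JP", 5), ("BR", 10), ("BR", 9), ("BR", 6)]
print(round(composite_nps(responses), 1))  # 50.0: mean of JP's 66.7 and BR's 33.3
```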