Knowing what your users think, what they really, truly think, is crucial to making sure you’re building what they actually want. That’s where Net Promoter Score comes in.

Now that our customer count is in the hundreds, it can get tough to just “know” that we’re working on the right things and making a product that our customers love. We love speaking with our users and encourage it whenever we can, but it’s a double-edged sword: the people who are keen for a chat are typically optimistic and encouraging, so their feedback is naturally biased towards the positive.

That’s why it’s worth seeking out unbiased, honest feedback by contacting a random cross-section of users and asking them, quickly and without leading them on, what they currently think about us and our offering.

We decided a couple of weeks back to do this with a Net Promoter Score (NPS) survey using Promoter.io, and wanted to make sure we got the most out of it by using every response as a chance to start a conversation with the user, dive deeper, and find out what they really thought.

We loosely followed what the team at Baremetrics had done in the past, as detailed in this post. The main difference was that we were a little more granular with the responses that we sent people based on the score they gave.

While we were fortunate that the most common response we got was a 10/10, there were a lot of fence-sitters dragging our score down. Or so we thought.
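As a quick illustration of why fence-sitters drag the number down: by the standard NPS formula, the score is the percentage of promoters (9–10) minus the percentage of detractors (0–6), with passives (7–8) counting toward the total but toward neither bucket. A minimal sketch (the example score lists are hypothetical, not our actual survey data):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Passives (7-8) count in the denominator but in neither bucket."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Ten 10/10 responses on their own give a perfect score...
print(nps([10] * 10))            # 100
# ...but add five passive 8s and the score drops to 67,
# even though there isn't a single detractor.
print(nps([10] * 10 + [8] * 5))  # 67
```

So a run of 7s and 8s lowers the headline number without anyone actually being unhappy, which is exactly why the follow-up conversations matter more than the score itself.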

After striking up a conversation (or at least trying to; not everybody wants to go into detail) with every single respondent, we learned a few key insights, both about ourselves and about the psychology of responding to a survey like this.

The common responses

The common trait across the emails we sent out in response to a score was the question “what do we need to do to improve our score with you next time around?”. While the responses varied, there were a few similar underlying threads.

“You don’t have feature X, which would make our life so much easier”

These fell into two buckets. In some cases they were right: we didn’t have that feature. Whether we decide to add it is a different story, but more concerning was when people complained that we didn’t have feature X… when we actually did. That means we were doing a poor job of onboarding users and teaching them what can be done with elevio. Ironic (and embarrassing), given that’s the purpose of our platform.

“Our users absolutely love it, it’s great!”

We saw a lot of people responding with something along these lines, but then giving a score of 7 or 8 (a 7 or 8 is a “passive” score). When we pushed a little deeper to find out what we could do to improve that score, we got responses like “Oh, I don’t give 10s, I need to leave room”. Reading between the lines, they do love the service… but there’s just something missing, something that will make them fall in love. These scores are the trickiest to grow, since neither of you categorically knows what will drive a higher score, but once you do get them to a 9 or a 10, you’ve got yourself a super loyal customer.

“Feature X doesn’t work”

These are great quick wins. While there’s always more than one feature in any piece of software, often a given user only really cares about one particular feature, and if that feature isn’t working for them, then the whole product, in their eyes, is broken.

“I thought it was spam”

Yeah, so, when I set up the campaign I didn’t change the sender details, meaning the campaign was sent out from “John Doe <john@example.com>”. Doesn’t exactly inspire confidence, huh?

So, what did we learn?

We learned more from the conversations that followed than from the raw scores and initial comments. This is where the real gold is: it might take a few emails back and forth until you get to the true reason a person isn’t in love with your product, but once you get that nugget, you can head off in the right direction rather than taking everything at face value.

As a result, I feel like we’re actually a lot closer to these customers than we were before. They know that we genuinely care about their experience, and that we truly do want to get to the bottom of their troubles and do what we can to help them. You even start to understand the way people communicate better, and can have a little fun with them. It goes a long way.

Just this one step of responding, I feel, will have boosted our NPS if we were to run the survey on the same group of people a week later.

What would we change next time?

Probably to send it to a larger pool of people, but drip-fed over 30 days or so from a common trigger point, so that each customer receives the NPS survey at the same point in their journey with elevio. This would let us directly compare responses over time, since we’d know people were at the same point in their discovery.

Being drip-fed, it would also spread the responses out over time, so there’s always contact with new customers each day, making sure the effort isn’t a one-time bulk hit.

The most important thing to remember

The raw score doesn’t matter. What matters is the conversations you have afterwards, getting to the heart of each individual’s score, and then making sure you improve the root cause moving forward.

The biggest thing to remember is that it’s not the score itself that’s important; it’s a baseline on which to improve, and a conversation starter to get to the real truths. Then grow and improve your company.


What we learned running an NPS survey

by Chris Duell