See the agenda in the Events list on the sidebar.
Any best practices for increasing NPS participation around Digital Learning Offerings?
We have seen a lower NPS participation rate when selling Digital Learning Offerings that are not instructor-led compared to instructor-led offerings. These days I would be interested in significantly increasing participation, so that the NPS results carry more weight and we capture our clients' feedback better. Does anybody have best practices?
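For readers less familiar with the mechanics: NPS is the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). A minimal sketch below (illustrative data, not from the thread) shows why low participation matters: with only a handful of responses, each individual answer moves the score by several points.

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# With ten responses, a single detractor shifts the score by 10 points.
print(nps([10, 9, 9, 8, 8, 7, 10, 9, 6, 10]))  # 50
```

With a larger response pool the same single detractor would barely move the score, which is why the participation rate discussed here directly affects how trustworthy the number is.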
When asking someone to complete a survey I think we have to ask what value completing the survey provides to a person. That value can take several forms. The process of taking the survey itself can be valuable. Some surveys make me pause and reflect and think about things I don't normally think about. Another way to provide value is to immediately provide some feedback and analysis of the results for that responder. Then there is the report from the survey and the benchmarking one can provide.
The problem with NPS surveys is that they don't create any value for the user. They are all ask and no give.
Perhaps the solution is to embed the NPS in a survey that does create value for the user.
Without this, I expect response rates to NPS surveys to decline.
The other thing to do is to ask 'What am I trying to learn from the NPS survey and is there another way to learn this that creates more value for users?'
Gail Propson
Hi @Alexander Ziegler & @Steven Forth
Great exchange about improving response rates on NPS surveys. I agree with Steven: your customer needs to see the value of completing a survey. I would like to expand on this thought and offer the following:
In the survey invitation, show your customers how feedback from others has already improved their experience. What changes had you made to your eLearning platform, prior to their experience, that others had suggested? E.g. improved videos, real-time examples, the ability for live interaction, and so on.
This will show your customers that they are being listened to and that you take action on their feedback. The survey is about them and about improvements for their next experience with your eLearning platform.
Additionally, I agree that you should expand your NPS survey beyond one question, as Steven suggested. You want to know why a customer gave an 8 or a 10. That is where you find value and can take action.
We recommend this to all our clients when helping them design their CX survey questions.
All the best,
We are also seeing a lower NPS response from our customers in general. I am interested in hearing what others have done to increase participation.
@Steven Forth I like your thoughts about "what do I get from this", as I'm convinced that a lot of things in life work on this basis. Your thought triggers a simple question for me, and maybe you already have experience with this: what if we gave something back, let's say free access to something that is usually paid? Did you try something like this, as compensation for filling out the NPS survey?
Hi @Alexander Ziegler. In some cases I think this could work. I would try to make what is offered intrinsic to the value provided by the solution. Perhaps, in order to get insight into this, NPS surveys should always have a follow-up question around what could happen to change the NPS score (of course, even asking this is likely to change the original score).
Personally I would rather embed an NPS question into a slightly larger survey that encourages reflection and that will generate data useful to the person taking the survey.
@Steven Forth Interesting that you're bringing up the topic of an additional question. We always do this, and I just created a bigger case study around this topic. I'll brainstorm with our team about how we might actually implement some kind of give-back.
Are you able to share the case study, @Alexander Ziegler? I do think that both parts of this are necessary: (i) create value in the process of taking the survey, and (ii) create value for the people taking the survey from the data collected.
@Steven Forth Sorry for the long time to reply, but I had to review what I can do with the paper under the publisher's rules. I'll send you the paper by email; this should work. If anybody else is interested, please ping me and I'll share it by email as well.
Thanks, received the email. Appreciate the effort.
I agree with the "what's in it for me" position. You may also want to look at "when" you ask for the survey. Is it part of the normal activity in the interaction flow? Is your customer aware at the beginning that this is a deliverable you want from them? It has seemed to me that if these surveys are delivered after all interaction is complete, without pre-positioning the customer, the response is generally low.
@Alexander Ziegler Great question! At Splash, we're still small enough that we send people $5 Starbucks gift cards when they take our post-education survey and post a photo of themselves with their course certificate. We've automated most of the process, so it's not that operationally heavy, and it's a pretty fun and engaging program with our clients. We'll scale out of it someday and will have to rethink our carrot beyond a monetary amount, but in the meantime we're getting good data!
Hi, we recently did away with the traditional NPS question for our survey and substituted 5 questions around quality of content, material, instruction, and overall experience. Similar to @Steven Forth's point, we didn't see what we or our customers were getting from a simple NPS. Now that we've changed, we provide an instant "thank you" message to all responders as well as ongoing updates on how their responses are addressed. Finally, we personally reach out to any person who provides negative responses. We've gotten good response rates and strong feedback that has helped our internal processes.