Net Promoter Score is a great way of measuring and improving customer loyalty, and something that I believe any business serious about its customers should use! Only, however, if they use it properly!
For those that don’t know about it (check out the details here), it’s a simple question that finds out how likely customers are to recommend you to friends and colleagues. Those that rate you 9 or 10 (out of 10) are your Active Promoters (people who rave about you!) and those that rate you 0-6 are your ‘Detractors’ (the ones who aren’t so keen!). Your Net Promoter Score (NPS) is the % of Active Promoters – % of Detractors.
It’s powerful because it doesn’t take into account the 7s and 8s (the ‘Passives’ – those who think you’re OK!) and can be (and should be) used as a Key Performance Indicator that drives (and rewards) performance. You can also use it to benchmark your business against others and to measure improvements in performance. In the UK, Apple has an NPS of 67, First Direct has an NPS of 61 (and Andy Hanselman has an NPS of 64!)
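For readers who like to see the arithmetic, here is a minimal sketch of the calculation described above in Python. The function name and the sample ratings are made up for illustration; only the formula (% Promoters minus % Detractors) comes from the post.

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 ratings.

    Promoters score 9-10, Detractors 0-6; Passives (7-8) count
    toward the total responses but neither add nor subtract.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    total = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    # NPS is usually quoted as a whole number from -100 to +100
    return round(100 * (promoters - detractors) / total)

# Hypothetical survey: 5 promoters, 3 passives, 2 detractors
ratings = [10, 9, 9, 10, 9, 8, 7, 8, 5, 3]
print(net_promoter_score(ratings))  # 50% - 20% = 30
```

Note how the three Passives in the example simply dilute the score rather than moving it either way, which is exactly why the 7s and 8s “don’t count”.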
However, it can be abused. A friend (who works for a large multinational business that will remain anonymous!) was telling me that her company uses it and she has to call 10 customers a week to get their ratings. Her line manager was putting her under huge pressure to get favourable feedback scores, and insisted that she explain to customers the implications of the different ratings, i.e. why a 9 is significantly better than an 8, and why they should give her a 9 or 10!!!
In other words, rather than obtain customer feedback ‘spontaneously’, they are ‘influencing’ their customers to provide ‘favourable’ responses.
So here is a business that is using NPS to measure the performance of its people based on false and ‘forced’ information – talk about ‘lies, damned lies and statistics’! It’s simply Not a Proper Score!
Do you think these customers would really recommend this business? I wouldn’t ‘bank’ on it (oops that was a clue!!!)
Andy –
No matter how one views NPS as a performance metric (and here’s my perspective, as shared in a CustomerThink article from last year: http://www.customerthink.com/article/customer_advocacy_behavior_personal_brand_connection), it’s unfortunate to see any assessment of customer experience devolve, and become suspect, through employee gaming. This renders the results about as biased as the scores requested post-delivery of new automobiles by dealerships, namely “Is there any reason you can’t give us a 9 or 10 on a 10 point scale, where 10 is excellent and 1 is poor?”
Michael Lowenstein, Ph.D., CMC
Executive Vice President
Market Probe (www.marketprobe.com)