
The Leader of the Pack

Simon Wood

If, like me, you spend a lot of your time at conferences, reading research papers or attending webinars, then I’m sure you’ll have found a way of filtering out all the buzzwords and jargon you’re exposed to. My favourite is the classic ‘buzzword bingo’. For the uninitiated, this involves picking out a list of your favourite bits of jargonese (is ‘jargonese’ jargon itself?!) and then seeing how many conference minutes it takes for them to have all been mentioned by one speaker or another.

Worryingly, I’ve found myself playing the game with TV programmes too. ‘The Apprentice’ is especially good for this, but I’ve also noticed sporting programmes offer fantastic opportunities, with various clichés making an appearance every week. Recently, I’ve found one of sports’ favourite clichés seems to be making its way to the world of customer experience – ‘the league table doesn’t lie’.

Now I’ll accept that in the world of football, rugby and various other sports, this cliché sort of works. Everyone wants to be top of the league in sport, and generally the longer the time period, the more accurately the league table comes to reflect the true quality of the competitors – even if the league table actually reflects the spending power of a particular club/team more than anything else.

However, in the world of customer experience it’s unfortunately not that simple – the league table does lie. What am I talking about? Well, despite the way customer experience rankings and league tables are being pushed by some of the ‘experts’ in the sector, we need to acknowledge that when it comes to customer experience, being the best may not be the best policy! Unfortunately, the more we promote this idea of being ‘top of the league’, the more we risk setting our clients on the wrong road – and an expensive one at that!

The danger hidden behind customer experience rankings
Now before I get hate mail or my Twitter feed gets trolled, I fully accept that a customer experience ranking/league table is a neat way of generating a bit of PR. Journalists love them and you can be almost guaranteed a call from the bottom 2 or 3 companies wanting to know why they are so ‘bad’. However, it’s when we go beyond their PR value that we enter a bit of a market research minefield!

So what is it about these rankings that gives me cause for concern? Well, my fears are twofold:

  1. Technically the rankings are often flawed and don’t actually reflect what they’re supposed to
  2. They give the wrong impression of what success looks like

So taking each of these in turn, what are my objections on a technical level? Well I have more than a few – and I’m not even mentioning the robustness of the research used as a base for a lot of these rankings!

  • Rankings are distorted by brand positioning and image. Take a look at the next set of rankings that you see. I can pretty much guarantee that most brands at the top will have a luxury/premium position. Those at the bottom will be the value brands. Why? Well luxury brands are supposed to offer a good customer experience – it’s kind of the point. If they don’t, they won’t be seen as luxury for very long. Similarly, value brands tend to focus more on price and less on experience – so they perform weakly in customer experience league tables. The phrase ‘self-fulfilling prophecy’ springs to mind.
  • An uneven playing field. You don’t need to have worked in customer experience for long to realise that customer ratings are hugely influenced by the context of the experience and the level of cognitive interest a customer has in it. For example, making a major purchase face to face will be much more impactful – and hopefully positive – than paying your credit card bill online. Similarly, taking a holiday is likely to be perceived much more positively than making an insurance claim, simply because of the nature of the experience itself. As a result, brands whose customer interactions are broadly positive will always punch above their weight in a ranking table. For supposedly objective researchers, we do seem happy to compare apples with oranges when it comes to these rankings.
  • Halo effects. Banks must offer terrible customer service if you believe most of the league tables. And funnily enough, their service became worse overnight after the 2008 financial crisis. Similarly, energy companies give worse service in the weeks just after their profit announcements. Maybe that’s true – but I suspect the truth is that there’s a halo effect going on here. These halo effects run through most of the rankings and are a major influence on who’s placed top, middle or bottom.
  • Experience misnomer. What is the definition of customer experience being used in these rankings? Often the commentary around them talks as if they reflect the quality of interactions – yet the league tables themselves are often measuring the customer relationship. Yes, you can stretch the definition of customer experience to cover both – but too often the distinction between interaction quality and customer relationships isn’t made, meaning the ranking isn’t really reflecting what it claims to. I don’t know about you, but if I saw that kind of slack and careless behaviour in a client’s project I’d be having strong words with the research team responsible.

So there are a ‘few’ technical issues I have with league tables, but to be honest, most of those can be eradicated by careful usage and reading. However, my biggest beef is with how they often distort what success looks like. What do I mean? Well here are a few ways in which the league table can lie…

  • Numerical rank in itself is meaningless – Most commentary around rankings implies that being on top is the place to be; however, it’s usually fine (certainly in the short term) to be number 78 if your competitors are ranked below you, whereas being ranked second is a Pyrrhic victory if your key rival is first.
  • Little link to business performance – Rankings don’t link to business performance or consumer behaviour. Ryanair and easyJet are among the most profitable airlines, yet they generally struggle in customer experience rankings.
  • Unreflective of consumer behaviour – Consumers choose based on preference, within a context of market realities – they can’t all afford to buy the luxury brand that sits high in the rankings. Being top may well make zero difference to your market share if the market factors are what drive consumer choice.

So what to do?
By now you will have spotted that I don’t really like the various customer experience rankings that are floating around the industry. So presumably I think they should all be abolished and the word ‘ranking’ scrubbed from the dictionary? Well, not entirely…

Realistically, one of the biggest challenges we face in customer experience is keeping senior stakeholders committed to, and bought into, their customer experience initiatives. Published league tables have many flaws, but the one thing they do have going for them is that they get attention – especially from CEOs who don’t like to find their business languishing in the lower regions! Rankings therefore offer us a great way of keeping senior stakeholders engaged and interested in customer experience. Used in that way they are absolutely fine.

It would also be remiss of me not to acknowledge that rankings do sometimes generate interesting findings. Every now and again a brand will appear somewhere that seems counter-intuitive, suggesting that there’s something afoot. For example, a few years ago, a supermarket ranking table I did showed Tesco very low down and Aldi much higher than expected. Jump forward 4 years and those findings no longer look so strange when you consider their recent financial performance.

The concept of ranking also has its merits – if used differently. Some of the best R&D work I’ve seen over the past few years has shown the power of ranking within competitive sets at an individual customer level. Behavioural economics has shown us that humans don’t go around with a long list of pros and cons for all the options available to them – they have a predefined preference and their actions follow this. By understanding where your brand ranks within the competitive set at an individual level, you get a much clearer view of customer preference and of what is driving customer behaviour and your share of wallet.

Rankings can also be an excellent way of understanding how you compare across geographies and cultures. I’m sure most of you have, at some point, seen the same question give different results depending on the geographical focus of the survey. If all you have to go on is the rating of one brand, this can make accurate analysis tricky – especially as everyone wants to compare the different geographies, and there really is no way of knowing how much of the difference is real and how much is cultural. In that situation, being able to understand how your brand compares in a ranking makes life a lot easier – you can easily see whether what you’re doing is good, bad or indifferent.

However, at the end of the day I’m still not a fan of customer experience league tables for one key reason (touched on above) – they ignore the business strategy. Successful businesses define their brand position and a proposition appropriate for it, and then deliver an experience in line with that. They then pursue that strategy unswervingly – regardless of whether it puts them top or bottom of any rankings. They deliver to customer expectations and customers trust them accordingly.

If that strategy is to differentiate by excellent customer experience, e.g. First Direct, then being top of that league table is a real goal. However, what if that strategy isn’t to be the best? What if it’s to win on price/value, e.g. Aldi, or to use market factors to lock in your customers? Many businesses pursue those types of strategies successfully and profitably, but if we focus on improving their league position we will miss the point and incorrectly advise them on what they should be doing. Instead of congratulating them on how they manage their customer experiences to create vast profits, we’ll simply muddy the waters – or, worse, make ourselves irrelevant.

What does this mean in reality? Well, we can still look at the league tables, comment on who ranks where, discuss whether it matches our experience, and so on. But, crucially, we need to remember what our clients are trying to achieve and how they are trying to do so. Only when you overlay that consideration onto the league table can you detect whether there’s a clear misalignment – for better or worse – and that it’s time to start recommending action. And before we start recommending action, we need to go back to that brand proposition and look at what it promises!

Yet in conclusion, if I’m honest, I’m not sure it’s the league tables that are actually the problem. They are a symptom of the fact that the vast majority of customer experience consultants seem to believe in a utopian world of ever-improving customer experience scores.

We, as an industry, have to drop that view. We need to recognise that our clients live in a real world with limited budgets and brand propositions which are not always about ‘being the best’. That means we have to look at the behaviours they wish to create, and determine the level and features of a customer experience that delivers those. That’s a lot more nuanced, requiring a lot more thinking and careful interpretation – but it’s the only way we can truly add value to our clients, and prevent ourselves being irrelevant.

Simon Wood is Head of Customer Experience at TNS, UK
