I’ve just had one of those frustratingly familiar online survey experiences. I’d gone online to check out my BTOpenzone account and struggled to find any details of it on either the www.btopenzone.com or bt.com sites.
And then I got a popup asking about taking a survey about the website. As someone who talks about the benefit of understanding the customer, I generally take these – it’s a relatively short investment of my time and if it improves the customer service or understanding of the organisation, it’s worth it.
But this one merely reinforced one of the issues I’ve talked about recently with a number of organisations. Surveys and statistics rarely actually change anything – because they’re designed in a rational but misguided framework.
The questions were eminently sensible, yet they gradually wound me up further and further. What was I trying to do on the website? Was I successful?
I answered in full. Then it asked me twice more to explain why I wasn’t happy – and finally, what one thing would make me so?
So BT’s survey company makes no attempt to understand my thinking and my process; instead it builds its script to suit its own information needs, insisting that I fit my story within its structure and cognitive map – and do so in abstract terms and multiple-choice “radio buttons”.
And the result from my perspective? I was more angry at the end of the survey than I was at the end of the fruitless search on BT’s site.
And the result from BT’s perspective? Possibly another slight blip in their numbers, and a lot of wordage and explanation that’ll be lost in the survey results. And, I expect, no change in their actual understanding of the story from the customer perspective – the kind of understanding that might actually inform their website and process design.
If what they wanted was to understand how people use, or want to use, the website, this survey does nothing. If the aim was to improve customer satisfaction, this survey just raised the hurdle that little bit more.