philwilson.org

Customer satisfaction and happiness

14 February, 2011

This post was originally published on The University of Bath's Web Services Blog.

So, everyone's favourite web company, 37signals, have started to publish their customer happiness ratings at smiley.37signals.com. People can rate them as "great", "OK" or "not so good". At the time of writing, they have an 87% "great" rating, 5% "OK" and 8% "not so good".

Visit the 37signals Happiness Report

Part of the work Web Services do (alongside everything else) is to act as frontline web support for internal and external users. When a ticket was marked as "closed" between June 2010 and January 2011, the submitter was asked to rate both us and the solution we provided. In those eight months we dealt with 1,251 support tickets, but received feedback on only 118 of them (just over 9%).

For both questions, we could be rated at one of five levels: very dissatisfied, dissatisfied, neutral, satisfied or very satisfied.

Problem solved satisfaction levels

Interaction with Web Services satisfaction levels

For the rating on the solution we provided, if we're harsh and treat "satisfied" as "neutral" and "dissatisfied" as "very dissatisfied", we end up with scores of 71% positive, 23% neutral and 6% negative. If we're a bit more generous and count "satisfied" as "very satisfied" instead (clearly distinguishing between positive, neutral and negative), we score 86%, 8% and 6%. The results for the rating on the interaction with Web Services are very similar, and are about the same as 37signals'!
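The two readings above can be sketched in a few lines of Python. The raw counts below are hypothetical (the post only reports percentages), chosen purely so that they round to the 71/23/6 and 86/8/6 splits over 118 responses:

```python
# Hypothetical raw counts for 118 responses, consistent with the reported
# percentages but NOT the real survey data (which only survives as rounded %).
counts = {
    "very satisfied": 84,
    "satisfied": 18,
    "neutral": 9,
    "dissatisfied": 4,
    "very dissatisfied": 3,
}
total = sum(counts.values())  # 118

def pct(n):
    """Percentage of all responses, rounded to the nearest whole number."""
    return round(100 * n / total)

# Harsh reading: "satisfied" counts as neutral,
# "dissatisfied" counts as very dissatisfied.
harsh = (
    pct(counts["very satisfied"]),
    pct(counts["satisfied"] + counts["neutral"]),
    pct(counts["dissatisfied"] + counts["very dissatisfied"]),
)

# Generous reading: "satisfied" counts as very satisfied.
generous = (
    pct(counts["very satisfied"] + counts["satisfied"]),
    pct(counts["neutral"]),
    pct(counts["dissatisfied"] + counts["very dissatisfied"]),
)

print(harsh)     # (71, 23, 6)
print(generous)  # (86, 8, 6)
```

The point of the exercise is that the headline number swings by 15 percentage points depending entirely on which bucket "satisfied" is assigned to.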

We stopped asking for feedback because we just weren't analysing the data enough. On their Happiness Report, 37signals say "We study these ratings to help us improve", and although we should have been doing that too, we didn't: the feedback rate was just too low to be a strong motivator. We could still do some analysis on the data we have (people were also able to leave a comment) and see where we could improve, particularly on the interaction front, but for the problem-resolution ratings to be actionable, the feedback rate would need to go way up so that we could spot trends more easily. We didn't measure how many people actually clicked the link in our email (which would tell us how many were interested but put off by the form), but perhaps a simpler question would help?

They have a very different audience from ours, but which lessons should we learn from the 37signals approach to feedback? What do you think?

This post is part of the University of Bath collection.

See other posts tagged with communication, tools, customers, happiness, satisfaction and support, and all posts made in February 2011.