Most of our clients are familiar with the work we do around staff engagement and leadership development. Increasingly though, we have also been conducting what we call ‘service quality’ surveys. Such surveys are usually focused on the experience of clients, customers or stakeholders, who can be internal or external to the organisation. These types of surveys differ from more typical market research in that we are looking at existing customers or service consumers, and uncovering useful insights around how to improve the service provided. External examples of surveys we’ve conducted include family and carer surveys for aged care and disability service providers, and member and stakeholder surveys for community organisations. Internally, issues of service quality often emerge in our engagement surveys as dissatisfaction with cross-unit cooperation, or with internal supports such as technology and facilities. This can result in more focused surveys, for example, looking at satisfaction with facilities management across various sites within an organisation.
Measuring IT service quality
One of the larger initiatives we’re involved in looks at satisfaction with the quality of service provided by internal IT support teams. Like most of our surveys, this is a standardised product that clients can use “off the shelf”, or tweak and adjust to meet their needs. Sometimes referred to as an “IT Customer Survey”, we’ve been using the tool primarily in universities, with 40 having taken part across Australia and New Zealand. The results allow us to benchmark different organisations on the various measures, and to draw insights about service quality across the sector. However, the tool can be used by any reasonably large organisation looking to improve the standard of service provided by its IT support teams.
We often find that the image of IT as a whole is not as great as the sum of its parts. That is, when asked about their initial impression of IT support, people tend to provide lower ratings than when they are asked about their specific experiences with support staff. This often points to a need to ‘sell’ the function and its achievements better: impression management is key when most customers only interact with you when they have a problem.
Similarly, people’s first contact with IT support is often over the phone, and we find that the helpfulness, understanding, and analytical skills of the phone support staff can have a large impact on this impression. Another key item to consider is the response time when a support issue is raised. Whilst faster is generally better (particularly for urgent or high-impact issues), speed also brings a corresponding level of cost. Perhaps surprisingly, in many organisations we find the response time is actually faster than it needs to be, and that customers remain highly satisfied even when the response to urgent problems takes longer (e.g. by the next day) than the typical expectation or service level agreement (say, four hours). Hence, in many cases it’s better to allocate those resources to improving quality rather than speed.
Finding the focus
Our approach to measuring service quality is to try and understand the factors that drive satisfaction within your organisation. For example, in addition to high-quality phone support, we often find that one of the key drivers of overall satisfaction is having an effective online (i.e. email or support ticket) request process. In contrast, the quality of the experience for customers receiving remote desktop support tends to have less of an impact, even though many organisations rely strongly on this approach as a cost-saving measure. And, sometimes, it’s just about making sure the wifi works!
Communicating the results
In order to improve service quality and drive change, it’s important that the results are shared with, and accepted by, the people providing the service. Being able to cut the data by different service streams, locations, or other variables can be crucial here in determining exactly where and how problems are arising. Similarly, combining this approach with meaningful text comments from the relevant groups and areas can really shed light on the issues that customers are facing. At the same time, it’s important that people don’t get bogged down in statistics or focus on results that aren’t having a high impact. One of the challenges in communicating such results is dealing with defensiveness and overcoming ‘perfectly reasonable’ explanations of why poor results have occurred. When we present such results in a workshop or briefing session, we find that shifting people’s thinking to how an issue makes the customer feel (after acknowledging the technical reason that caused it) can be a useful approach to achieving positive change.
Whatever drives service quality in your organisation, whether it’s your main business offering or an internal function, we have the tools and expertise to help you understand it and improve the way your business operates. For more information about Voice Project’s Service Quality Surveys, you can email me at firstname.lastname@example.org or contact our office on 1800 886 423 or email@example.com.