A process of continuous improvement reduces support calls (not forms)
Our tech team recently did some great work for IS Helpline, creating a bespoke webform that directs users to self-serve before submitting an enquiry. The form itself, though, isn’t what will ultimately help reduce support calls—it’s an iterative process of user testing, editorial improvements and analysis.
The form is now the main point of contact for visitors to the IS and IT Help site. It presents users with a list of help pages for common problem areas before they can submit an enquiry, thus encouraging users to try to self-serve first.
While this is something that could be done with EdWeb forms, Helpline required a bespoke solution to capture technical details from the user that can help reduce the amount of time spent resolving a call.
With every user who successfully resolves an issue by reading a help page, the form cuts down on unnecessary enquiries. However, the form can only succeed at this if the help pages it links to:
- are clear, easy to follow and can actually help users resolve their issues
- reflect the most common problem areas
This isn’t a one-off action. To achieve it, we’ve been working with Helpline to continually improve their support materials through an iterative process of testing, revising and reviewing.
Each iteration of work we do with Helpline kicks off by watching students (our target audience) navigate Helpline’s webpages to complete various tasks, specifically tasks that generate frequent enquiries despite already being covered by help pages.
At the end of each usability testing session, we draw up a list of the top usability issues we saw participants experience.
These issues then inform the editorial improvements we make.
We’ve been making use of pair writing to improve the usability of Helpline webpages. This involves Gavin from the Helpline team and me sitting down together to make changes to their site, which saves time compared with the traditional method of passing drafts back and forth until they’re ready to be published.
The changes we’ve been making have included:
- cutting down massive lists of FAQs
- front-loading keywords at the start of headings and links so users scanning the page can spot them easily
- breaking down long pages into sections with clear, focused subpages
- changing link text to reflect the title of their destination pages
Once we make editorial improvements, we can then monitor how users are interacting with these pages.
Analytics & analysis
We use Google Analytics to map user journeys through the Helpline form, which includes looking at:
- How many people arrive at the support form?
- Which help pages do they go to next?
- Do they then go back to the form to submit an enquiry?
With this, we can get an idea of which pages are or are not helping users self-serve.
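The journey mapping above can be sketched in code. The snippet below is a minimal illustration, not our actual analytics setup: the page paths and session data are hypothetical, and in practice this information comes from Google Analytics reports rather than a hand-written script.

```python
# A minimal sketch of the funnel we look at: of the visitors who arrive at
# the support form, how many go on to a help page, and how many of those
# come back to the form to submit an enquiry anyway.
# The URLs and sessions below are hypothetical, for illustration only.

SUPPORT_FORM = "/helpline/contact-form"
HELP_PREFIX = "/helpline/help/"

# Each session is the ordered list of pages one visitor viewed.
sessions = [
    [SUPPORT_FORM, HELP_PREFIX + "wifi"],                 # read help, self-served
    [SUPPORT_FORM, HELP_PREFIX + "email", SUPPORT_FORM],  # read help, submitted anyway
    [SUPPORT_FORM],                                       # submitted without reading help
]

def funnel(sessions):
    """Count arrivals at the form, onward visits to help pages,
    and returns to the form after reading help."""
    arrived = viewed_help = returned = 0
    for pages in sessions:
        if SUPPORT_FORM not in pages:
            continue
        arrived += 1
        after = pages[pages.index(SUPPORT_FORM) + 1:]
        help_hits = [i for i, p in enumerate(after) if p.startswith(HELP_PREFIX)]
        if help_hits:
            viewed_help += 1
            # Did they come back to the form after the first help page?
            if SUPPORT_FORM in after[help_hits[0] + 1:]:
                returned += 1
    return arrived, viewed_help, returned

print(funnel(sessions))  # (3, 2, 1) for the sample sessions above
```

A high `viewed_help` count with a low `returned` count suggests the linked pages are helping users self-serve; the reverse suggests the pages need editorial attention.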
We also need to measure which help pages are most popular, how many support calls are coming through that could have been self-served, and what those calls were about. In other words, we’re looking for the ‘long neck’: the small number of things that are significantly more important than the rest.
This is to ensure we’re dealing with the help pages that cost Helpline the most in staff time. It also means that the help pages linked to in the form are not stagnant: they change as trends in support calls change. If Helpline starts to receive a large number of calls about topic X, topic X should be linked to from the form.
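Finding the ‘long neck’ is essentially a Pareto-style count. As a hedged sketch, with made-up call topics and numbers (not real Helpline data), it might look like this:

```python
from collections import Counter

# Hypothetical call log: one topic label per support call.
call_topics = (
    ["wifi"] * 40 + ["password reset"] * 30 + ["email setup"] * 12
    + ["printing"] * 10 + ["vpn"] * 5 + ["software licence"] * 3
)

def long_neck(topics, share=0.8):
    """Return the smallest set of topics that together account for
    at least `share` of all calls, most frequent first."""
    counts = Counter(topics)
    total = len(topics)
    covered, neck = 0, []
    for topic, n in counts.most_common():
        neck.append(topic)
        covered += n
        if covered / total >= share:
            break
    return neck

print(long_neck(call_topics))  # ['wifi', 'password reset', 'email setup']
```

In this invented sample, three topics account for over 80% of calls, so those are the topics whose help pages should be linked from the form and prioritised for editorial work.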
We then finish each iteration by conducting another round of usability testing, which starts the process all over again: we meet to watch users interact with the improved materials and see which problem areas remain and which new ones surface.
Interested in this process?
You can read more about our work with Helpline on this blog.
If you have any questions about the work we’re doing or are interested in doing something similar for your department, get in touch.