Google Analytics for the Counselling Service – the limits of data
We’ve been looking at what automated tools can tell us about user behaviour on the Student Counselling Service site, as part of a suite of work we’ve been doing for Student Experience Services.
Analytics can only tell us so much. It can be fascinating to delve into, but is potentially a rabbit warren of useless information. So when we recently carried out a site review for the Student Counselling Service – using Sitebeam, Google Analytics, and manual review – it was important to first define what questions we were trying to answer.
What are the site objectives?
I won’t go into too much detail about the aims and findings in this project given its sensitive nature, but we can look at some high level aims.
For example, SCS want it to be as easy as possible for students who need help to get it, and to understand the process. What can Analytics tell us – or more accurately, suggest – that helps to measure this?
Comparison, not raw numbers
One thing we did was to set up a segment in Analytics of people who had visited the referral page, and compare it to those who hadn’t. The key thing is not the raw numbers of visitors, time on page and so on, but how the two segments compare. We can look for relative differences between the segments across several measures:
Number of visitors
This is a pretty raw statistic, but if one segment shows a markedly lower number, it gives a direction for further usability testing. Is it because people can’t find the form? Or because that’s not what they were looking for?
New v returning visitors
For example, in this study we discovered that those visiting the referral page were significantly more likely to be returning visitors. This suggests people visit once to explore the site, then come back later to actually refer themselves. It also exposes a flaw in treating the two segments as groups of different people: they are more likely the same people at different times.
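As a rough sketch of this kind of comparison, here is how you might compute the returning-visitor share for each segment from exported figures. All the numbers here are illustrative, not the real data from this study.

```python
# Hypothetical session counts, split into new vs returning visitors,
# for the referral-page segment and for everyone else.
# These figures are made up for illustration.

def returning_share(new_sessions, returning_sessions):
    """Fraction of sessions that came from returning visitors."""
    total = new_sessions + returning_sessions
    return returning_sessions / total if total else 0.0

referral_segment = {"new": 180, "returning": 420}
other_visitors = {"new": 3100, "returning": 1900}

ref_share = returning_share(referral_segment["new"], referral_segment["returning"])
other_share = returning_share(other_visitors["new"], other_visitors["returning"])

print(f"Referral segment returning share: {ref_share:.0%}")   # 70%
print(f"Rest of site returning share:     {other_share:.0%}") # 38%
```

A gap as wide as this (on made-up numbers) is the sort of relative difference worth following up with usability testing; the raw counts on their own tell you much less.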
Language and location
Are speakers of certain languages, or users from a particular area, more or less likely to visit the referral page? What can this suggest? What do we need to measure next to get a clearer picture?
Bounce rate
This is always a controversial one. A high bounce rate is not necessarily a bad thing: it shows people arriving on a certain page and leaving without clicking elsewhere, which could well mean they have just achieved their task.
A well-designed webform, however, need not leave us guessing at a bounce rate: it can be set up so Analytics makes clear who completed the form, not just who looked at it. In this way, page bounce rate can show us where to change the structure of a site so we can get better answers out of Analytics the next time user behaviour is reviewed.
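One common pattern here is to send successful submissions to a confirmation page, so the completion shows up as a second pageview rather than leaving the form view as a dead end. A minimal sketch of the resulting measurement, with hypothetical page paths and counts:

```python
# Hypothetical pageview counts. If submitting the form redirects to a
# confirmation page, completions become directly countable, instead of
# being hidden inside an ambiguous bounce rate.
pageviews = {
    "/counselling/refer": 600,        # referral form viewed
    "/counselling/refer/thanks": 210, # referral form submitted
}

def completion_rate(views, completions):
    """Share of form views that ended in a submission."""
    return completions / views if views else 0.0

rate = completion_rate(pageviews["/counselling/refer"],
                       pageviews["/counselling/refer/thanks"])
print(f"Form completion rate: {rate:.0%}")  # 35%
```

The design choice is the point: a small structural change to the site turns an ambiguous metric into a direct one.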
Taking down pages
The Analytics on pretty much any site will show a long neck and long tail. This can provide a great starting point for cleaning up your site. For the pages at the end of the long tail – those being viewed by very few people – one of two things will be true:
- They contain key content, but people aren’t looking at it. The chances are they can’t find it. The solution here is to do some restructuring of your site. Make sure the content itself is readable, so people know it’s what they were looking for.
- They contain irrelevant content. This one is more common. You can’t address on your site every possible query your users might have (or that you think they might have); trying to do so clutters the site and ultimately makes it less usable for the majority. Where you have pages few people access, containing information not relevant to top tasks, you’re much better off taking the content down.
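Finding the tail is mechanical: sort pages by pageviews and flag those receiving under some small share of total traffic. The paths, counts, and 1% threshold below are all assumptions for illustration; deciding which of the two cases above applies to each flagged page is still a human judgement.

```python
# Hypothetical pageview export. Paths and counts are made up.
pageviews = {
    "/": 12000,
    "/appointments": 4500,
    "/self-referral": 3000,
    "/groups/archive-2009": 40,
    "/newsletter/june": 15,
}

def long_tail(pages, share_threshold=0.01):
    """Pages whose individual share of total views falls below the threshold,
    least-viewed first."""
    total = sum(pages.values())
    return sorted(
        (path for path, views in pages.items() if views / total < share_threshold),
        key=pages.get,
    )

print(long_tail(pageviews))  # ['/newsletter/june', '/groups/archive-2009']
```

Each flagged page then gets reviewed: restructure and surface it if the content is key, or take it down if it isn’t.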
For more on this, see Neil’s post on getting to know your ‘long neck’.