Collecting feedback and improving user experience

We described back in June how we were testing and using feedback to inform the content on the site in its beta stage.

This has continued post-launch. We’ve been gathering comments from customer service staff, services and website users. The site features a short survey (only four questions) asking users what they came to the site to do and whether they found what they were looking for.

Despite our careful proofing before publishing pages, this feedback has highlighted some ‘quick fixes’ such as spelling errors and broken links, as well as more substantial suggestions on the site design and navigation. We’re logging all comments on Trello, assigning them to team members to action and archiving them when complete.
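The log-assign-archive workflow described above can be sketched in a few lines of Python. This is a hypothetical illustration of the process (the real board lives in Trello); the item fields and helper names are made up:

```python
# Minimal sketch of the feedback workflow: log a comment,
# assign it to a team member, archive it when complete.
feedback_log = []

def log_comment(text, category):
    """Record a new piece of feedback as an open, unassigned item."""
    item = {"text": text, "category": category,
            "assignee": None, "archived": False}
    feedback_log.append(item)
    return item

def assign(item, team_member):
    item["assignee"] = team_member

def archive(item):
    item["archived"] = True

# A 'quick fix' item moving through the workflow:
fix = log_comment("Broken link on the libraries page", "quick fix")
assign(fix, "digital officer")
archive(fix)

# Anything not yet archived still needs action.
open_items = [i for i in feedback_log if not i["archived"]]
```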

We’ve also been using Hotjar – a (paid-for) tool that measures user behaviour – to monitor how the pages are being used. From the heat maps it provides, we can see the most popular areas of a page and how users are scrolling and clicking through the site.

Hotjar screenshot

One example of how I’ve used this information is on the Rufford Abbey and Sherwood Forest Country Parks pages, where I could see that viewing the car parking charges was a hot area of activity. Although the charges were in a prominent position on the page, the information was only available as a PDF download. When I needed to create a new page for the parks’ festive opening hours, it gave me an opportunity to improve the content and create calendar views for the car parking charges.

I’ve also been using the Hotjar recordings to see how users navigate our what’s on/events listings. Being able to see how users on different devices and browsers move through this section of the site has allowed our team to make improvements, such as reducing the default number of events shown when browsing on mobile to reduce the scrolling length.
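The mobile change above boils down to picking a smaller first-page size for smaller screens. A minimal sketch of that logic in Python – the device names and page sizes here are made-up illustrations, not our actual configuration:

```python
# Hypothetical defaults: show fewer events per page on smaller devices
# so users reach the pagination controls with less scrolling.
DEFAULT_PAGE_SIZE = {"mobile": 5, "tablet": 10, "desktop": 20}

def events_to_show(events, device):
    """Return the first page of events for the given device type,
    falling back to the desktop page size for unknown devices."""
    size = DEFAULT_PAGE_SIZE.get(device, DEFAULT_PAGE_SIZE["desktop"])
    return events[:size]

events = [f"event-{i}" for i in range(30)]
mobile_page = events_to_show(events, "mobile")     # shorter list
desktop_page = events_to_show(events, "desktop")   # full default
```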

These Hotjar tools do have limitations: you can’t interact with the user or ask any follow-up questions as you can when user testing in person. However, there’s also less chance that you will influence their behaviour. For our team, it has been an effective method to gather a significant amount of data about users’ actions and opinions of the website, which we are using to improve the overall user experience.

Posted by Lucy Pickering, Digital Content Officer

Taking a content inventory

In order to plan the steps we need to take toward building new online services, we need to know what we already have, where it’s coming from and who’s using it at the moment. This means one of the first tasks we’ve turned our attention to is a content inventory of the current website.

This isn’t a small undertaking – the current website is a hefty 25,000 pages and draws content from or provides a gateway to a number of online systems (both in-house developed and managed web apps and third party systems). We need to know what’s there, where it’s coming from, who’s looking at it and how it fits with access through other channels.

As Kristina Halvorson explains (in her book Content Strategy for the Web),

If you don’t know what content you have now, you can’t make smart decisions about what needs to happen next.

The first stage of this is a content inventory or audit – an Excel spreadsheet which captures all of this information.

What did we do?

Carrying out the audit has been a whole-team effort. Digital officers have worked through the current website and captured the structure and some basic data about each page or link. They’ve added web stats for each page – how many visitors it’s had in the last 12 months, how many of those were using non-desktop devices, and what percentage of the total is internal traffic.
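The per-page stats columns described above can be derived from raw visit records with a simple aggregation. A minimal sketch, using made-up page names and visit records rather than our real analytics data:

```python
from collections import defaultdict

# Hypothetical raw visit records; real figures would come from
# an analytics export covering the last 12 months.
visits = [
    {"page": "/libraries", "device": "mobile", "internal": False},
    {"page": "/libraries", "device": "desktop", "internal": True},
    {"page": "/libraries", "device": "tablet", "internal": False},
    {"page": "/roads", "device": "desktop", "internal": False},
]

# Aggregate the three audit columns per page:
# total visits, non-desktop visits, internal visits.
stats = defaultdict(lambda: {"total": 0, "non_desktop": 0, "internal": 0})
for v in visits:
    row = stats[v["page"]]
    row["total"] += 1
    if v["device"] != "desktop":
        row["non_desktop"] += 1
    if v["internal"]:
        row["internal"] += 1

for page, row in sorted(stats.items()):
    pct_non_desktop = 100 * row["non_desktop"] / row["total"]
    pct_internal = 100 * row["internal"] / row["total"]
    print(f"{page}: {row['total']} visits, "
          f"{pct_non_desktop:.0f}% non-desktop, {pct_internal:.0f}% internal")
```

Each printed row corresponds to one line of the inventory spreadsheet.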

Around this, extra information has been supplied by colleagues in IT about where the content currently lives (much of it is in our content management system, but some is not). From Customer Insight, we can begin to build a picture of contact about specific services through other channels (and know where additional content to enable this is stored, for example in a customer relationship management system).

What have we found out?

We now have a detailed map of our current content and a high-level view of how it’s being used.

We’ve confirmed some things we already knew – which areas of the site are popular, how big the site has grown, and that the many means of navigating the site can lead to dead ends or multiple routes to the same content.

By capturing information on which parts of our site are aimed at specialist audiences, and matching this with information on the origin of visitors and contact through other channels, we’re beginning to understand which content we might want to handle or locate differently as we move toward the new website.

We’ve also been able to see ‘hot spots’ on the site where the majority of visits are coming from non-desktop devices (such as smartphones and tablets). We knew that overall traffic from these devices accounted for around 58% of visits, and that our current site isn’t responsive, so it was probably offering a pretty poor experience for them. What we’ve seen through looking closely while carrying out the audit is that some pages receive up to 80–90% of their visits from non-desktop devices. This information will help to focus research, design and testing on the new website.
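Flagging those hot spots is just a threshold check over the inventory’s per-page figures. A minimal sketch, with invented page names and numbers (the 80% threshold mirrors the range mentioned above):

```python
# Hypothetical per-page figures; the audit found some pages
# at 80-90% non-desktop traffic against a ~58% site-wide average.
page_stats = {
    "/bus-timetables": {"visits": 12000, "non_desktop": 10500},
    "/country-parks": {"visits": 8000, "non_desktop": 6900},
    "/committee-papers": {"visits": 3000, "non_desktop": 600},
}

def non_desktop_share(row):
    """Fraction of a page's visits that came from non-desktop devices."""
    return row["non_desktop"] / row["visits"]

# Pages where at least 80% of visits are non-desktop are 'hot spots'
# to prioritise in responsive design research and testing.
hot_spots = [page for page, row in page_stats.items()
             if non_desktop_share(row) >= 0.8]
```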

What will we do next?

We’ll be using the content inventory as a base document from which we’ll take information to carry out specific tasks (such as card sorting and other exercises to inform the structure of the new site), but we’ll also be expanding it so we know more about the content.

While most of the current inventory is quantitative, we’ll be expanding the qualitative side – analysing the effectiveness, as well as the accuracy and currency, of the content we have. Much of this will be done as we rebuild services or areas of content for the new site, but capturing it, and then building auditing into our ongoing process of managing the new digital services, will help us make informed and achievable recommendations about the online services we provide.

(Posted by Sarah Lay, Senior Digital Officer)