Testing our design concepts

We’ve just completed the next step in our journey toward the new nottinghamshire.gov.uk website – usability testing of our initial three design concepts.

The three design prototypes challenged our assumptions about user behaviour and sought to discover which elements really help people complete their visits to our site efficiently and satisfactorily.

Dr Emily Webber and Dr Vicky Shipp of The Insight Lab provide this detailed post on how we approached this phase of the build and testing.

~

Why do user testing?

When designing a website it is important to consider the different types of people who will be visiting it, and the kind of experience they will have. A really effective way to do this is user testing, where we observe people carrying out set tasks on a website or prototype to understand what they do and what they think. This helps us to assess their reactions to the website, to spot anything they find particularly difficult to use, and in turn to see how it could be improved.

User testing can be carried out on just one design, but the benefit of comparing several is that people can tell us which features they like or find easiest to use across a range of options, and we can observe which elements work well and which don’t. This helps to identify the strengths and weaknesses of the different designs so that we can make a more informed decision about future designs.

To help with the development of the new Nottinghamshire County Council (NCC) website, The Insight Lab were asked to run user testing to evaluate three prototype interface designs.

These three in-browser prototypes were created especially for the testing and included a range of different aesthetic and functional elements to allow for a rich experience that aimed to mirror the use of a fully functional site. Subtle differences across the designs, such as the location and function of menus, the placement of key pieces of content, and the colours used, would let us see how these choices affected people’s performance and experiences, and understand which elements allowed for more efficient and effective task completion.

Who did the testing and what did it involve?

Using the previously identified personas as a starting point, we recruited participants across a range of age groups and locations, and with different levels of computing experience. This ensured that the sample of people who took part in the study, and their feedback and performance, were representative of the actual users of the NCC website.

The individual testing sessions lasted just under an hour, and participants were asked to use all three of the interfaces to complete a series of common tasks, such as finding out about school term dates, viewing local events, and reporting a pothole. We made sure that the order in which people completed the tasks and used each interface was randomised so that the results wouldn’t be affected by people becoming familiar with them (we call these practice effects). In this way we could be sure that the results we got were as unbiased and as accurate as possible.
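For the technically curious, the sketch below (in Python) shows one common way of setting up this kind of counterbalancing: rotating through every possible interface order across participants, and shuffling the task order afresh for each interface. The task names, design labels and rotation scheme are illustrative assumptions, not a record of exactly what we ran in the sessions.

```python
import itertools
import random

# Hypothetical labels, for illustration only
tasks = ["school term dates", "local events", "report a pothole"]
interfaces = ["Design A", "Design B", "Design C"]

def session_plan(participant_id: int):
    """Build one participant's session plan: the interface order rotates
    through all possible orders across participants (full counterbalancing),
    and tasks are shuffled afresh for each interface to limit practice effects."""
    orders = list(itertools.permutations(interfaces))
    interface_order = orders[participant_id % len(orders)]
    plan = []
    for design in interface_order:
        task_order = tasks[:]
        random.shuffle(task_order)  # fresh task order per interface
        plan.append((design, task_order))
    return plan

for p in range(3):
    print(f"Participant {p + 1}: {session_plan(p)}")
```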

To allow us to understand exactly what people did, we used software to record what was happening on the screen – where people were clicking, hovering, and the route they took to complete each task. This software also captured a video of people’s reactions to the tasks via a webcam mounted on the monitor, so that we could record when people were particularly confused or confident and map these reactions to the use of particular interface elements. We also recorded a variety of metrics, such as how long it took people to complete the tasks, how many times they needed to click, and which navigation elements they used.

At the end of the testing sessions we asked participants to talk about their experiences, identify their preferred elements across the three designs, and fill in a quick survey that measured how usable they found each interface. Capturing this wide variety of data ensured rich and insightful results.
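The survey isn’t named above; a very common choice for this kind of comparison is the System Usability Scale (SUS), which turns ten 1–5 responses into a 0–100 score, with scores above roughly 68 usually read as above average. Purely as an illustration of how such a score is computed, and on the assumption that a SUS-style questionnaire was used, here is a minimal scoring sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5
    Likert responses; odd-numbered items are positively worded and
    even-numbered items negatively worded."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses in the range 1-5")
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# Example: a fairly positive set of responses scores 80.0
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))
```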

Making sense of the data

The sessions generated a lot of data, and we needed to spend time analysing it to make sense of it. This is one of the most enjoyable parts of our job as we begin to explore and uncover meaningful insights.

In this scenario we looked at the average task completion times across the interfaces, the usability score of each interface, and the different ways that people completed the tasks.

This not only allowed us to see which design performed the best overall, but also which design components people used and/or struggled with, and which tasks were more challenging than they should be.
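As a simple illustration of the first of those measures, here is a sketch of how average task completion times might be aggregated per interface and per task from the recorded sessions. The data values and labels are made up; only the general approach is what matters.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical recorded observations: (interface, task, seconds to complete)
sessions = [
    ("Design A", "school term dates", 42.0),
    ("Design B", "school term dates", 61.5),
    ("Design C", "school term dates", 55.0),
    ("Design A", "report a pothole", 88.0),
    ("Design B", "report a pothole", 73.5),
    ("Design C", "report a pothole", 95.0),
]

by_interface = defaultdict(list)
by_task = defaultdict(list)
for interface, task, seconds in sessions:
    by_interface[interface].append(seconds)
    by_task[(interface, task)].append(seconds)

# Average completion time per interface, then per interface/task pair
for interface, times in sorted(by_interface.items()):
    print(f"{interface}: {mean(times):.1f}s on average")
for (interface, task), times in sorted(by_task.items()):
    print(f"  {interface} / {task}: {mean(times):.1f}s")
```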

What did we find out?

Through our analysis we found that although all three interface designs were rated as having above average usability, opinions about the preferred interface varied. We were able to identify the elements that were frequently used and allowed for efficient task completion, or that were particularly well received, and these should remain in the future design.

However, this does not mean that other interface elements should not be included, as they were still important to some users. What is key is combining these elements in ways that do not inhibit or distract other site visitors.

Our findings have produced a set of recommendations that NCC can use to ensure that future designs meet the various needs of people using the website. As the aim of the testing was not to choose one interface, but to understand the way people used different features of the designs, future designs are likely to involve incorporating aesthetic and navigational elements from each of the three prototypes.

The Insight Lab is an expert-led consultancy, implementing user-focused research methods to drive the design of digital products and services that are simple, efficient and a pleasure to use. Find out more about them on their website.

Card sorting: working out how to navigate our services online

One of the challenges we’re tackling as we build a new user-centred nottinghamshire.gov.uk is how to organise access to, and information about, our services.

We’re researching how people find their way to our website, whether and how they move around it once they’re there, and how we can make our navigation intuitive so it supports their behaviour. We’ve delved into taxonomies and are investigating both on-site and external search. We’ve used data and analytics, and undertaken user research too.

One of the activities we’ve carried out as part of this discovery is holding open card-sorting workshops with a range of our users. We contracted The Insight Lab to carry out this work with us and below their Head of Research, Dr Emily Webber, reveals the why, what and how of card sort workshops.

~

When The Insight Lab first connected with Nottinghamshire County Council, the Council was looking for a user experience consultancy to run a series of card-sorting workshops to help inform the redevelopment of the content structure of nottinghamshire.gov.uk, as part of its Digital First project.

Card-sorting is a simple but incredibly effective way of gaining valuable insight into how different types of users group information and content. That insight informs the design of an information architecture (IA) in which information is structured intuitively, so users can quickly and easily find what they are looking for.

There are a variety of card-sorting techniques which can be used for this purpose, but for this project we decided on an open card-sort approach. This requires users to sort cards containing website content into groups that make sense to them, and to give each group a title that summarises the cards within it. Findings from this process then feed into further research and validation methods, and form a strong foundation of evidence for a user-centred site structure that meets the expectations of those using it.

We were really excited to take on this project, particularly because of some of the unique challenges it presented, such as the diverse range of users that the website must cater for and the large and varied amount of content to be presented. Participants for the workshops were therefore recruited from across the county (we ran workshops across Nottinghamshire from Worksop to West Bridgford) and came from a range of backgrounds and levels of computer experience. Content for the card sort was carefully selected to reflect the varied types of information and the plethora of services available.

Following an audit of current content and consideration of existing documentation, such as priorities and key user journeys, 61 cards formed the basis of the card sort. During the workshop, each participant first sorted these cards (which had a title printed on one side and a description of the content on the other) into groups that made sense to them, and then gave each group a heading using a Post-It note. For example, a participant may have grouped cards such as ‘Studying’ and ‘Apply for a School Place’ under a heading which they titled ‘Education’ or ‘Schools’.

Participants were also encouraged to indicate any sub-groupings, as well as any cards they felt fell into more than one category (more Post-Its!) – for example, ‘Report a Pothole’ may have been grouped under a transport heading, but then also linked to a ‘Report a problem’ group.
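To give a feel for what each workshop produces as data, the sketch below shows one way a single participant’s open sort could be written down, including sub-groups and cards flagged as belonging to more than one category. The group names, the extra card title and the structure itself are illustrative assumptions rather than the format actually used in the workshops.

```python
# One hypothetical participant's open card sort, recorded as plain data.
# Top-level keys are the participant's own Post-It headings.
participant_sort = {
    "Education": {
        "cards": ["Studying", "Apply for a School Place"],
        "sub_groups": {"Schools": ["Apply for a School Place"]},
    },
    "Transport": {
        "cards": ["Report a Pothole", "Bus Timetables"],  # hypothetical cards
        "sub_groups": {},
    },
}

# Cards this participant felt belonged in more than one category
cross_links = {
    "Report a Pothole": ["Transport", "Report a problem"],
}

for group, detail in participant_sort.items():
    print(f"{group}: {', '.join(detail['cards'])}")
for card, groups in cross_links.items():
    print(f"{card} also linked to: {', '.join(groups)}")
```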

Although this data alone provides valuable insight into a user-focused IA, we wanted to support the findings further with rich, in-depth feedback from participants. Following the individual card sorts, participants were therefore led in an open discussion exploring, for example, what they had found particularly difficult or easy to sort, cards which they felt were missing, and issues with labelling and understanding.

Results from the card sort were then supplemented with points arising from the post-sort discussion to provide rich insight and outline actionable recommendations for the creation of a user-focused site IA.
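The post doesn’t go into how the individual sorts are combined, but a standard analysis for open card sorts is a co-occurrence (or similarity) count: how often each pair of cards ends up in the same group across participants, with high counts suggesting content that users expect to find together. The sketch below shows that general technique with made-up data; it isn’t a reconstruction of The Insight Lab’s actual analysis.

```python
from collections import Counter
from itertools import combinations

# Hypothetical results: one dict per participant mapping their own
# group labels to the cards they placed in that group.
participant_sorts = [
    {"Education": ["Studying", "Apply for a School Place"],
     "Getting around": ["Report a Pothole", "Bus Timetables"]},
    {"Schools": ["Apply for a School Place", "Studying"],
     "Report a problem": ["Report a Pothole"],
     "Transport": ["Bus Timetables"]},
]

# Count how often each pair of cards is grouped together
pair_counts = Counter()
for sort in participant_sorts:
    for cards in sort.values():
        for a, b in combinations(sorted(cards), 2):
            pair_counts[(a, b)] += 1

for (a, b), n in pair_counts.most_common():
    print(f"{a!r} and {b!r} grouped together by {n} participant(s)")
```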

The card-sorting workshops have proved an invaluable exercise in gaining insight into how Nottinghamshire residents perceive Council services, and how they understand and group content. The results will provide an excellent base for future work into the re-development of the Council website and its underlying information architecture.

Research methods including closed-card sorts and tree testing could be used to provide additional insight to support and extend the findings of these initial workshops, with results from all sources then feeding into a new user-focused Council website, where visitors can quickly and easily find the information they are looking for.

Dr Emily Webber is Head of Research at The Insight Lab, an expert-led consultancy, implementing user-focused research methods to drive the design of digital products and services that are simple, efficient and a pleasure to use. Find out more about them on their website.

~

We’re now looking at our next steps for designing a clear information architecture for our website, and we’ll update you on this as we go!

Thanks to all who took part in the workshops. If you’d like to get involved with testing as part of our Digital First work then you can find out more here.