Collecting feedback and improving user experience

We described back in June how we were testing and using feedback to inform the content on the site in its beta stage.

This has continued post launch. We’ve been gathering comments from customer service staff, services and website users. Nottinghamshire.gov.uk features a short survey (only four questions) asking users what they came to the site to do and whether they found what they were looking for.

Despite our careful proofing before publishing pages, this feedback has highlighted some ‘quick fixes’ such as spelling errors and broken links, as well as more substantial suggestions on the site design and navigation. We’re logging all comments on Trello, assigning them to team members to action and archiving them when complete.

We’ve also been using Hotjar – a (paid-for) tool that measures user behaviour – to monitor how the pages are being used. From the heat maps it provides, we can see the most popular areas of a page and how users are scrolling and clicking through the site.
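As an aside for the technically minded: a heat map of this kind is, at its core, just click coordinates bucketed into a grid and counted. The sketch below is a toy illustration of that idea in Python (it is not how Hotjar itself works, and the coordinates are made up):

```python
from collections import Counter

def click_heatmap(clicks, cell_size=100):
    """Bucket (x, y) click coordinates into grid cells and count
    clicks per cell - the raw data behind a heat map overlay."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell_size, y // cell_size)] += 1
    return grid

# Example: three clicks clustered near the top-left, one far down the page
clicks = [(40, 30), (60, 80), (90, 55), (120, 950)]
hot = click_heatmap(clicks)
hottest_cell, count = max(hot.items(), key=lambda kv: kv[1])
print(hottest_cell, count)  # the top-left cell holds the three clustered clicks
```

In a real tool the counts per cell are then rendered as colour intensity over a page screenshot; the hottest cells (like the car parking charges example below) are where attention should go first.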

Hotjar screenshot

One example of how I’ve used this information is on the Rufford Abbey and Sherwood Forest Country Parks pages, where I could see that viewing the car parking charges was a hot area of activity. Although they were in a prominent position on the page, the information was only available as a PDF download. When I needed to create a new page for the parks’ festive opening hours, this gave me an opportunity to improve this content and create calendar views for car parking charges.

I’ve also been using the Hotjar recordings to see how users interact with our what’s on/events listings. Being able to see how users on different devices and browsers navigate this section of the site has allowed our team to make improvements, such as reducing the default number of events shown when browsing on mobile to reduce the amount of scrolling.

These Hotjar tools do have limitations; you can’t interact with the user or ask any follow-up questions as you can when user testing in person. However, there’s also less chance that you will influence their behaviour. For our team, it has been an effective way to gather a significant amount of data about users’ actions and opinions of the website, which we are using to improve the overall user experience.

Posted by Lucy Pickering, Digital Content Officer

What are words worth?

Those of a certain age may recall the song ‘Wordy Rappinghood’ by American new wave band Tom Tom Club. Although I’ve defined Tom Tom Club as new wave, the track itself was less genre-specific, challenging the perceptions of its early ’80s audience with a heavy rap and funky disco influence.

Lyrically it also presents a challenge, especially if you overlook the rhyme and dexterity of Tina Weymouth’s vocal and actually begin to think about what she’s saying:
“What are words worth?” is the oft-repeated refrain. As a line in a song it’s easily overlooked but taken in context with the rest of the lyric and given some real thought you’re left thinking, well, what are words worth? What are words really worth?

It’s a question we in the digital team at Nottinghamshire County Council are continually asking ourselves as we develop content in the new website and it’s a thought process that, as I found the other day, can get the grey cells ticking over at the most unexpected of times.

Out for a lunchtime walk in the sunshine, I passed a cake shop (yes, colleagues, you read that right, I passed a cake shop!) with a small notice pinned in the window ‘Back in 30 minutes. Out on a delivery.’

My immediate thought was that the two sentences were the wrong way round. In my head the natural order began with the ‘where’ rather than the ‘when.’ But my digitised self then took over and quickly rationalised the thought process. What is it that the consumer (user) wants to know? The shop is closed, so what is most important to them? Is it the fact that the owner is out on a delivery, or is it the knowledge that the shop will be open again in 30 minutes?

Leaving aside for the moment the fact that the proprietor’s efforts to be helpful were immediately undone by the lack of an indicated start time to the 30-minute timeframe, the realisation was that the wording on the note, or in this case, more specifically, the order of the wording, was correct. Had I been visiting the shop, I would have wanted to know when it was reopening. I didn’t really care why it was closed. So, the order was right, but the ‘Missing words’ (“It just don’t make sense, the way you did the things you did” – there I go again, showing my age with musical references from my childhood) meant that unfortunately the shop ultimately failed its user test.

And it’s that way of thinking we’re continually engaged in as we review and rewrite content. We’re questioning each and every word, the way in which the words are phrased, the order in which they’re written, what’s needed and what’s not, all with the ultimate aim of enhancing the user experience and making the site as easy to use as possible. Is there a value to using the word? Is it the right word? Does including it make the overall content easier to understand?

“Hurried words, sensible words, words that tell the truth, cursed words, lying words, words that are missing the fruit of the mind”, sings Ms Weymouth (in French – more creativity in word usage!) as the song continues to provide the English semantics students amongst us with much to ponder.

Our aim at Nottinghamshire County Council is to build a website which provides that fruit and feeds the mind, leaving the user with a nutritional experience. That, to us, is what words are worth.

Posted by Andy Lowe, Senior Digital Officer.

A problem with behaviour OR Usability informing your content

Post-It notes on piece of paper

We blogged back in April about our work in usability testing which helped to shape the design of our beta site that you can see here: beta.nottinghamshire.gov.uk.

Our testing hasn’t stopped. We’re collecting user feedback which we’re feeding into future iterations of the site. The site is being developed for the people of Nottinghamshire and this blog post shows how their views are shaping it in terms of usability and testing. Sometimes we think we know the correct label on gut feeling alone. However, as we found out recently, you can’t beat a bit of good old-fashioned consensus and testing.

As a Senior Digital Officer at the Council (and having worked across many different websites) I tend to think that I know what people want or expect to find behind a link, a button or a title on a page.

We were working on the Adoption content and were looking at what you might title the page on the process of adoption. ‘Adoption Process’ was my first instinct. Next I asked my peers what they thought it should be – and from there we formed a clear opinion of what it should be titled. This, however, is not the end of the story.

We have a clear process for these things within the Digital First team and that is why we always test assumptions. Sometimes assumptions on how people will navigate a site can be right, but it is often valuable to remove yourself from the process and find out what real users think. The website is after all being built for the people of Nottinghamshire and not just the Council.

We decided to test out our assumption about the page title through a simple process, but one that was invaluable and has informed some of our user testing and decision making on additional elements of the website content.

We simply wrote down the three choices on Post-it notes and sat in an area of the Council with high footfall. Post-it notes in hand, I asked people under which heading they would expect to find out about the process of adopting. The three choices were:

  • Steps to adoption
  • How to adopt
  • I want to adopt.

Sitting there, I was sure everyone would choose ‘How to adopt’, which of course is the best heading for this type of content. Wrong!

The people surveyed picked the heading that I thought no one would choose: ‘I want to adopt’. Not only this, but they also gave informed reasons, without prompting, as to why they would choose this heading to find the information provided. Distilled, the feedback came down to this: people felt that by this stage they would already have considered all of the information about adoption; now they wanted to physically go through the process, and would expect to find what to do next within this area.

Additionally, and most interestingly, one of the people I surveyed had previous experience of our adoption process, having already adopted a child in Nottinghamshire (I didn’t know this until after I had surveyed them), and they too chose ‘I want to adopt’.

It just goes to show that you can sometimes rely too heavily on your own opinion and experience; it’s the people who use these tools who can surprise you, with the answers they give and how they use what you’ve provided.

Posted by Paul Roper, Senior Digital Officer

First steps in design

We blogged recently about the usability testing we’d undertaken on three design concepts with The Insight Lab detailing the methodology and headline results. Here’s some more detail about the three prototypes we put to the test.

~

An integral part of Digital First involves developing a new nottinghamshire.gov.uk. With this new version of our website we want to design something that looks good, works well across a range of devices and, most importantly, is incredibly simple to use.

A local authority website is a container for a vast amount of information about diverse services which often have little relation to each other aside from the organisation delivering them. With more content and transactions moving to digital services, we set out to design an experience that lets the user concentrate on the task at hand, gives them access to other content when relevant, and gets everything else out of their way.

A website with so many pages needs many interface elements in order to help the user navigate round it all. Things like a main menu, a breadcrumb trail, a list of pages related to the content you’re looking at, a list of pages that are commonly accessed across the site, and many more!

We wanted to ensure our new website had everything the user needed, but remained simple and didn’t become overly complex, so we set about discovering which of these interface elements and design features our users found more useful and which ones were cluttering their experience.

Designing three concepts

We decided to develop three initial concept designs, all of which could be used for the new nottinghamshire.gov.uk, but each different in particular ways.

Each design had similar interface elements, but they would be shown in different places or interacted with in different ways. This was so that we could begin to understand what was so simple to a user it could be used without them having to think about it, what was too complex, or in the worst case scenario, prevented the user from completing their task.

We also wanted to design a site that was engaging, pleasing on the eye and conveyed not just our Digital Design Philosophy, but the character and feel of Nottinghamshire as a county.

We could then test how people interacted with each site, and discover how they used (or didn’t use) certain interface elements to navigate the site and complete the task they had come to do.

Design One – The “Contemporary” design

Screenshot of design concept homepage

This design was influenced by current local authority websites, and built to emulate the type of interface people would expect to see when arriving on a council website.

This included interface elements such as top tasks, search box, a top level navigation menu and positioning content in a traditional, vertically stacked page.

We included additional information about events and news in a big, clear way in an attempt to showcase content the user may not have intended to view when they first arrived, but may find useful (but still doing this without getting in their way!).

We positioned these elements in the lower half of the screen, “below the fold”, so that users could access them if they came across them, but so that they didn’t interfere with the task they had come to complete.

This concept was designed to look colourful, big and bold, with clear, large typography used to group related links.

We saw this design as an evolution of our existing site in terms of look, feel and user experience.

Design Two – The “Modern” design

Design concept screenshot

This design took on a slightly more modern interface and used horizontal columns to showcase different types of content simultaneously.

This design was built to fill the screen of the visitor’s browser, abandoning the traditional fixed-width centre column design and giving visitors on high-resolution or large devices a better experience.

As the screen width decreases, the columns can slide and expand horizontally, similar to how modern smartphone applications (for example Twitter) work, reducing the need for users to scroll down the page to consume content.

The two right hand columns serve secondary and tertiary content across all pages, so no matter where the visitor arrives on the website, they will always have access to over-arching elements.

This was an interesting point with this design – keeping a similarity between content pages and the homepage. We often found council websites had a homepage with lots of engaging and useful content, but that moving onto a specific page would remove those broader elements (such as news and events). And with more and more of our visitors arriving directly on content pages from Google, we wanted to explore the opportunity of showcasing this content from all pages, regardless of where they started their journey.

This design also explored moving the footer to the right hand side of the screen, again, using sliding panels to expose the content. We initially set this element to be a thin border, which would expand when interacted with by the user.

We also used icons to symbolise links to our social media channels, as well as contact details and address information. We wanted to test whether icons alone would be enough of a prompt for users to find information – for example, an envelope to signify ‘contact us’, or the letter ‘i’ to signify information.

Interestingly, icons that represented brands visitors were very familiar with, for example Facebook, worked on their own, but icons we had used exclusively, such as the envelope, did not successfully prompt the user to interact with that element when looking to contact us.

Design Three – The “Clean” design

Design concept homepage screenshot

This design was the simplest of the three. We wanted to explore just how simple we could make our website before it started to have a negative impact on the visitor.

In this design, we removed all additional content from the interface, such as news and events, and focused purely on the task the user had come to complete.

The homepage lists popular tasks and online services, alongside a navigational structure that lets the visitor browse the site from left to right (again using horizontal columns) in a directory structure similar to GOV.UK.

Content pages were also very simple, and only showed visitors content relevant to that section rather than content from broader sections of our website.

This particular design didn’t allow much opportunity to engage with the user in terms of broader Council content, and while we recognised that this was not ideal from a business perspective, it would be useful to see whether this approach significantly improved the user experience for the visitor.

What we found

The findings from the user testing of these design concepts were very interesting, and confirmed some of our assumptions while completely disproving others.

The testing experience provided us with a wealth of data, helping us identify the key interface elements that we would put into the beta design and also helping us get a feel for what level of complexity users were most comfortable with.

The general findings showed us that users found design two too complex. While they did find it useful having access to broader content across all pages, they got lost on the page, spending longer looking around to find the next step in their journey.

Designs one and three scored fairly similarly, although the simplistic nature of design three did require visitors to return to the homepage to start a new task, as there were no links between different types of content.

Subjective questioning also found design three to be too simple, and that people preferred the look and feel of the first design, perhaps because this had a more familiar feel to it compared to the current website.

We also found that when performing a specific task, providing relevant links to other content significantly helped users browse the site and complete further tasks, for example having a link to ‘events in half term’ on the school term time page.

Certain interface elements, such as popular links and related pages were used very frequently, whereas elements such as the search box and main menu were, surprisingly, used much less frequently.

What’s next?

The next step was to take what we had learnt from both the practical experience of designing these concepts and the user testing results to design the first version of our beta site.

This design is almost an amalgamation of the three design concepts, borrowing the strongest (and, as determined in the user testing, most successful) elements from each design and rebuilding them into one design that will go live in our public beta test very soon.

Once this is live, we will continue to monitor user behaviour and iterate the design to continuously improve it, so that we can move closer to the version we replace our current website with later in the year.

The design process is allowing us to discover what really works, very quickly. With constant feedback and iterations, we are able to test ideas and challenge our own assumptions very quickly, helping us get closer to a site the residents of Nottinghamshire find highly usable and (perhaps!) enjoyable too.

Posted by Carl Bembridge, Digital Design Officer

Testing our design concepts

Web cam on screen

We’ve just completed the next step in our journey toward the new nottinghamshire.gov.uk website – usability testing of our initial three design concepts.

The three design prototypes challenged our assumptions about user behaviour and sought to discover which elements really help users complete their visits to our site efficiently and satisfactorily.

Dr. Emily Webber and Dr. Vicky Shipp of The Insight Lab provide this detailed post on how we approached this phase of the build and testing.

~

Why do user testing?

When designing a website it is important to consider the different types of people who will be visiting it, and the type of experience they will have. A really effective way to do this is through user testing, where we can observe people carrying out set tasks on a website or prototype to understand what they do, and what they think. This will help to assess their reactions to the website and anything that they find particularly difficult to use, which will in turn help us to see how it could be improved.

User testing can be carried out on just one design, but the benefit of comparing different designs is that people will be able to notice which features they like or find easiest to use across a range of options, and we can observe which elements work well and those that don’t. This will help to identify the strengths and weaknesses of the different designs so that we can make a more informed decision about future designs.

To help with the development of the new Nottinghamshire County Council (NCC) website, The Insight Lab were asked to run user testing to evaluate three prototype interface designs.

These three in-browser prototypes were created especially for the testing, and included a range of different aesthetic and functional elements to allow for a rich experience, which aimed to mirror the use of a fully functional site. Subtle differences across the designs, such as the location and function of menus, the placement of key pieces of content, and the colours used, would allow us to see how this impacted on people’s performance and experiences and understand which elements allowed for more efficient and effective task completion.

Who did the testing and what did it involve?

Lab-based usability testing

Using the previously identified personas as a starting point, we recruited participants across a range of age groups, locations, and with different levels of computing experience. This ensured that the sample of people who took part in the study, and their feedback and performance, were representative of the actual users of the NCC website.

The individual testing sessions lasted just under an hour and participants were asked to use all three of the interfaces to complete a series of common tasks, such as finding out about school term dates, viewing local events, and reporting a pothole. We made sure that the order in which people completed the tasks and used each interface was randomised so that the results wouldn’t be affected by people becoming familiar with them (we call these practice effects). In this way we could be sure that the results we got were as unbiased and as accurate as possible.
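The post doesn’t say exactly how the orders were randomised, but one simple approach is to give each participant an independently shuffled interface and task order, seeded so each session plan can be reproduced. A sketch of that idea (interface and task names taken from this post):

```python
import random

INTERFACES = ["Contemporary", "Modern", "Clean"]
TASKS = ["School term dates", "Local events", "Report a pothole"]

def session_plan(participant_id, seed_base=2014):
    """Return an independently shuffled interface order and task order
    for one participant, so practice effects don't favour one design.
    Seeding on the participant id makes each plan reproducible."""
    rng = random.Random(seed_base + participant_id)
    interfaces = rng.sample(INTERFACES, len(INTERFACES))
    tasks = rng.sample(TASKS, len(TASKS))
    return interfaces, tasks

interfaces, tasks = session_plan(participant_id=7)
print(interfaces, tasks)
```

With larger studies a balanced Latin square is often used instead of plain shuffling, so that every interface appears in every position equally often.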

To allow us to understand exactly what people did, we used software to record what was happening on the screen – where people were clicking, hovering, and the route they took to complete each task. This software also captured a video of people’s reactions to the tasks via a webcam mounted on the monitor so that we could record when people were particularly confused or confident, and map these reactions to use of particular interface elements. We also recorded a variety of metrics such as how long it took people to complete the tasks, how many times they needed to click, and which navigation elements they used.

At the end of the testing sessions we asked participants to talk about their experiences, identify their preferred elements across the three designs, and fill in a quick survey that measured how usable they found each interface. Capturing this wide variety of data ensured rich and insightful results.
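The survey isn’t named in the post, but a common choice for a quick post-session usability questionnaire is the System Usability Scale (SUS): ten statements answered on a 1–5 agreement scale, scored to a single 0–100 figure. A sketch of the standard SUS scoring, assuming that’s the kind of survey used here:

```python
def sus_score(responses):
    """Score one completed System Usability Scale questionnaire.
    `responses` is a list of ten answers on a 1-5 agreement scale;
    odd-numbered items are positively worded, even-numbered negatively."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even index = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5  # scale the 0-40 raw total to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible answers -> 100.0
```

A score of around 68 is usually quoted as average, which is one way to read the “above average usability” finding mentioned later in this post.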

Making sense of the data

The sessions generated a lot of data, and we needed to spend time analysing it to make sense of it. This is one of the most enjoyable parts of our job as we begin to explore and uncover meaningful insights.

In this scenario we looked at the average task completion times across the interfaces, the usability score of each interface, and the different ways that people completed the tasks.

This not only allowed us to see which design performed the best overall, but also the different design components that people used and/or struggled with and any tasks that were more challenging than they should be.

What did we find out?

Usability testing

Through our analysis we found that although all three interface designs were rated as having above-average usability, opinions about the preferred interface were varied. We were able to identify the elements that were frequently used and allowed for efficient task completion, or those that were particularly well received, and these should remain in the future design.

However, it does not mean that other interface elements should not be included, as these were still important to some of the users. What is key is combining these in ways that do not inhibit or distract other site visitors.

Our findings have produced a set of recommendations that NCC can use to ensure that future designs meet the various needs of people using the website. As the aim of the testing was not to choose one interface, but to understand the way people used different features of the designs, future designs are likely to involve incorporating aesthetic and navigational elements from each of the three prototypes.

The Insight Lab is an expert-led consultancy, implementing user-focused research methods to drive the design of digital products and services that are simple, efficient and a pleasure to use. Find out more about them on their website.

Personas – representing our customers

Sample persona - Beryl Cumberland

We’ve added a new tool to our user-centred design toolkit in the last couple of weeks: personas.

Personas offer a way to realistically represent our key customer groups online and can be used to help make informed decisions on design.

They’re a good way of keeping the customer and their needs in mind as we build our digital services, although they don’t replace contact with real people to research and test what we build.

We developed our set as part of the work we did with The Insight Lab (read their post for us on open card-sorting here); the personas represent the major primary, secondary and tertiary user groups across nottinghamshire.gov.uk as a whole.

In order to develop them we used data about current usage, contact through other channels and experience from service areas. Through a workshop we got relevant colleagues together and created a huge set of potential personas before distilling these down to a smaller set by combining characteristics and looking for shared needs or themes.

There’s some great background reading about personas and how to create them on Usability.gov.

While a small set for the site overall is useful, we may develop additional personas as we build our digital services.

You can see and download an example from our persona set here (PDF).

Posted by Sarah Lay, Senior Digital Officer

Card sorting: working out how to navigate our services online

Card sorting

One of the challenges we’re tackling as we build a new user-centred nottinghamshire.gov.uk is how to organise access to, and information about, our services.

We’re doing research on how people find their way to our website, if or how they move around it once they’re there, and how we can make our navigation intuitive to support their behaviour. We’ve delved into taxonomies and are investigating both on-site and external search. We’ve used data, analytics and undertaken user research too.

One of the activities we’ve carried out as part of this discovery is holding open card-sorting workshops with a range of our users. We contracted The Insight Lab to carry out this work with us and below their Head of Research, Dr Emily Webber, reveals the why, what and how of card sort workshops.

~

When The Insight Lab first connected with Nottinghamshire County Council, the Council was looking for a user experience consultancy to run a series of card-sorting workshops to help inform the re-development of the content structure of nottinghamshire.gov.uk, as part of its Digital First project.

Card-sorting is a simple but incredibly effective way of obtaining valuable insight into the ways in which different types of users group information and content. It informs the design of an information architecture (IA) in which information is structured intuitively, meaning that users can quickly and easily find what they are looking for.

cards on table

There are a variety of card-sorting techniques which can be used for this purpose, but for this project we decided on an open card-sort approach. This requires users to sort cards containing website content into groups that make sense to them, and give each group a title that summarises the cards that sit within it. Findings from this process then feed into further research and validation methods, and form a great foundation of evidence for a user-centred site structure which meets the expectations of those using it.

We were really excited to take on this project particularly due to some of the unique challenges that it presented, such as the diverse range of users that the website must cater for, and the large and varied amounts of content to be presented. Participants for the workshops were therefore recruited from across the county (we ran workshops across Nottinghamshire from Worksop to West Bridgford) and came from a range of backgrounds and levels of computer experience. Content for the card sort was carefully selected to reflect the varied types of information and plethora of services available.

Following an audit of current content and consideration of existing documentation, such as priorities and key user journeys, 61 cards formed the basis of the card sort. During the workshop, each participant first sorted these cards (which had a title printed on one side, and a description of the content on the other) into groups that made sense to them, and then gave each group a heading using a Post-it note. For example, a participant may have grouped cards such as ‘Studying’ and ‘Apply for a School Place’ under a heading which they titled ‘Education’ or ‘Schools’.

Participants were also encouraged to indicate any sub-groupings, as well as any cards they felt fell into more than one category (more Post-its!) – for example, ‘Report a Pothole’ may have been grouped under a transport heading, but then also linked to a ‘Report a problem’ group.
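For readers curious about the analysis side, a standard way to work with open card-sort data (not necessarily the exact method used here) is a co-occurrence matrix: counting how often each pair of cards was placed in the same group across participants. A small sketch, with illustrative card names:

```python
from itertools import combinations
from collections import Counter

def co_occurrence(sorts):
    """Count, across all participants, how often each pair of cards
    landed in the same group - high counts suggest cards that users
    expect to find together in the site structure."""
    pairs = Counter()
    for groups in sorts:               # one participant's card sort
        for group in groups.values():  # heading -> list of cards
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs

# Illustrative sorts from two participants
sorts = [
    {"Education": ["Studying", "Apply for a school place"],
     "Roads": ["Report a pothole"]},
    {"Schools": ["Apply for a school place", "Studying", "Term dates"],
     "Transport": ["Report a pothole"]},
]
pairs = co_occurrence(sorts)
print(pairs[("Apply for a school place", "Studying")])  # grouped together by both participants
```

Pairs with high counts become candidates for sitting under the same heading in the IA; the matrix can also feed cluster analysis or a dendrogram for larger card sets.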

Although this data alone provides valuable insight into a user-focused IA, we wanted to provide further support for the findings using rich, in-depth feedback from participants. Following the individual card sorts, participants were therefore led in an open discussion, exploring points such as what they had found particularly difficult or easy to sort, cards which they felt were missing, and issues with labelling and understanding.

Card layout

Results from the card sort were then supplemented with points arising from the post-sort discussion to provide rich insight and outline actionable recommendations for the creation of a user-focused site IA.

The card-sorting workshops have proved an invaluable exercise in gaining insight into how Nottinghamshire residents perceive Council services, and how they understand and group content. The results will provide an excellent base for future work into the re-development of the Council website and its underlying information architecture.

Research methods including closed-card sorts and tree testing could be used to provide additional insight to support and extend the findings of these initial workshops, with results from all sources then feeding into a new user-focused Council website, where visitors can quickly and easily find the information they are looking for.

Dr Emily Webber is Head of Research at The Insight Lab, an expert-led consultancy, implementing user-focused research methods to drive the design of digital products and services that are simple, efficient and a pleasure to use. Find out more about them on their website.

~

We’re looking at what our next step is now to design a clear information architecture for our website and we’ll update you on this as we do it!

Thanks to all who took part in the workshops. If you’d like to get involved with testing as part of our Digital First work then you can find out more here.