
Usability and UX at the University of Liverpool

Jeff Woods, University of Liverpool Library

We will be looking at:
• Why did we do it?
• How did we do it?
• What did we learn?
• Where to next?

Usability study

Presentation Notes
INTRO – why did we do it, how did we do it, what did we learn, where to next.

Began early 2015

Cross-Library project group formed:
• Academic Liaison
• Content, Collections and Delivery
• Customer Services
• Systems

Initial focus on EDS, branded locally as ‘DISCOVER’

Usability study

Presentation Notes
In 2015 a cross-Library project group was assembled to embark upon a large-scale usability study of the Library's main resource discovery platforms, bringing together representatives from the Library's Academic Liaison; Content, Collections and Delivery; Customer Services and Systems teams. This allowed for input and expertise from each of these areas, but also provided a means of feeding back progress to their respective sections and soliciting opinion and consensus.

The main focus of the study was on our EBSCO Discovery Service, branded locally as ‘DISCOVER’, although we would also be incorporating and exploring aspects of Catalogue use. Originally introduced in September 2010, it sits alongside the Library catalogue both on the Library web pages and on the on-site ‘InfoPoints’. Although some features and functionality had been added over time, little had been done to the look and layout of the interface itself, so this was the first time we had attempted a systematic review, at least on the scale planned.

It was certainly being used: statistics showed year-on-year real-terms increases in sessions and full-text views. But the statistics alone only tell us so much. What they don't really tell you is how and to what extent our users were engaging with the platform, what they were using it for, or how easy they found it to use. Was use growing because they loved it and it met all their needs, or because they found it so difficult to use that they needed to perform multiple searches to locate the content they wanted?!

“to make informed, evidence-based changes to our main resource discovery platforms, improving their usability and effectiveness, and therefore the Library service and student experience, in line with objectives identified in the Library's strategic plan”

Why?

We wanted to find out:
• How and to what extent our users were engaging with these platforms
• How easy, efficient and effective they found it to locate and access content
• If it met their information needs
• What they liked and disliked

Presentation Notes
The aim of the project, then, was to gain a better understanding of user engagement. We wanted to find out what users liked and disliked about the platform, where the pain points and barriers lay, and to assess the extent to which it currently met their information needs. This in turn would allow us to make informed, evidence-based changes to the interface, improving usability and making it a more intuitive, effective and efficient resource. Doing so would ultimately result in an improved Library service and an enhanced student experience, in line with a number of objectives identified in the Library's Strategic Plan. We followed this in 2016 with studies focusing on library catalogue and e-book usability.

3-stage approach:
• Survey
• Usability test sessions
• Focus group discussions

Philip, M. (2010). Do students want a one-stop shop to help them navigate their way around the maze of library resources? A usability study looking at the beta version of Summon, the new library search engine at the University of Huddersfield. M.A. University of Sheffield. http://eprints.hud.ac.uk/9824/

McManamon, C. and Smith, S. (2014). Library Search: survey, usability study and focus group report. Manchester Metropolitan University. https://www.academia.edu/28313928/Library_Search_survey_usability_study_and_focus_group_report

How?

Presentation Notes
How: borrowing from similar, earlier studies undertaken here at Huddersfield in 2010 and at Manchester Metropolitan University in 2014, we employed a 3-stage approach: an initial survey, followed by usability test sessions and focus group discussions.

How? 1. Survey

Survey:
• QuickTap survey app
• Library staff with iPads roaming library social areas
• Pop-up Library event
• Online version (“soft launch”)

Presentation Notes
Survey: the quickest means of capturing key information on particular aspects of our users' information-seeking behaviour from a relatively large sample of the student population. The survey itself was created using QuickTap software, with responses collected on iPads by staff mobilised to roam the various social areas of the university's two main libraries between 11 am and 3 pm each weekday over a two-week period. We also made use of a ‘pop-up Library’ event that had been scheduled during the time the survey was live.

An online version was also created, not only to help boost participation but also as a means of capturing responses from elements of the student population who rarely, if ever, used the physical library. This was a soft launch: we were wary of survey fatigue (NSS), and there was a tangible reluctance at the institutional level to allow use of staff and student mailing lists. Instead, the online version was promoted via the Library's news blog and popular social media accounts.

How? 1. Survey

Analysis:
• 719 responses (1 every 3 minutes!)
• Healthy representation of the wider student population
• Free-text comments coded and categorised
• Excel-based dashboard
• Post-survey staff de-briefing session

Presentation Notes
Analysis: 719 responses (671 on site, 34 through the online version), roughly one for every three minutes of time spent actively collecting, though the online version fared less well. The sample gave a relatively healthy representation of the wider university student population in terms of user group, discipline and year of study. An Excel-based dashboard was created to aid analysis of the responses, allowing them to be broken down or limited to particular subject areas, user groups, years of study and/or any other section of the sample population as defined by their response to a particular question or questions. Post-survey staff de-briefing sessions were also scheduled.
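
For a concrete picture of what the dashboard's slice-and-filter analysis involves, here is a minimal sketch in pandas. This is an illustrative reconstruction only, not the project's actual Excel tooling: the file name and the columns (user_group, discipline, year_of_study, q_ease) are hypothetical.

```python
# Illustrative sketch only: a pandas equivalent of the Excel dashboard's
# "break down responses by any section of the sample" feature.
# The CSV file and column names are hypothetical, not from the actual study.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical export

# Limit the sample to one user group, as the dashboard allowed.
undergrads = responses[responses["user_group"] == "Undergraduate"]

# Cross-tabulate a rating question by discipline and year of study.
breakdown = (
    undergrads
    .groupby(["discipline", "year_of_study"])["q_ease"]
    .value_counts(normalize=True)   # proportions rather than raw counts
    .unstack(fill_value=0)
)
print(breakdown)
```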

How? 2. Usability sessions

Semi-structured search tasks:
• Task 1 – researching a topic
• Task 2 – searching for specific, known items
• Task 3 – researching a topic using an alternate version of ‘DISCOVER’
• 5 sessions, 20 participants in total

Presentation Notes
Usability sessions: the test sessions themselves involved three semi-structured search tasks, covering both topic-based and known-item searching of journal titles, articles, and both print and electronic book titles using the current set-up. The final task used an alternate, test version of DISCOVER being developed by members of the Library's Systems and CCD teams in conjunction with EBSCO support staff. Five sessions were scheduled (including an initial pilot session), with 20 participants taking part in total.

During the sessions the participants were asked to ‘think aloud’, providing a running commentary on their thought processes as they performed each task. In particular, we wanted to know:
• what they were doing or trying to do
• what they were looking at or for
• why they were using a particular feature or function
• what they found confusing or frustrating, or what was not doing what they thought it would or should

Headsets with microphones were provided, and we used CamStudio screen/audio recording software (available on the University network) to capture both the on-screen activity and the accompanying audio narrative. Each task was timed to last approximately 10 minutes; this was purely a consequence of limitations experienced with CamStudio rather than anything else.

We used one of the Library's larger training rooms, which gave us enough space between each participant so that they wouldn't feel too self-conscious about the commentaries they were providing. In framing the sessions, we stressed that there were no right or wrong answers and emphasised that it was the website that was being tested, not the participants themselves. We also asked the participants to try to perform the tasks in a manner as naturalistic and representative of their usual information-seeking behaviour as possible.

How? 2. Usability sessions

Observation checklist:
• Systematically recorded the occurrence of particular, pre-defined search techniques and the use of specific features, facets and functionality
• Anything else of interest

Presentation Notes
To help promote and preserve this sense of realism, other than framing the sessions and being on hand in case of any problems, we didn't observe what the participants were doing or interact with them as they went about each task. Instead, the project team would later analyse the recordings using an observation checklist, systematically recording the occurrence of particular, pre-defined search behaviours and techniques and the use of specific facets and functionality, and noting anything else they thought would be of interest.
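
As a rough illustration of what tallying a checklist of pre-defined codes amounts to, here is a minimal sketch assuming a hypothetical events.csv of (participant, behaviour) rows logged while reviewing the recordings; the behaviour codes shown are invented examples, not the project's actual checklist.

```python
# Minimal sketch: tallying pre-defined behaviour codes across recordings.
# The codes and the events.csv file are hypothetical examples, not the
# project's actual checklist.
import csv
from collections import Counter

CODES = {"keyword_search", "phrase_search", "facet_use",
         "advanced_search", "abandoned_search"}

tally = Counter()
with open("events.csv", newline="") as f:   # columns: participant,behaviour
    for row in csv.DictReader(f):
        if row["behaviour"] in CODES:       # count only pre-defined codes
            tally[row["behaviour"]] += 1

for behaviour, count in tally.most_common():
    print(f"{behaviour}: {count}")
```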

How? 3. Focus Groups

• Followed usability sessions
• Participants asked to reflect upon their typical information-seeking behaviour both during the test sessions and in a wider, everyday context
• PGR facilitators

Presentation Notes
Focus groups: focus group discussions were scheduled to immediately follow the test sessions, basically a case of gathering participants together around a table as soon as the final task was completed. In the discussions themselves, participants were asked to reflect upon their typical information-seeking behaviour and their experiences of using DISCOVER, both during the test sessions and in a wider, everyday context. Postgraduate research students with previous experience of facilitating focus groups led the discussions, with one or two members of the project group also sitting in to explore individual responses in more detail, ask for clarification on particular aspects or issues, and build a more comprehensive picture. Again, the audio of the discussions was recorded using an iPad, then transcribed and analysed.

How? OPAC review

• Experimented with different approaches to the survey:
  • Twitter poll
  • “Coffee and Chat” format
  • Link to survey from OPAC
• Focus group discussions but no usability test sessions

Presentation Notes
OPAC: for the OPAC survey, we experimented with different approaches. An informal initial Twitter poll preceded the main survey: a single question to get an impression of how users perceived the OPAC. Was it Essential, Useful, Useless or Unnecessary? We also hoped this might be a way of priming users for the survey that followed.

We appropriated the ‘Coffee and Chat’ format that the Customer Services section had been using as part of their own UX work, again utilising the social areas in each Library, with free tea and coffee (with biscuits) in exchange for survey responses. Again, QuickTap software and iPads were used. Four sessions were scheduled, two in each library, over a two-week period. We used posters, digital displays and social media to promote these events, with tweets during the events themselves. An online version was created, with a banner linking to the survey on the OPAC.

We decided against usability sessions. It is an ageing system and we were extremely limited in terms of the changes we could actually make: subtle rather than substantial, and certainly not enough to present users with an alternative version and expect them to notice. We were also wary of raising expectations of what we could do.

What?

Lessons learned:
• Survey approaches
• Recruitment woes
• Focus group facilitation
• Challenging assumptions – branding & awareness and the need for an open mind
• Technical knowledge within the team

Presentation Notes
Reflection: lessons were learnt about what worked well, which we have taken forward and shared with staff involved in other aspects of UX work within the Library.

The informal approach to the collection of survey responses, handing users the iPad to fill in themselves, made them more likely to participate and boosted the response rate, with staff remaining on hand to help or clarify if anyone was confused. The post-survey staff de-briefing sessions were a good way of feeding back the survey findings to the staff involved in collecting responses, but also served as a forum for identifying particular problems encountered and determining future best practice. For example, approaching groups of students proved to be a particularly effective strategy: get one to fill it in and they all would. The de-briefings also helped identify ambiguities with one question, which we later used the focus group sessions to investigate further and clarify; a more robust cognitive interviewing process might have flagged this up when we were initially piloting the survey. The Coffee and Chat format was much less staff-intensive and worked well, so much so that we repeated the approach during the recent LibQUAL+ survey.

Recruiting participants for the test sessions and focus groups was (and has continued to be) a real struggle despite employing a variety of approaches. I mean, we tried!
• Financial reward
• Used the survey to capture expressions of interest in taking further part in the study, and followed these up
• Mobilised the Library's popular social media accounts
• E-mailed members of the newly-formed Student-Library Partnership group
• Left flyers on study spaces within the Library and used digital displays
• In desperation, resorted to directly approaching/soliciting library users
All very ad hoc, and certainly not providing the ideal random stratified sample, but a case of beggars not being choosers.

As mentioned earlier, we decided to employ research postgraduate students with previous experience of facilitating focus groups to lead the group discussions, not just for their experience but also for their impartiality. We hoped that the participants would relate to them better, feel more comfortable and open up more.

Feedback from the survey and focus group discussions challenged a fundamental assumption we were guilty of taking into the project: that all students knew what DISCOVER was because of its prominence on the homepage. In fact, the branding had passed many by, and users were often more influenced by their tutors' recommendations of what to use than by the fact that we had DISCOVER front and centre on the home page. This pressed home the need to approach future studies with an open mind.

The project also afforded a good opportunity to fully explore the ‘out of the box’ customisation options, simply by taking the time to experiment and switching different settings around to see what the effect would be. However, some of the things we were looking to implement went beyond what could be done here alone. We were lucky enough to have staff with high levels of technical expertise on hand to liaise with the supplier and clearly articulate what it was that we were trying to achieve. For their part, EBSCO were willing to work with us to develop enhancements that would address the issues raised.

Where to next?

• Need for iterative review
• Responsive to the developing and changing needs of users
• Applying ethnographic approaches to UX
  • different user groups
  • online content and spaces
• Determine and develop methodology
  • diaries
  • cognitive mapping
  • semi-structured interviews

Presentation Notes
Features and functionality had been added to the DISCOVER interface over time, for reasons that had been justified at the time. The 2015 review led to the reconfiguration and removal of certain features, because the needs of our users had changed and developed. Students' needs in three years will have nuances that set them apart from students' needs now. This demonstrates that system reviews need to be iterative: in the future we will work towards a fairly regular process of review, continually seeking feedback about our systems and user needs, so that we can be responsive to the changing needs of our students and perhaps even anticipate them. Things we are still waiting for: single sign-on; incorporation of library records.

More information:

Woods, J. (2014). Discover: Survey, Usability Testing and Focus Group Report. University of Liverpool. http://livrepository.liverpool.ac.uk/3003105/

Woods, J., Gillespie, E. and McManamon, C. (2016). Discovering discovery: lessons learnt from a usability study at the University of Liverpool. Insights, 29(3), pp. 258–265. DOI: http://doi.org/10.1629/uksg.320

IMAGES: https://unsplash.com/

Jeremy Thomas: https://unsplash.com/photos/E0AHdsENmDg
Jesse Sewell: https://unsplash.com/photos/q75_AMCgsZU
Greg Rakozy: https://unsplash.com/photos/oMpAz-DN-9I
Levi Price: https://unsplash.com/photos/8FwiZcXiX_g
NASA: https://unsplash.com/photos/NuE8Nu3otjo
Rodion Kutsaev: https://unsplash.com/photos/OQ0zP6AS2DI
NASA: https://unsplash.com/photos/Yj1M5riCKk4
Jeremy Thomas: https://unsplash.com/photos/kFy1Aip0eEo

Presentation Notes
Where to next: the changes implemented on the back of these reviews reflected the needs of our users at the time. No doubt these needs will continue to evolve and change over time, with nuances in a year or two that will set them apart from the needs of the user of today. This demonstrated to us that there is a definite need for these reviews to be iterative, and that we should aim to work towards a fairly regular process of review, continually seeking feedback about our systems and our users' needs. Doing so will mean that we can at least be responsive to, if not anticipate, those needs as they change and as new developments and technologies become available.

At Liverpool we've recently taken the decision to remain with our current LMS and EDS for the next couple of years at least, so there will be challenges in terms of the limitations of where we can actually go with the systems themselves and the extent to which we can overhaul or reconfigure their interfaces. That said, there are options available to explore: Encore (Catalogue interface), linked data to push OPAC holdings to Google, and integration of Catalogue functionality into EDS.

In the usability work to date, we've adopted quite a structured approach, in that we were specifically asking participants to use DISCOVER to research this or the Catalogue to find that. The next stage is to use ethnographic approaches to take a step back and gain a better understanding of the wider user journey and experience, rather than focusing on the usability of the interfaces themselves: to better understand the journeys our users take when locating and accessing online content, how they navigate through our online spaces, and how this might differ across our different user groups. All of this will ultimately inform our service delivery and the resourcing of future developments in content discovery. We are still in the planning stages, and are considering using user diaries, cognitive mapping and semi-structured interviewing. In this respect it has been interesting to see how other institutions have approached their studies and the different techniques being used.

Any questions..?