As the new year begins, I’ve been reflecting on the exciting changes 2019 will bring to both our physical and online spaces. Projects that previously felt distant or nebulous to some of us, like the creation of a new library website or the move to Charles Library, are now fast approaching. As we consider how to design and assess the user experience of our changing web presence and physical spaces, much of what I learned at last month’s Library Assessment Conference feels well-timed.
The sessions on user experience offered practical insights on methods and tools that can bring user feedback into digital projects like the website redesign and Library Search. Amy Deschenes (Harvard University) described her library’s website redesign project from start to finish, including early discovery research (user interviews) and the creation of personas that informed decisions throughout the design process. Her explanation of integrating UX into agile sprints with technology staff felt particularly germane to our own technology development environment. Zoe Chao (Penn State University) made a persuasive case for using comparative analysis to test website navigational structures, and Andrew Darby and Kineret Ben-Knaan (University of Miami) shared online usability tools that make it fast and easy to conduct remote testing of webpage prototypes and site architecture with large numbers of participants. In the pre-conference workshop I attended, Kim Duckett and Joan Lippincott shared strategies and examples of post-occupancy space assessments for understanding how effectively spaces such as learning commons and digital technology spaces serve students.
The presentation I keep coming back to, though, is one that questioned the emphasis on “practical” conference takeaways and argued for a more critically reflective approach to assessment and user experience work. In “A Consideration of Power Structures (and the Tension They Create) in Library Assessment Activities,” presenters Ebony Magnus (Southern Alberta Institute of Technology), Jackie Belanger, and Maggie Faber (University of Washington) highlighted the pervasiveness of the “practical” in librarianship, pointing to how often we create “how to’s” and “best practices” intended to make work easier and more efficient. In the library assessment world, the focus is often on specific tools and research methods. The presenters did not argue that learning the practical (best practices, tools, and methods) has no value, but rather that we should examine our assessment approaches through a “lens focused on power and equity.” Doing so allows us to interrogate how our own positions of power shape the assessment work we do.
We might, for instance, look critically at how we prioritize projects, the questions we ask, how we recruit participants, the research methods we use, and so on. During the presentation, I reflected on how we recruit participants for impromptu or “guerrilla style” usability testing of our website. The advantage of this type of recruitment is that it’s easy and convenient, but it misses feedback from populations who do not physically visit the library. Questioning our assessment approaches in this way can guide us towards a more inclusive assessment practice that benefits all users, particularly traditionally underrepresented groups. In a related blog post, the presenters outline additional strategies for incorporating more inclusive methods into assessment practice.
My favorite conference experiences are those where what I learn feels useful. I want to leave a conference with practical ideas that are easily adaptable to my own institutional context. I’m so eager to attend sessions that promise concrete takeaways that I skim program schedules for phrases like “attendees will learn methods, tools, or strategies to use at their own institution.” The Library Assessment Conference didn’t disappoint in this regard.
The website redesign and the move to Charles Library will significantly impact library users and library staff, and now is the time to plan how we will design and assess the user experience of both. I hope to see us engage with critical approaches as well, because those are the ones that will allow us to work towards the best user experience across all of our student populations.
Finally, I encourage everyone to have a look at Magnus, Belanger, and Faber’s beautifully written post, “Towards a Critical Assessment Practice,” in In the Library with the Lead Pipe, which provides more detail about critical assessment practices.