What Counts

To count (verb): to tally, to add up, to total, to recite numerals in ascending order

To count (verb): to matter, to be considered, to be included, to have importance

I have posted multiple times in this space about metrics – the challenges and the seemingly arbitrary decisions we make when quantifying the libraries’ work for surveys like ARL, IPEDS, AAHSL, and ACRL. Given my ambivalent feelings about counting, I appreciated a recent Curious Minds interview with Deborah Stone about her book, Counting. A social scientist, Stone reminds us:

There is no such thing as a raw number.  At least in human affairs. Every number is the result of a decision about what is important; what is worth paying attention to.

Stone’s book is an exploration of the ways this plays out in the social world, with vivid examples of how seemingly precise figures for “unemployment” and “ethnicity” are replete with value judgments, arbitrary decisions, and historically rooted assumptions about what is countable and how.

Counting forces us to classify things, to categorize them. Because being countable is a value statement, counting is a way to exert power. We are familiar with how this plays out in voting and in census-taking. To be “accountable” is to take responsibility for counting that is fair and honest.

Libraries are big counters, and hence classifiers, asking “Is this a reference question or a directional question?” The former counts, the latter does not, at least as far as NISO (our information standards organization) has dictated. Asking a library staff person an “informational” question counts, but receiving help finding a book on the shelf or assistance placing an item on reserve does not count as reference. Reference transactions, that special kind of service, are valued in a different way.

Yet isn’t the service of helping a patron access resources on their own at least as valuable as the transaction that provides the answer? It seems that helping a community user log on to a computer in order to apply for a job should count for something. Our standard surveys don’t ask about those transactions. What patrons ask of library staff is changing rapidly, as are the skills required to provide those services.

The challenge is finding measures that truly gauge the value we provide to our communities. These metrics need to be applicable over time, relevant to libraries of many types, and measurable with systems (or, less reliably, people) that apply them accurately and consistently.

Stone’s final words remind us of the dangers of equating numbers with facts:

When we decide what to count, we frame an issue as surely as the painter composes a scene. Our numbers embody the concerns, priorities, and values that guide us as we decide who or what belongs in the categories we’re counting …We should count as if we’ll soon be infected by our own numbers. For in the end, what numbers do to others, they do to us as well. (Stone, D. Counting. New York: W.W. Norton, 2020) 


The Ways of the Teacher, Leader, and Assessment Practitioner

 

Ways of the teacher

This summer I am teaching a class in leadership at Drexel’s College of Computing and Informatics, a required course in their master’s program for Library and Information Science. What a thrill! To be in the role of instructor, helping soon-to-be information professionals as they contribute in new ways to their organizations – it’s an exciting challenge, but also a scary one. As that class wraps up, I am reflecting on my own practice as a teacher, leader, and practitioner of assessment. It turns out they connect in some unexpected ways.

The reflection on teaching was encouraged as part of the perfectly timed Teaching Challenge designed and hosted by the Learning & Student Success Strategic Steering team here at Temple Libraries. It was so beneficial to join a cohort of other librarians engaged with instruction; it helped me feel less isolated in my role as adjunct, and I had much to learn from my supportive, more experienced colleagues.

The challenge encouraged me to ask hard questions: “What’s important to me as an instructor? What is my teaching philosophy?” It solidified for me the idea that I didn’t want to be the “sage on the stage.” My desired role was to facilitate a learning experience in which my students and I learned from one another.

But I also needed to consider the students’ expectations for the class. What do they need from me as an instructor? How do I balance their needs and my responsibilities to them? How best do I assess their learning needs in this asynchronous, online environment? How do I connect to them in a way that feels authentic?

I may have had more questions than answers, but these are things I’ve learned about teaching so far:

  • Take time to reflect on your practice.
  • Be prepared to learn and be prepared to change. Be humble.
  • Be respectful of others, their voices, and their experiences.
  • Be available to make changes based on students’ feedback, but remember that you are the instructor – you are being paid to make hard decisions about course policies and procedures.
  • While students are responsible for their own learning, you have the responsibility to facilitate and foster that learning.

Ways of the leader

The students’ favorite part of the class was a series of “practitioner” interviews I conducted with colleagues in many different roles, including deans and directors of research libraries, public libraries, IT, and organizational performance. I asked them to define management versus leadership and to describe the opportunities and challenges they experience in their practice as managers and leaders.

From these interviews, I learned some things too:

  • Good managers and leaders see the value of self-reflection and knowing yourself – your strengths and your challenges (and, of course, the willingness to work at those challenges).
  • Managers and leaders have a passionate desire to learn continuously.
  • Good managers are good listeners, cultivating and coaching their staff, oftentimes acknowledging strengths that were not recognized before.
  • Good managers are able to think outside their managerial “domain” to consider the needs and goals of the organization (and the organization’s parent institution) in a holistic way.
  • Good leaders are able to see a bigger, longer-term picture. Metaphors for the leader’s view were expressed in terms of height, distance, and time: the “30,000-foot view”, the “long view”, the “5–10 year vision”.

Connecting to assessment practice

These activities surfaced for me many parallels between teaching and leadership. And of course good assessment practice incorporates many of the same maxims.  (Maybe these are life strategies as well,  but that’s a different blog!)

  • Curiosity: Always be asking questions. Never assume that the current way is the only way.
  • Learning: Always be seeking to improve.
  • Engagement: Always be curious about and engaged with the user experience. Users are a large part of what we’re about.
  • Self-reflection: Always be aware of your own biases. Be willing to listen to diverse voices. That diversity makes us stronger in thinking about solutions.
  • Vision: Be patient. Cultivating a culture of assessment takes time. Take the long view.

Reflecting on these connections provides me with a renewed sense of purpose as we begin the academic year. Building a culture of assessment here at Temple Libraries/Press isn’t just about ensuring we count reference transactions the same way. The practice is also one of teaching and leadership as we work with the organization, helping it grow and support the University community in new ways.


Assessment in Uncertain Times

We assessment professionals are moving forward through uncertain territory as we adjust to new realities on campuses and communities and in our libraries. I had the privilege of talking with my colleagues about these issues by convening the ACRL Assessment Discussion Group last week as part of the online ALA annual conference.

One of the advantages of the virtual discussion, other than hosting from the comfort of my kitchen, was participation across the gamut of geographies and library types. Attendance was triple that of the average in-person meeting. But there are new kinds of risks: technology failures, time zone confusion, and the challenge of facilitating a participatory discussion with a large group.

I set the stage with these prompts:

The last year has occupied us all with the many changes needed to ensure ongoing library operations during the COVID pandemic.

  • How are libraries best adapting to these new campus realities, which require us to reconsider how we provide services and access to resources and how we manage our physical spaces?
  • What new assessment opportunities does this provide us?
  • Have circumstances changed our planned assessment projects in terms of method, access or questions to be asked? 
  • Alternatively, has our assessment program had to take a temporary back seat as we deal with other, more immediate concerns?

In six small groups we first shared with one another the current state of affairs at our libraries. As expected, this continues to be a mix, with varying degrees of open or reduced hours, service models for delivery of physical materials, and work-from-home policies for staff. Some libraries continue to enforce a masking policy; others do not. In many cases, mandates against masking policies are set externally, often by state or local governments. Many libraries have removed signage, social distancing policies, and physical barriers (like Plexiglass) at service desks. Others continue to provide more limited seating or reservation-only study space, although most libraries enforcing restricted access anticipate fewer restrictions in the fall.

Library staff widely express concern that any expectation of a “return to normal” in the fall is unrealistic. Perhaps at the extreme end of these concerns is the worry that libraries have “lost” a generation of students — many students have come to rely on services provided remotely rather than the physical spaces and in-person services provided by the library.

We discussed how the pandemic impacted our organizations and staff. In addition to changes in remote work policies and hiring freezes brought on by budgetary uncertainty, we reported an increase in early retirements and resignations.  

Of course, the pandemic has had a significant impact on transaction numbers, sometimes in surprising ways. For instance, not all libraries are seeing the increase in use of electronic resources they might have expected.  While it’s important that we don’t assume all anomalies to be caused by the pandemic, many of us have questions as to how these dips will be explained in our trend analyses — how surveys like ACRL and ARL will take this exceptional period into account. Jeannette Pierce, on the editorial board of the ACRL Academic Library Trends and Statistics survey, filled us in on adjustments made to that instrument. 

These transitions also provide us with opportunities and challenges.

The pandemic has changed some approaches we use to conduct assessment. At the same time that we want to know more about student satisfaction with pandemic-driven changes, we have observed “survey fatigue” in both students and faculty. Our current efforts may need to be focused more on outreach and communication than “true”  assessment. We may be seeking user feedback in more ad hoc ways. In practice, Zoom has proved useful for user testing. 

The pandemic prompted many libraries to employ counters for measuring space occupancy. Software used for this purpose includes:

  • SMS Store Traffic
  • SenSource SafeSpace
  • Occuspace 

These tools also allow for public dashboards that are useful to students in locating available study space. It was noted that requiring students to reserve space in advance was a barrier for some, in this case commuter students, who think of the library as a community space, a place to “be” in between classes.

We are closely evaluating our collections, considering the impact of shifting dollars spent on print collections into electronic formats. In some cases this has led to a push for decreasing our print collection footprint. As noted above, the expected increase in usage of those e-formats has not happened across the board — is this, perhaps, a consequence of “electronic interaction burnout”?

The pandemic has provided multiple opportunities for us to assess the effectiveness of online instruction as well as the use of online research guides. We are monitoring the use of social media and the website, but recognize the limitations of these counts as measures of effectiveness.

We concluded on a positive note: In many ways, the pandemic allowed libraries to “show their stuff” —  re-working service models to accommodate safety protocols in agile ways, rapidly transitioning to electronic course reserves, digitizing special collections materials at a new pace. As we assess user acceptance of these changes, many adaptations will carry over into our regular work when we return to a “more like normal” workspace in the fall. 


LibGuide Assessment from the Ground Up

Librarian Rick Lezenby authors many LibGuides. In this guest post, Rick shares some insights about assessment and the value of listening to users as we collaborate on tools that support their instruction.

LibGuides at Temple Libraries are guides to library resources and related information skills, built on Springshare’s web-authoring platform. They are used mainly as subject guides introducing degree programs and as guides created for specific courses. After a number of years of a laissez-faire approach to their look and use, the Libraries in 2017 developed detailed standards for a uniform look and purpose, based on usability testing conducted at Temple University Libraries and other institutions. Guides also go through a review process to avoid duplication with similar guides, plus a checklist review of required usability and format standards once a guide is submitted for publication. Beyond that, the content of guides continues to be left to the discretion of subject-specialist librarians.

We in the Libraries have not yet developed a good way to assess users’ satisfaction with these guides, and getting detailed feedback has been hard. For years, I have been creating LibGuides for subjects, topics, and courses with little feedback from users or faculty beyond “Thanks!” or “Great!” when asked directly. The daily hit counts provided by Springshare do indicate how much a guide is accessed and how many of its sub-pages are viewed, if at all, but there is no tracking of where users go next. It has always been a bit of a guessing game as to what should go into a guide beyond a standard list of likely tools and general advice.

Over the summer of 2020, I had the pleasant surprise of receiving two full plates of unsolicited recommendations, one from the faculty in the Global Studies department and the other from the Political Science faculty. Both were lengthy documents full of titles that were important to them. They also gave insight into which library resources the faculty were aware of.

In the case of Global Studies, I had created a subject guide when the department was first established. I now had a chance to compare my original guide, built from an outsider’s perspective, with what faculty independently thought should be included. Global Studies at Temple strives to be an interdisciplinary program that ranges across the Arts & Humanities and Social Sciences, using the areas of global security, economy, and cultures as touchstones. The senior capstone projects can be on just about anything situated in a global, or at least multinational, context.

Global Studies was first headed in the mid-2010s by a faculty member out of the Political Science department, which was my subject librarian area, and with whom I had had a good working relationship for a number of years prior. In 2020, the new chair came out of the Sociology department, which had not been one of my areas at the time. A group headed by the new chair sent me a document for a proposed research guide, with specifics on each section.

Goals: The guide should:

  • Provide resources for students (touching on all three tracks: culture, economy, security)
  • Highlight issues/themes of human security, human development, gender, race, language, cultures, terrorism, environmental concerns, international trade, and international financial institutions
  • Mainstream the Global South
  • Highlight source type/variety
  • Feature access to primary sources
  • Perhaps include a guide on citations

Contents:

  • Dictionaries, Encyclopedias 
    • Provide an explanation of “using reference sources” 
  • Handbooks/encyclopedias
    • For example, The Oxford Handbook of Global Studies

The faculty member took time to review other guides in Temple’s system and pointed out to me those that might serve as “models.” They were specific about how resources should be organized, using examples from other guides.

The advice had me looking at sources in a completely different way. Faculty in Global Studies think in terms of the program’s tracks – culture, economy, security – and expect their students to identify the best resources that way. They preferred listing resources as specific titles with links to the library’s catalog entries. The suggested articles and databases showed awareness of some resources, but also a lack of knowledge of databases that could serve some purposes better than others. And the unorganized list of “Great Sources of Data” presented me with the challenge of organizing it.

Seeing what they liked about guides and what they wanted was probably a unique experience, almost impossible to replicate in this detail for other departments; it was their own motivation driving it. But it does suggest a framework for getting feedback from other departments.

In a similarly department-motivated effort, in the summer of 2020 I received a list from the Political Science department chair with the title Books TU Polisci Faculty Think Undergraduates Should Read – May 2020. It was created by faculty in the midst of the pandemic, amid uncertainty about what the university would be doing going forward and amid intensifying street protests.

Political Science Reading List Guide

The list ran to 12 pages, mainly of political classics important to faculty, along with a section on race. At the time, the library was closed to all, so I offered to organize the list and turn it into a LibGuide with links to ebooks where possible. The titles were listed under each professor’s name, so it became much like the soon-to-be-notorious practice of analyzing the bookshelves behind Zoom participants. My goal was to include a brief description of each book from available web sources. The process of putting all these titles together on a LibGuide with links was a bit mundane, but it did force me to attend in great detail to the titles and summaries of their content.

In collaborating with faculty on these guides, I acquired significant insight into how professors direct students, insight I would not otherwise be privy to, and a way to think about how well my guides reflected that and the department overall. It made me aware that, as a librarian, my interest has been in providing resources primarily to assist with immediate projects. Faculty have longer-range goals in mind, developing students beyond the assigned essay or term paper, not necessarily tied to a semester course. Finding a way to link the two approaches in a guide requires much more communication between us.

 


Discovering sources in Library Search: key takeaways from remote user interviews with history students

As a follow-up to last year’s Browse Prototyping project, Rebecca Lloyd and I conducted remote user interviews with upper-level history students in December 2020, just as the fall semester was wrapping up.

Using a semi-structured interview technique, we talked to four students to find out how they discover and use sources generally, and how they use Library Search, including how they use filters, and how they use metadata such as call number, author, and subject. All of the students were in the midst of substantial capstone projects that required finding and using at least two books. We asked them to describe their projects and to tell us about specific search strategies and tools. After the interview, we asked the students to review the Library Search interface. We were particularly interested in their use of search facets, including the new Library of Congress classification filter.

A comprehensive overview of our findings and recommendations can be found in our full report, presentation, and recording of April’s Assessment Community of Practice. For this post, we’re focusing on a few of the observations that we found most interesting and most critical for consideration as the Libraries continue to develop discovery features in Library Search.

Students do “browse” in Library Search now, just not in the way library staff may think of browsing.

All of the students we talked to reported using simple keyword searches when looking for books on their topics. These searches usually resulted in a lot of hits, but long lists of results were not a deterrent. Rather than using filters or more specific keywords to narrow a search, the students usually scrolled through long lists of search results to find the books that were most relevant to their topic. They evaluated sources quickly; most focused on scanning titles to determine whether a source met their needs. Some mentioned looking at chapter lists or other metadata to get a sense of the book’s contents or usefulness, but title and author seemed to be the most useful indicators of whether something was worth further reading.

Library Search was only one tool used to evaluate and select relevant sources.

To find sources, the students we talked to, unsurprisingly, relied on resources beyond Library Search. More surprising was that recommendations from faculty and librarians were one of the primary ways that all of the students identified key sources, especially early in their research process. One student even reported checking with their faculty advisor about a book before deciding to use it. They also relied heavily on bibliographies from past research projects as well as their previous knowledge of key authors who had written about their topics.

Most of the students preferred print books to electronic, and found browsing the shelves a valuable experience.

Being able to access electronic sources was critical during the COVID pandemic. However, most told us that they preferred using print books in general. Three of the four told us they preferred print materials for reading, one sharing that, “I feel like I’m doing, like, professional research when I’m actually looking at a [print] book. Whereas … if I’m doing it through my screen, I often feel like I’m just doing, like, busy work for classes.”

The students also told us that they liked requesting materials from the Bookbot. While they appreciated the convenience of using an ASRS (automated storage and retrieval system), all of the students shared stories of browsing the stacks in Paley or on the fourth floor of Charles. They recognized that materials physically co-located were topically similar and felt compelled to browse nearby items when they visited the stacks.

One student shared that they only came to see the need and value for browsing the stacks once they were in more advanced history courses and doing more self-directed research. They didn’t do much shelf browsing in Paley but have found the open stacks in Charles to be very useful. “It’s underrated for me because I’m a history major, the stacks on the fourth floor [are] nice to us.” Especially when doing a comprehensive research project like a capstone, the opportunity to go to the shelf to retrieve a book and then, as one student said, “look around and [see] if anything like the other title, like, on the spine caught my interest” is valuable to students.

Conducting interviews over Zoom worked really well. 

Finally, we wanted to include some thoughts about conducting student interviews over Zoom (thank you to Katie Westbrook for her question about remote interviews during the Community of Practice!). Surprisingly, Zoom turned out to be a perfect tool for user interviews. Logistical tasks like securing a private interview location, providing directions, and setting up a laptop and recording software were suddenly not necessary. Zoom made it easy to capture everything, including audio and video recordings and transcripts, in one place. Transcript cleanup was time-consuming, but far less so than if we’d transcribed the interviews ourselves.

From our perspective, conducting the interviews remotely mitigated the feeling of unnaturalness that can come with doing user research in a formal space. The students could talk with us from their own locations and use their own devices to show us how they used Library Search. Most noticeably, the uncomfortable feeling of watching and being watched that accompanies user testing was absent; the students shared their screens with us as they explored the Library Search interface, and we were able to easily see their screen interactions without looking over their shoulders.


On Inquiry, Innovation and Leadership

Say the word innovation, particularly in libraries, and we tend to think of technology. At the Ginsburg Library, this association is explicit — the space set aside for technology-rich services like 3-D printing and virtual reality application is called the Innovation Space. That’s not a bad way of helping patrons to understand libraries as more than books. 

In gathering evidence of “innovation” as part of the Values & Culture team’s work on Flying Further for the University’s Strategic Planning Steering Committee, we went first to Temple’s Office of Research for data, assuming that research grants and patents serve as a proxy for innovation. Temple excels in this area as well.

But innovation comes in many flavors, and our steering committee sought to broaden our thinking about it. In the context of libraries, innovation may take the form of new approaches to teaching, a new way of delivering services, or a fresh approach to reaching new audiences. From public programming to instruction to delivering physical materials to users — even to how we work with users and understand their needs — the libraries and press staff demonstrate over and over how innovative they can be, particularly when a goal is shared.

Sometimes innovation is a good thing. Other times, it is more effective to build on strengths, to do more of what is working well. This is where inquiry comes into play. Asking how we might do things differently, how we might do them better, or why we do them at all – that’s inquiry.

At the 2018 Library Assessment Conference, Jeremy Buhler (University of British Columbia) asked us to consider aiming for a culture of “inquiry” rather than a culture of “assessment”. By asking questions, we develop a practice of continuous improvement, of not taking current workflows and staffing models as “givens”. We collect, analyze, and share data with the intent of making decisions based on sound research and in service of shared goals. We open ourselves up to change by looking closely at staffing and training needs, revisiting policies and procedures based on data rather than historical precedent. While the word “assessment” may connote criticism and personal performance, “inquiry” is less threatening, more palatable, a practice everyone can engage with.

And where does leadership come in? There are some who maintain the status quo and make sure that policies and procedures are followed consistently. This job is critically important in the library: operations must run smoothly, and the library doors must stay open. When managers and staff members take on roles as leaders, they do something a bit different. They encourage and support inquiry. They look toward continuous improvement, build on strengths, and are unafraid to test innovative (new?) ideas and do things differently. They are willing to take risks. They may even rock the boat.

Of course the roles of managers and leaders are important and should be equally valued. Still, I like to advocate for assessment and inquiry as ways we support innovation and leadership throughout the organization. 


The Year in Assessment at TULUP: A Celebration

This week I submitted the Libraries’ annual report on assessment activities to the University’s Office of Assessment and Evaluation. It’s a requirement that I don’t particularly relish, as I often feel our approach to assessment at the Libraries is somewhat haphazard and often “just in time”. We’ve never had a formal assessment “plan”.

But I was wrong to be discouraged. In fact, our assessment capacity has grown tremendously, with full-time librarians in user experience (Jackie Sipes) and collections analysis (Karen Kohn). As importantly, many, many staff from across the organization have contributed to assessment efforts this year. So celebration and appreciation are well deserved.

Many staff were involved in the Envisioning Our Future project, conducted under the umbrella of ARL’s Assessment Framework. We interviewed staff to learn how they envisioned working in the new spaces at Charles, and conducted a second set of interviews after the move. Research team members in Phase I included Olivia Given Castello, Rachel Cox, Jessica Martin, Urooj Nizami, Jenny Pierce, Jackie Sipes, Caitlin Shanley, and Stephanie Roth. In Phase II, Karen Kohn, Rebecca Lloyd, and Caitlin Shanley made up the research team. Over 40 staff members agreed to be interviewed, many participating in both phases. The project has received wide recognition, most recently at the Library Assessment Conference as part of the session on Critical/Theoretical Assessment and Space.

The Furniture Study took a multi-method approach, using a student survey and daily observations to determine which types of furniture best supported the work students do at Charles Library. The project was led by Jackie Sipes and Rachel Cox and resulted in several changes, including repositioning tables to improve privacy and quiet for student work. This assessment was featured when the Middle States accreditation committee came to campus.

Rachel Cox and Jackie Sipes also led a signage and wayfinding project, working with staff from Access, LDSS, and LRS to identify the top wayfinding issues in the building and determine content and placement of third floor directory signs. Many of our student workers in those departments, plus LTS students, also responded to surveys and provided feedback on the re-envisioned Charles floor maps.

Gabe Galson and Katie Westbrook conducted usability testing for the ongoing work on Library Search.

Kaitlyn Semborski and Geneva Heffernan continually monitor the usage of our multiple social media accounts (Instagram, Twitter, and Facebook) to understand what works where, using that data to engage our various audiences effectively.

The Virtual Reference Assessment was one of the Libraries’ many responses to the closing of the physical collections due to COVID-19. We put into place a more visible chat widget and a request button for getting help finding digital copies of inaccessible items. Olivia Given Castello, Kristina DeVoe, Tom Ipri, and Jackie Sipes worked on this popular service. Their assessment has led to multiple changes, including refining the routing of email requests and chat follow-up tickets. The work has also enhanced the FAQ system, engineered to come up automatically when staff answer an email ticket. This saves staff time, as they can easily insert and customize the text in their replies to patrons. The Digital Copy Request system is more effective through coordination with Brian Schoolar (collections) and Joe Idell (document delivery).

We improved the user experience for Request and Retrieval through our Library Search system. The project was led by Karen Kohn, with team members Brian Boling, Carly Hustedt, John Oram, Jackie Sipes, and Emily Toner. With a goal of considering the entire experience, from making an online request to physically picking up a book, each team member brought important expertise to the project. Working remotely created challenges for some aspects of this project, like visualizing the pickup area at Charles, but they persisted. The clearer signage and instructions for use of self-checkout improve the experience of staff as well.

In addition to these projects, all profiled on our blog, Assessment on the Ground, there is much assessment work that goes on behind the scenes. For instance, we are in the process of reviewing our data collection practices through the Springshare forms. Staff involved in this initiative are Andrew Diamond, Katie Westbrook, Carly Hustedt, and Tiffany Ellis, with input from Steven Bell, Olivia Given Castello, Justin Hill, Tom Ipri, and Jenny Pierce.

Richie Holland, Marianne Moore and Royce Sargent provided insights as I refined our approach to calculating and reporting expenditures for our many survey responses (IPEDS, ACRL, Temple University Fact Sheet, AASHL). 

Evan Weinstein, Margery Sly, and Josue Hurtado helped me access data collected in their work areas to better understand how our physical spaces and services were being used this fall, particularly important as we evaluate the use of the library buildings. 

Dave Lacy and I collaborate in our work with central IT staff for understanding Charles swipe data and how best we might connect Banner and library datasets to develop visualization dashboards in Tableau. 

Beckie Dashiell and Sara Wilson are patient collaborators as we continue to streamline our workflows with the University’s Data Verification Unit. As essential as this function is, we all need patience when addressing their myriad questions, like, “Where is your documentation for the 80 goat watchers you report attending the Instagram Philly Goat Project?”

And there are important projects on the horizon. Gretchen Sneff is leading a team (Fred Rowland, Will Dean, and Adam Shambaugh) in an interview project with faculty working in the data science field. This important research, coordinated by Ithaka S+R, will combine our local data with findings from other institutions to understand research practice and potential for library services in this emerging area of need.

We are supporting our Library’s Student Advisory Board in a new way, providing a stipend for members.  This sends a powerful message to students about how we value their voice.  Thanks go to Jackie Sipes and Caitlin Shanley for leading this effort.

Finally, the Assessment Community of Practice sessions continue to be well-attended. Open to all staff, the forum provides a space for sharing our assessment work and asking new questions.

So… despite having no formal plan, we continue to engage more staff in assessment projects, understand user needs in new ways, and develop our own expertise through teamwork. All in all, a very good year for assessment here at TULUP. Thanks to all of my colleagues who contributed.


Posted in assessment methods, organization culture and assessment, service assessment | Tagged | Comments Off on The Year in Assessment at TULUP: A Celebration

The User Experience of Request and Retrieval

Earlier this year, a group was formed to consider ways to improve the user experience of requesting and retrieving items from the Charles Library BookBot. The group was composed of Brian Boling, Carly Hustedt, Karen Kohn, John Oram, Jackie Sipes, and Emily Toner. Karen led the group with UX support from Jackie. Our goal was to consider all aspects of the request/retrieval experience, from making an online request to physically picking up the book. We each brought different expertise, including knowledge of the service desk, the technology behind the request process, and the field of user experience research.

By the time the group convened, we already had a fairly long list of issues that we might address, identified by previous usability testing and staff focus groups. We reviewed the list and ranked each issue according to the potential impact on the user and the effort required by library staff to address it. Shortly after we began meeting, the library closed its buildings due to COVID-19, which affected our priorities and how we were able to work. However, with some adjustments we were able to complete three projects: Retrieval Times, Request Button Clarification, and Encouraging Self-Service at the One-Stop Desk.

Retrieval Times

We began with a relatively simple project to come up with language that would appropriately set patron expectations around retrieving a book from the BookBot. We knew from focus groups we had conducted with public services staff that they were fielding many questions about how long retrievals would take, and that their responses ranged from ten minutes to an hour.

To decide on a message, we first needed to learn how long requests were actually taking, which we did by looking at data that Karen had compiled on requests and retrievals made from the start of the Fall semester 2019 to the March closure. Looking only at requests made while the crane was in operation, we saw that more than half of requests were delivered within 5 minutes and 87% within 20 minutes. The average retrieval time was just under 23 minutes.
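The percentile summary above can be sketched in a few lines. This is an illustrative reconstruction, not the actual analysis script; the timestamps below are invented sample data, chosen so the output echoes the pattern in the real figures.

```python
# Illustrative sketch of the retrieval-time summary described above.
# The timestamps are invented sample data (the real analysis used request
# and delivery records from Fall 2019 through the March closure), chosen
# to echo the real pattern: most retrievals are fast, while a few long
# waits pull the mean well above the median.
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

# (request placed, item delivered) pairs
transactions = [
    ("2019-09-03 10:00", "2019-09-03 10:04"),
    ("2019-09-03 10:10", "2019-09-03 10:13"),
    ("2019-09-03 11:00", "2019-09-03 11:18"),
    ("2019-09-03 12:00", "2019-09-03 13:05"),
]

# elapsed minutes for each transaction, sorted ascending
minutes = sorted(
    (datetime.strptime(done, FMT) - datetime.strptime(req, FMT)).total_seconds() / 60
    for req, done in transactions
)

within_5 = sum(m <= 5 for m in minutes) / len(minutes)
within_20 = sum(m <= 20 for m in minutes) / len(minutes)
average = sum(minutes) / len(minutes)

print(f"{within_5:.0%} within 5 min, {within_20:.0%} within 20 min, mean {average:.1f} min")
# → 50% within 5 min, 75% within 20 min, mean 22.5 min
```

In this toy sample, as in the real data (mean just under 23 minutes), a single long retrieval drags the average far above the typical wait, which is one reason the patron-facing message leads with the 20-minute figure.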

Next we needed to turn the numbers into a concise message for patrons. We conducted a structured group brainstorm, where each of us wrote our own version of a message that reflected the average retrieval times we saw in the data. We then shared our individual messages with the group. The chat window in Zoom works well for this process, which became a very familiar one for us! By noting what we liked about each other’s wording, we came to consensus on the following message:

“The Bookbot typically delivers items within 20 minutes. Requests placed outside of operating hours will take longer.”

Unfortunately, because the library was closed at the time, it did not make sense to add this message to the website. We expect that the time it takes to retrieve an item may be somewhat different now, due to the absence of student workers to help with the process and the lower volume of requests. We hope, however, that we can use similar phrasing in the future with an updated estimate of retrieval times.

Request Button Clarification

Our next project addressed an ongoing problem with the Request feature in Library Search. Users often could not tell which items were requestable and which were not, and the website did not explain the logic behind why certain items could not be requested. We’d heard from public services staff that items in the fourth floor open stacks were particularly problematic; users try to request those items and can get frustrated or confused when that option turns out not to be available. After Emily explained some of the technical constraints that impact how request options are presented, we had a better sense of the scope of potential changes we could suggest. At this point, we left open the possibility that the solution could be either a change in wording or in the functionality of the Request button.

Because there were several different ways we could potentially approach this problem, the group took some preparatory steps before brainstorming solutions. First we wrote a problem statement, which defined the problem as being related to both user expectations and communication. Next we reviewed logs of virtual reference questions. Karen arranged the logs on a virtual whiteboard, which allowed us to cluster “sticky notes,” putting similar questions near each other. The reference questions confirmed what we’d heard from staff already – they were indeed getting a lot of questions from users attempting to use the Request button for items on the 4th floor of Charles! Reading through these reference transactions also provided us with some interesting new information. Patrons do not actually mind retrieving items themselves from the fourth floor; they just don’t know that this is what they are expected to do. Self-service does not need to be presented apologetically. Another finding is that while we’d initially seen communication as an issue, staff had many successful ways of communicating to patrons the need to retrieve items themselves.

Patron questions represented on clustered virtual post-it notes

Our next step was to clarify for ourselves the policy regarding self-service, using a Five Whys exercise. Using several use cases, we took turns asking “Why can’t I request this?” and then countering the answer with another “But why?” We had fun pretending to be challenging patrons, and as we did so we started to see the logic of why certain items or locations are requestable and others not. We realized that, despite the complicated programming logic behind how the Request button worked, the human logic was relatively simple: an item is not requestable if we believe the patron can get it themselves (i.e., it is in open stacks on the patron’s home campus).
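That human logic is simple enough to express as a one-line rule. The sketch below is hypothetical, not the actual Library Search code; the location and campus names are invented.

```python
# A hypothetical sketch of the "human logic" above: an item is not
# requestable when the patron can get it themselves, i.e. it sits in
# open stacks on their home campus. Not the actual Library Search code.
def is_requestable(location: str, item_campus: str, patron_campus: str) -> bool:
    in_open_stacks = location == "open stacks"
    return not (in_open_stacks and item_campus == patron_campus)

print(is_requestable("open stacks", "Main", "Main"))    # False: patron can fetch it
print(is_requestable("open stacks", "Ambler", "Main"))  # True: different campus
print(is_requestable("BookBot", "Main", "Main"))        # True: closed storage
```

The real button behavior involves more technical constraints than this, but the rule captures the policy the Five Whys exercise surfaced.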

With the situation clearer in our minds, we were able to brainstorm solutions. We changed the text on the button from Request to How to get this. We wanted to use language that conveyed that requesting is not the only way to get an item. For much of our collection, there are a variety of ways to obtain a desired title.

How to get this button in Library Search

With design support from Rachel Cox and more group brainstorming (we got very good at brainstorming phrasing together) we added information about retrieving items from open stacks. When a user clicks the How to get this button for an item in any open stacks location, one of the options they now see is Find item on the shelves. The text instructs the user to “Close this window to view the location and call number, then find the item using this information.” An added benefit of this new design is that the How to get this button provides a place to offer a range of options for obtaining an item. After the building reopened in August and our books were once again available in physical form, we continued to offer the popular Get Help Finding a Digital Copy service alongside the options for getting a physical copy. This service is now offered as a link within the How to get this menu.

Menu that appears after clicking the How to get this button

Future assessment is needed to determine if these changes helped to clarify the request menu options for patrons.

Encouraging Self-Service at the OSAD and Hold Shelf

Several of the issues we had previously identified as high-priority related to patrons not realizing certain services were designed to be self-service, such as picking up requests from the Hold Shelf. As Charles Library reopened to patrons in August, the group looked for ways to encourage self-service in order to reduce person-to-person contact between patrons and library staff.

Because we wanted to move quickly on this project, we did not follow all the steps of a formal design-thinking process. We identified the most critical information for successful self-checkout and then brainstormed how to communicate that information at key touchpoints. To encourage self-service, we wanted to communicate five messages to patrons:

  1. Go directly to the Hold Shelf
  2. Books are alphabetized by the first four letters of your last name and the last four digits of your TUID
  3. Books are not yet checked out to you
  4. Please use the self-checkout machine
  5. Return items on the cart
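The shelf-ordering rule in message 2 can be sketched as a simple sort key. The names and TUIDs below are invented examples.

```python
# Sketch of the hold-shelf ordering rule: items are shelved by the first
# four letters of the patron's last name plus the last four digits of
# their TUID. Names and IDs here are invented examples.
def hold_shelf_key(last_name: str, tuid: str) -> str:
    return last_name[:4].upper() + tuid[-4:]

holds = [("Garcia", "915551234"), ("Gardner", "915559876"), ("Lee", "915550001")]
labels = sorted(hold_shelf_key(name, tuid) for name, tuid in holds)
print(labels)  # → ['GARC1234', 'GARD9876', 'LEE0001']
```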

These messages were incorporated into the whiteboard signs near the desk, which Katerina Montaniel and Emily Schiller redesigned. Carly also arranged for the paper sleeves on Resource Sharing books to contain a note asking patrons to please check out the item. She also designed 8.5” x 11” signs to sit in plastic holders on the Hold Shelf saying “Please remember to check out your items.” Jackie and Rachel Cox worked on signs for the self-checkout machines identifying them as such. As most of our team members were not working on-site, we relied heavily on photographs from John Oram of the OSAD/Hold Shelf area, as well as assistance from Carly and Cynthia Schwarz for sign placement.

Hold Shelf sign created by Emily Schiller

Unlike the previous project, we started this one with a clear sense of the problem and did not need to spend time defining one. Our goal was to nudge patrons toward self-service in the hopes of limiting contact and creating a safe and healthy environment for everyone in the building. However, data from LibInsight questions recorded at our service desks was helpful in understanding which parts of the pickup and checkout experience were confusing for patrons.

We have already begun to assess the effectiveness of our solutions with a few different strategies. We surveyed OSAD staff about the perceived effectiveness of the whiteboard signs and made some changes based on this feedback. Brian Boling created a report using Alma Analytics of checkouts from Spring and Fall 2020, with a breakdown of staff-mediated vs. self-checkouts. The report showed us that even before our interventions, patrons were already substantially more likely to use the self-checkout machines than they were in the Spring semester. We plan to use this report as a baseline to see if future changes will make the percentage of staff-mediated checkouts decrease even further.
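The baseline comparison from that breakdown reduces to simple percentages. The counts below are made-up placeholders, not the actual Alma Analytics figures.

```python
# Made-up checkout counts illustrating the baseline comparison; the real
# numbers came from an Alma Analytics report broken down by semester.
checkouts = {
    "Spring 2020": {"self": 420, "staff": 980},
    "Fall 2020": {"self": 1150, "staff": 350},
}

# self-checkout share of all checkouts, per semester
shares = {}
for term, counts in checkouts.items():
    total = counts["self"] + counts["staff"]
    shares[term] = counts["self"] / total
    print(f"{term}: {shares[term]:.0%} self-checkout")
```

Rerunning the same calculation on future semesters’ counts gives a trend line for whether staff-mediated checkouts keep declining.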

The group is on pause right now, but some of our recommendations will be passed on to others in the Libraries, and we hope to keep assessing the effectiveness of the changes we’ve made. Learning about and following the design thinking process has been enjoyable and using data to make improvements to our services feels satisfying. We hope our work has benefited our patrons and colleagues.

Posted in access, assessment methods, data-driven decision making, library spaces, user experience | Tagged , | Leave a comment

Working Together for Improvement: The Digital Access Workflow

When the library closed its physical doors in March, new doors of the digital sort opened up. Yet the months-long disruption of access to physical materials has yielded a reworking of the processes by which we get our students and faculty the resources they need for their teaching and learning.

For this month’s post, the heads of Charles Library Access Services, Acquisitions & Collection Development, and Learning & Research Services’ social science unit (Justin Hill, Brian Schoolar & Olivia Given Castello) sat down with me to discuss recent improvements in how we provide patrons access to digital materials.

It started with the Get Help Finding a Digital Copy service, initiated when we closed the library buildings. When a patron is searching the library catalog and discovers a physical item of interest, Get Help Finding a Digital Copy appears as an option.  

The request is routed to a virtual reference staff member, who reviews multiple sources to find and point the patron to an electronic version of their desired item. When the Libraries had access to the HathiTrust Emergency Temporary Access Service, over 40% of our print collection was available digitally. And of course, there are other sources for e-books, both open access and for purchase. Learning & Research Services (LRS) librarians, and other virtual reference team members, were busy fielding dozens of requests each day for these digital copies. This service continues to be incredibly popular.

How did this success lead to a change in workflow?  As summer went on and emergency access options were expiring, the success rate for Get Help Finding a Digital Copy request fulfillment declined. LRS and Collections Management staff collaborated to design a new workflow that involved Acquisitions staff more directly in the fulfillment process. This allowed them to maximize the possible purchase options and improve the fulfillment success rate.

At about the same time, the Access Services department was moving to provide all digital copies for course reserves. In the course of providing faculty with options for their course reserves, they also took advantage of this new workflow by steering the requests for e-books to Acquisitions.  

Moving to electronic course reserves opened up other opportunities, like introducing faculty, staff, and students to our services for scanning book chapters and sending them directly (and quickly) to the patron via document delivery. Even better, faculty will learn how to get their course reserves on Canvas so that students have ready access to the materials.

What made these collaborations between departments work?  

  • Good communication between the departments to facilitate the best solution to a problem.
  • Willingness of staff to bring their expertise to developing the most efficient workflow and to work together in new ways.
  • And of course, a shared commitment to creating an excellent experience for users.

So how is this assessment? Reflecting on our work and how it might be improved is an important kind of assessment. There are also numbers to show increasing requests and improved turnaround time for those requests. Additionally, we can see success in the many thank-you notes received via email, high satisfaction ratings on virtual reference, and most importantly, the pride of continually improving our services to patrons, even when challenged by disruption.


Posted in access, digital collections, organization culture and assessment | Tagged , , | Leave a comment

Steering Straight: Continuous Improvement and the SSTs

It’s been almost four years since we established the first Strategic Steering Teams at Temple University Libraries/Press. Those first two groups, Research Data Services and Scholarly Communication, are now part of a group of six, alongside Outreach and Communications, Learning and Student Success, Collections Strategy, and Community Engagement. Over 60 staff members from throughout the organization have participated as team members or leaders, and many more have been engaged with subgroup projects.

One of the things that we do annually is an informal “assessment” of how the teams are doing.  We’ve done this in different ways. I have regular one-on-one conversations with team leads, we meet together, and the team leads conduct check-ins with their teams. While these are not formal assessments, we strive to be open to discussing what’s working and what’s not working so smoothly. 

Here’s a summary of recent conversations with Will Dean, Annie Johnson, Vitalina Nova, Brian Schoolar, Caitlin Shanley, and Sara Wilson. 

How are the teams going? What’s working well for you as team leaders?

For the most part, teams are going well. Activity slowed down during the summer, and the pandemic has also had a real impact, particularly for those with children or other additional responsibilities while working from home. More time is being spent at meetings checking in with one another. One of the values expressed more than once was the team members’ comfort level with one another, so that these meetings serve as “safe” spaces for sharing concerns and anxieties about what’s going on. 

This is the time of year when new members are brought into the groups, and this means adjustment and regrouping. Strategies for doing this are:

  • Review of the charge and reworking of goals
  • Evaluation of goals and projects with an eye towards deciding what to continue and what to let go of
  • Establishing new project groups, particularly ones for new members with new interests to take on

What, if any, are the challenges?

In this environment, it may be hard to feel connected to how the university is functioning when we are so far apart. 

The membership structure for the teams is designed to allow for new members to join each year, although there is no fixed term for staying on the team. The teams may find that balancing new initiatives with ongoing work can be tricky, particularly as new members come on board. Some members may want to stick with the “tried and true” and others want to start new projects. 

Where do you see the group’s work focusing in the next year? What kind of support would be useful to your team in moving forward with its goals?

Most groups are finalizing priorities and goals for the upcoming year now. It was agreed that having a clear sense of the Library/Press’ strategic directions and priorities will be important for the teams’ planning. The leads confirm that the Strategic Steering Teams are an effective way of moving forward on strategic initiatives without the “administrative overhead” of a department. 

There are areas, like research data services and scholarly communication, where the services and training would just not happen without the “legwork” of the team.

For team leaders, who do not formally supervise team members, it can be a challenge to delegate tasks, and to ensure that team members do the tasks they commit to. There is not an agreed-upon time commitment. It varies by group and by individual. While the team leads serve on the Libraries/Press Administrative Council, they are leading teams, not departments. They lack a “clear path” for acquiring budget resources to do their work. 

In spite of these challenges, the effectiveness and value of the teams’ contributions to the organization are most clearly demonstrated by their work supporting our strategic objectives. Take a moment to review all they are doing, at:

Strategic Steering Teams on Confluence


Posted in organization culture and assessment | Tagged , | Leave a comment