The Continuous Process of Keeping Our Student Workers “Up to Speed”    

The One Stop Assistance Desk could not function without our student workers. Making sure they are supported in the highly visible and essential work they do, managing desk duties when we’re at home asleep, is critical. Stefan Del Cotto, the student supervisor, is continually finding new ways to connect with them and keep track of what they’re doing — during business hours and late into the night.

The Student Interaction Tracker helps with this. It’s a Google Sheet that students use to quickly document any type of interaction, from book checkouts to printing issues to reference and referral.

The Tracker is important for understanding trends using real numbers rather than anecdote as we make decisions about scheduling needs. But Stefan goes beyond the number crunching and employs the Tracker as a tool for improving the training of students. 
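To make the scheduling idea concrete, here is a minimal sketch. This is not the team's actual workflow (the Tracker is a Google Sheet, and the column names below are assumptions), but it shows how an exported interaction log could be tallied by hour to see how busy the late-night shifts really are.

```python
# Hypothetical sketch: tally interactions per hour from a CSV export.
# The "timestamp" and "category" column names are assumptions, not the
# real Tracker's schema.
import csv
import io
from collections import Counter
from datetime import datetime

# Sample rows standing in for a real export.
SAMPLE = """timestamp,category
2022-02-01 22:15,tech
2022-02-01 23:40,informational
2022-02-02 01:05,tech
2022-02-02 22:50,checkout
"""

def interactions_by_hour(csv_text):
    """Count interactions per hour of day to spot busy (or quiet) shifts."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        hour = datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M").hour
        counts[hour] += 1
    return counts

counts = interactions_by_hour(SAMPLE)
print(counts[22])  # 2 interactions fall in the 10 pm hour
```

Run against a real semester's export, a tally like this would show whether the overnight hours see enough traffic to justify staffing the desk.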

I spoke last week with Stefan about the Student Interaction Tracker. Here are some excerpts from our conversation. 

Tell me a little bit about the Student Interaction Tracker that you use at the desk. When did it start? What is its purpose?

The Student Interaction Tracker originated as a tool to see how busy the students were during the night shift – what kinds of questions they were getting. We wanted to know if it was worthwhile keeping the desk staffed during those late-night hours. It provided useful insight into what was going on.  

We opened up the Tracker to all students in the fall of 2021. We used to have 10–15 students; now we’re down to 6. We have always had to look at data to plan optimal times for scheduling those students.

The ultimate purpose of the Tracker is to gauge what questions are being asked, and this information translates into our training. We ask the students about the questions that were troublesome to identify areas for improvement. It’s important for students to know that their performance isn’t evaluated with this data — we have other processes for that.  

What were your design considerations to make sure that the students used the form? 

The categories of interaction type are color-coded: purple for tech questions, green for informational. We want students to easily get their check marks in the right place. Designing a good form is a fine balance of having enough categories that data is captured accurately but use of the form isn’t overwhelming. We’ve added some data-entry shortcuts to get the day/time stamp into the form, and we try to make it as seamless as possible.

How has it been received by the student workers? 

The biggest surprise was actually the high compliance in use of the Tracker! I thought I’d have to remind them, but students are using it faithfully.  We ask for feedback from students at the end of each semester, and they say the tracker is easy to use. And they provide suggestions, like noting when they help a patron with self-checkout.  

Students know that we look at the form regularly, because we follow up with them on their questions, and we meet with them weekly to review.   

As the student supervisor, how have you used this data? 

I use the data to make sure students are getting accurate information when they need it. I’m also continually refining our training program for students. For instance, course reserves may get less attention than in the past, and technical questions related to printing and computers get much more.

Tell me about a specific improvement that you’ve made, based on something you’ve learned.

We’ve refined the form over time. In the beginning, we were getting a big percentage of transactions in the “other” category, so things are more split out now. In terms of our practice at the desk, we’re reconsidering our seating positions at the One Stop so that students are approached first. We’d like them to be the first to address questions, with staff serving as backup when necessary.

They do well with patrons who are learning to use the Self-Check. But they can pretty much handle most everything that comes their way. 

Thanks, Stefan. This is a great example of something we do that supports students in a way that really enhances the user experience too. It’s a win-win.

Posted in access, process improvement, service assessment

Envisioning our Future: The Constant is Change

Last week I shared some highlights of the Envisioning our Future interview project with TULUP staff. In this post I begin to pull together themes that emerged over the course of the project’s three phases: organizational communication, change and connection.  

We began this research journey in early 2019 as library staff at Paley Library were planning the move to Charles. In proposing this assessment project to the Association of Research Libraries, I referenced the Atheneum21 report on successful implementation of digital strategies in libraries. Success, they said, is about the people and the culture, not the technologies themselves.  This new space, Charles Library, would be full of exciting technologies for collaboration and research, for staff and for users. We wanted to explore how our people, the TULUP staff, imagined the spaces changing their work. What are the opportunities, and what are the envisioned challenges?

The research design was qualitative: semi-structured interviews. We asked staff about the spaces they used for work, as individuals and with colleagues and users, and how they felt supported in making those changes.

In reflecting on the new space, one interviewee says,   

In order to survive and thrive in the new building, we’re going to have to work together more closely. I expect that some people will be resistant to that sort of change, but I think by and large people, especially in the beginning, will be receptive, because we’ll be in the new, really unique space. (p. 9)

I think it will be a nice opportunity for me to chat with some people that I don’t normally interact with. (p. 9)

From: Envisioning our Future, Report on Findings and Staff Conversations, August 12, 2019

In Phase II we conducted a second round of interviews about six months after the move to Charles, but just before the pandemic hit. Findings were reported in July 2020 as The Future is Now: How We’re Working at Charles. Not surprisingly, change and communication were central themes emerging from this round of interviews. We noted that clear and regular communication about operations during the change was key – staff needed ready access to information about known problem areas in the building and technology. Staff working on the front lines, using instruction and consultation spaces, and helping users at the service desks, felt particularly frustrated by unexpected disruptions. Noise! Technology glitches! Help!

But staff recognized these early days as transitional and once again, there was a sense of optimism about the future:  

Change is hard. I feel like in a couple of years it’s going to be fine. I think sometimes it’s a little demoralizing that people aren’t seeing the potential of the building. They’re expecting everything to be perfect right away. 

After three, four months I’m starting to feel like this is where we are. This is our future. (p. 23)

We reported on perceived changes in organizational culture. Many interviewees spoke to the new physical spaces necessitating a shift in how staff work together, “from shared workspaces to increased need for inter-departmental cooperation in providing positive service experiences for our users.”  

I think that we need to be talking a lot more about how to create a workplace culture that is going to work, where we have respect for each other’s boundaries and space in a way that we’ve never had to talk about before.

From: The Future is Now: How We’re Working at Charles, July 23, 2020

We were getting used to the new spaces and finding out how they worked for us most effectively.

But of course the story does not end there. Just as we were wrapping up our interviews for Phase II of research in spring of 2020, the library buildings closed due to COVID. As an organization, we faced a new kind of hardship that brought us together to solve challenges, but divided us in other ways. Envisioning our Future: The Pandemic Changed Everything (completed in March 2022) provides a third “installment” to this exploration into how space supports our work. Again we asked staff, from across the library/press system, about physical and virtual spaces, onsite and remote. This hybrid work environment adds another dimension to our use of space, particularly in how we communicate and collaborate with colleagues.

Key findings relate to these “cultural” implications. For instance, several staff described a perceived loosening of collegial ties, particularly with those outside their immediate department or project teams. 

I worry what Covid has done to our cohesion as a staff and as an organization. I worry that it has taken an organization that was already pretty siloed and made it even more siloed because you’re not having those chance encounters. (p.15) 

There’s a lot of flexibility in our schedule and it’s very nice for life. But it’s weird for any kind of collegiality. (p.15) 

From: Envisioning our Future Phase III: The Pandemic Changed Everything, March 29, 2022

We’ve now been envisioning our future for three years. What a privilege it has been to work with three great research teams, to conduct, read, and reflect on 86 interviews, and to get at this deep level of organizational self-study. As we look to the future now, we need to look closely at how the hybrid work environment impacts our organization – from basics of technology and communication practices to how we collaborate effectively in our work. How do we move forward together, in unison and in service to the University? How do we create a sense of community and organizational cohesion, particularly as we onboard new staff members? And importantly, how do we best support one another during these periods of stress and disruption – as we learn that this change is not temporary, but a constant?

Posted in library spaces, organization culture and assessment, qualitative research, research work practice

Strategies for Growing Our Institutional Repository: Evaluating TUScholarShare’s CV Review Service Training for Liaison Librarians

A guest post from Alicia Pucci and Annie Johnson 

Since the launch of Temple University’s first institutional repository (IR), TUScholarShare, in fall 2020, Alicia Pucci, TUScholarShare’s administrator, and Annie Johnson, assistant director for open publishing initiatives and scholarly communications, have tested strategies for raising awareness of the service and depositing more content. In this guest post, they describe a project with liaison librarians, undertaken with support from the Scholarly Communications Strategic Steering Team (SCSST), that was designed to address some of these challenges.


The CV review service is one of four mediated TUScholarShare deposit services offered to Temple faculty and staff. Through this service, TUScholarShare staff review a CV or list of publications, determine copyright and permissions for every work, and deposit permitted scholarship to the repository on the author’s behalf. With this “we’ll do it for you” service model, depositors do not need to do any heavy lifting to self-archive their scholarship, which also makes this deposit option a good first step toward familiarizing themselves with TUScholarShare. Like other institutions that have a similar service in place for their IR, we currently rely on a student assistant to help with this review work. Yet, in spite of this support, our most popular service among faculty depositors remains a labor- and time-intensive workflow for the TUScholarShare team. How do we actively promote the use of this service to faculty while making the workflow more sustainable? We decided to try turning to our colleagues in Learning and Research Services for additional support.

In fall 2021, we asked liaison librarians to volunteer to review at least one CV for a faculty member from their department by the end of the fall 2021 semester. In order to support them in undertaking this work, we conducted a two-part training on the CV review service for liaisons and other interested library staff. The greater part of the training focused on showing liaisons how to do the first two parts of the CV review workflow: conducting targeted outreach to their faculty to obtain a CV and reviewing copyright and self-archiving permissions of the faculty member’s scholarship.


Our objectives for this project were twofold. First, we aimed to raise awareness of TUScholarShare among faculty. By drawing on the existing rapport between liaisons and their departments, we hoped to initiate a conversation about the CV review service and establish a level of trust in TUScholarShare. Second, by asking liaisons to conduct targeted outreach to their departments and review CVs, we aimed to create a more streamlined and sustainable workflow for depositing new content into the repository.     

Key Findings 

Overall, 11 liaisons participated in the training and 10 were able to obtain at least one faculty CV to review. By the end of the project, 10 CVs (across 5 different schools/colleges) were reviewed and a total of 93 new items were deposited to the repository. Once each liaison completed their review, they were asked to complete a short survey. The survey found that:

  • Faculty were more responsive and willing to submit their CV for review if they had a previous interaction or an existing relationship with the liaison. Only 1 liaison was unable to receive a response from the faculty they reached out to.
  • The total time it took for liaisons to review 1 CV varied greatly, from 1 hour to 4 days. However, the time frame for liaisons participating in this project was stretched out over a span of 1-3 months. 
  • Liaisons felt more confident talking about the service and related topics (i.e. Creative Commons licenses, pre-prints, post-prints, permissions) after completing the review process.
  • Some of the benefits of participating in this project noted by liaisons included: “opened a channel of communication with faculty on publishing topics,” “gained insights into faculty publishing habits,” and “learned how to track down hard-to-find materials.”
  • The most common challenge among liaisons was understanding how to read and navigate the publisher policy database Sherpa Romeo. An in-depth review of this tool was not covered in the training.

While the project was a success in that liaisons are now more prepared to educate faculty about the service, their targeted outreach efforts did not garner much interest, and no additional CV requests were received outside of the project. In addition, while new content was deposited to the repository, the workflow was slower than its typical turnaround time because it required more oversight and support from the TUScholarShare Administrator, Alicia Pucci. Finally, many liaisons understandably found it hard to make the time, with all of their other responsibilities, to complete this extra work. Although all the CV reviews were supposed to be completed by the end of December 2021, many liaisons were not able to finish until spring 2022.

Recommendations for Future Involvement

Members of the SCSST sat down to discuss the results of the survey and to brainstorm strategies for moving forward with this work. Some members of the team had also participated in the project reviewing CVs, so they were able to expand on their responses to the survey. Ultimately, the group concluded that this model would not work as a permanent method to grow the content in the repository and make the CV review service more sustainable. More staffing, whether in the form of student workers or permanent staff members, is needed to continue to grow this service.

Other ideas and thoughts that came out of the SCSST’s assessment of the project include:

  • Create more personalized outreach emails about the service to faculty that appeal to them (e.g. their discipline, their role in academia) and explain the benefits of self-archiving their work beyond discovery (e.g. built-in metrics to reflect a scholar’s influence and impact, see the work of their colleagues in the repository).
  • Identify liaisons from the project who are interested in doing more CV review work in the future. If TUScholarShare receives a CV review request from a faculty member in their department, and the liaison has time, they can do the review themselves.
  • Consider other ways to scale up the service workflow:
    • Approach a specific department on campus to work with to review CVs for all of their faculty members. A large-scale project allows for a more flexible turnaround time to complete this work.
    • Review CVs for faculty members from similar disciplines, especially if they are known to publish in the same outlets. This would cut down on the time it takes to check publisher permissions. 
  • Consider alternative ways for liaisons to get involved in this work without having to review CVs:
    • Instead of having liaisons do the initial full CV review, they could do upkeep on CVs of faculty from their departments (i.e., check for new publications to deposit to the repository).
    • Outline different levels of involvement that liaisons can choose from that work best with their current workloads/schedules.

In light of our results, we are currently reassessing how to continue to grow the repository and how to involve liaisons in repository work. With a nascent repository, should we prioritize creating a sustainable flow of content into it to demonstrate its relevance to faculty? Or should our focus be on educating liaisons about the benefits of the service and the type of work it entails so they may become better advocates of the repository? Currently, with our limited staffing, it is simply not possible to do both.

Special thanks to Michelle Cosby, Kristina De Voe, Lauri Fennell, Keena Hilliard, Rebecca Lloyd, Ryan Mulligan, and Natalie Tagge from the SCSST, as well as to all the liaisons who volunteered to participate in this project.

Posted in process improvement, research work practice, service assessment

Furniture feedback in Charles Library: part II

Excerpts from the report on the fall 2021 Charles Library furniture study

In October of 2021, a variety of new furniture options for Charles Library were placed on display in the first-floor event space. From 8 am to 8 pm, for three days, we invited students to test the new furniture and give us their feedback. In coordination with Libraries Facilities Services, I gathered and documented student feedback. Library staff from various departments volunteered to help in the event space and talk with students during the busier times of the day, and the Outreach and Communications team helped me get the word out about the event.

In a previous furniture study in 2019, we learned that students were dissatisfied with the primarily open table/group study seating in Charles. Students wanted seating options that supported focused, individual work – such as private or semi-private carrels – and furniture that was comfortable and supportive enough to work in over long periods. “Cozier” lounge furniture was also desired.

The furniture display featured a study carrel, semi-private booths, laptop tables, and various other seating options.


Taking inspiration from a study at the University of Arizona Libraries, I used a mixed-methods approach to gathering feedback. In the event space, students could fill out comment cards, write on large whiteboards, or chat with us about the furniture. Each item was tagged with a letter for easy identification, e.g. the high-top table and chairs were assigned the letter “H.”

Using different methods allowed us to get a sense of how students liked the pieces individually and to compare pieces that were similar. With COVID safety measures still in place, we also offered online options, Instagram polling and online comment cards, so that students could participate remotely. In-person participants were offered individually packaged snacks in exchange for their time and feedback.


Overall students were pleased with the new furniture choices and excited to give their feedback.

Students liked large work surfaces that can accommodate many study materials at once.
Students want a space where they can spread out laptops, books, notebooks, etc. The laptop table on display was way too small.


laptop table D

“No. Please. No. No. Please. Where am I supposed to put any of my books?” (laptop table D)

The booths, carrel, high-back sofa, and grey chair were favorites.
The most popular pieces were those that felt comfortable but supportive, offered some privacy, and could accommodate a range of activities, including individual work and socializing.


booth G

“Super fun, I feel like I’m in a cafe without the pressure of ordering so that I don’t have to leave. Has space for a drink, laptop, books, your backpack and jacket, everything! and I can imagine it being great for convos or studying with a friend” (booth G)



booth K

“I love this one! the privacy and comfort levels are amazing in this booth! I am comfortable yet focused and can work with somebody else” (booth K)

“I love this piece of furniture, the cushion is well designed on the chairs just about every angle I’m sitting in is comfortable — and the table is a perfect size for just about anything” (booth K)



soft seating A

“Comfort with structured back for support and study, love that…” (chair A)

Soft seating got mixed reviews.
Many students found F to be super comfy, but saw it as less conducive to studying and focused work.


soft seating F

E was one of the least popular pieces. Some found it too upright and firm, though others did like the color and noted that it could be a good nap spot.


soft seating E

“Feels like I’m awaiting trial in a sci-fi film.” (soft seating E)


Accessibility was a critical factor for those with and without disabilities.
While booth G was one of the most liked pieces, students noted that it was not accessible for those in wheelchairs. Because it wasn’t adjustable, maneuvering in and out of the high-top booth seating seemed potentially challenging for all bodies.

There’s a lot of variation in student preferences for study spaces.
Talking directly with users gave us more insight into preferences around comfort and seating types. While most students loved the privacy of semi-enclosed furniture like the carrel, others described the carrel as too isolating. Being able to look up and see other students at work was motivating, and for some, open study areas felt more productive.

For more details

Take a look at the full report and full list of the furniture that was on display.

Posted in library spaces, qualitative research, surveys, user experience

Supporting Big Data Research at Temple: Reflections on the Research Process

Congratulations to the research team that just completed its report on Supporting Big Data Research at Temple University. Freely available on TUScholarShare, the report will, I think, be widely read and will have real impact in contributing to the conversation on support for data research at Temple.

Rather than use this space to regurgitate the report’s important findings, I asked the team about the research experience itself. Here is an edited version of that “interview” conducted via email.  Contributing comments are from:  Will Dean, Fred Rowland, Adam Shambaugh and Gretchen Sneff, as well as Bernadette Mulvey (executive director, Information Technology Services) who assisted in the work.

From your perspective, what was the value of using this semi-structured interview method for the questions you had about support for big data research at Temple?

GS: We asked big data researchers and researchers who used data science methods about their research methods, their training, their data sharing and publishing practices, and so on. These questions are relevant to us as a university unit that supports academics and research. The specific questions we asked had been tested and modified, as needed, by the project coordinating team at Ithaka S+R.

Four of us on the study team conducted interviews, so using a semi-structured interview method helped ensure that we covered all the topics of interest in each interview. Because it was semi-structured, we could also go into more depth or breadth as needed.

The transcripts of our conversations with Temple researchers were shared with a central coordinating research team at Ithaka S+R.  Because the study teams at all 21 participating institutions used the same semi-structured interview questions, Ithaka S+R was more easily able to work with responses across institutions.   

AS:  The semi-structured approach allowed me and the other interviewers an opportunity to understand the complexities of data science research. We were able to ask follow-up questions when necessary, and the researchers had the opportunity to reflect on the specific circumstances surrounding their research. 

FR:  Since I was unfamiliar with this very complex topic, it helped to be able to have a free-ranging conversation to tease out the many, many things I did not understand. It also allowed me to pursue some issues that I found interesting. It allowed me (and us) to get a sense of how the researchers thought about and processed their work. 

What did you find challenging about using a qualitative approach in understanding how big data research is supported at Temple?  

WD: It was hard to know how much weight to give a participant’s comments, suggestions, and explanations, since we were only interviewing a small number of people and only those who agreed to speak with us. Though we tried to recruit a variety of participants from across disciplines, it was hard to know whose valuable insights we did not hear. 

BM: The qualitative approach, to me, left a lot of room for interpretation. It was interesting to see how I, as a technology veteran, interpreted the interview transcripts as opposed to how the other team members did. It was difficult to understand the science being discussed, on top of understanding the researchers’ use of technology and the resources that they have available in the science community.

In terms of the process of participant recruitment, data collection and management, and analysis and report writing – what aspects of the process were most challenging?

AS: The biggest challenge came from synthesizing the report into a cohesive document that did justice to the varied landscape of data science research at Temple. 

GS: Surprisingly, collaborative writing and document sharing were challenging. We were all used to using Google Drive to collaborate on writing documents. We switched to using Microsoft OneDrive at the start of the project. Because document sharing operated differently than expected, a couple of us lost work that had taken us hours to do.

Our study was done during the pandemic and interviews were conducted over Zoom. Connectivity and audio quality were sometimes an issue. Recruiting participants who met study criteria—big data researchers or researchers using data science methods—required more time and effort than expected.

What was the most interesting, or surprising thing, that you learned from the project? 

FR: Thinking back on that first pass through the 14 interviews, when we understood so little, it is interesting – though not altogether surprising – that we accomplished so much with this report. It shows what a long persistent slog through dense material – step by step sense-making – can reveal. 

AS: I’m not sure I found anything particularly surprising, but all of it was interesting! As a liaison librarian, I found this work invaluable in understanding the evolving research needs of faculty on Temple’s campus. 

WD: The most interesting thing, to me, was the window into the work of the researchers that our interviews provided. Research products are often compact and cohesive (a paper delving into a single intervention, a clean dataset with clear variables), and it was enlightening to hear researchers talk about the complexities of their work and the essential labor of many collaborators that makes it happen.

GS: The Temple researchers we spoke with, for the most part, expressed a commitment to sharing their research data. Their interest in sharing data went beyond meeting the mandates and requirements of funders and publishers, as many expressed it as an ethical principle or a value they held, a practice of importance not only to them but to the scientific research community.

BM: For me, it was interesting not only to hear about the research itself and the resources that the researchers have and don’t have, but also to learn a lot about the resources that the Temple University Libraries offer.

Again, a big thank you to the team and their willingness to share some reflections on the process. Nice work!

Posted in qualitative research, research work practice

What Counts

To count (verb): to tally, to add up, to total, to recite numerals in ascending order

To count (verb): to matter, to be considered, to be included, to have importance

I have posted multiple times in this space about metrics – the challenges and the seemingly arbitrary decisions we make when quantifying the libraries’ work for surveys like ARL, IPEDS, AAHSL, and ACRL. Given my ambivalent feelings about counting, I appreciated a recent Curious Minds interview with Deborah Stone about her book, Counting. A social scientist, Stone reminds us:

There is no such thing as a raw number.  At least in human affairs. Every number is the result of a decision about what is important; what is worth paying attention to.

Stone’s book is an exploration of the ways this plays out in the social world, with vivid examples of how seemingly precise figures for “unemployment” and “ethnicity” are replete with value judgements, arbitrary decision making, and historically based judgement about what is countable and how.  

Counting forces us to classify things, to categorize them. Because being countable is a value statement, counting is a way to exert power. We are familiar with how this impacts voting and in census-taking. To be “accountable” is to take responsibility for counting that is fair and honest.   

Libraries are big counters, and hence classifiers, asking “Is this a reference question or a directional question?” The former counts, the latter does not, at least as far as NISO (our information standards organization) has dictated. Asking a library staff person an “informational” question counts, but receiving help to find a book on the shelf or assistance with placing an item on reserve does not count as reference. Reference transactions, that special kind of service, are valued in a different way.

Yet, isn’t the service of helping patrons access resources on their own at least as valuable as the transaction that provides the answer? It seems that helping a community user log on to a computer in order to apply for a job should count for something. Our standard surveys don’t ask about those transactions. What patrons ask of library staff is changing rapidly, as are the skills required to provide those services.

The challenge is finding measures that truly gauge the value we provide to our communities. These metrics need to be applicable over time, relevant to libraries of many types, and measurable with systems (or, less reliably, people) that apply them accurately and consistently.

Stone’s final words remind us of the dangers of equating numbers with facts:

When we decide what to count, we frame an issue as surely as the painter composes a scene. Our numbers embody the concerns, priorities, and values that guide us as we decide who or what belongs in the categories we’re counting … We should count as if we’ll soon be infected by our own numbers. For in the end, what numbers do to others, they do to us as well. (Stone, D. Counting. New York: W.W. Norton, 2020)

Posted in statistics | 1 Comment

The Ways of the Teacher, Leader, and Assessment Practitioner


Ways of the teacher

This summer I am teaching a class in leadership at Drexel’s College of Computing and Informatics, a required course in their master’s program in Library and Information Science. What a thrill! To be in the role of instructor, helping soon-to-be information professionals as they contribute in new ways to their organizations – it’s an exciting challenge, but also a scary one. As that class wraps up, I am reflecting on my own practice as a teacher, leader, and practitioner of assessment. It turns out they connect in some unexpected ways.

The reflection on teaching was encouraged as part of the perfectly-timed Teaching Challenge designed and hosted by the Learning & Student Success Strategic Steering team here at Temple Libraries. It was so beneficial to join a cohort of other librarians engaged with instruction, helping me to feel less isolated in my role as an adjunct. I had much to learn from my supportive and more experienced colleagues.

The challenge encouraged me to ask hard questions: “What’s important to me as an instructor? What is my teaching philosophy?” It solidified for me the idea that I didn’t want to be the “sage on the stage.” My desired role was to facilitate a learning experience for my students in which we learned from one another.

But I also needed to consider the students’ expectations for the class. What do they need from me as an instructor? How do I balance their needs and my responsibilities to them? How best do I assess their learning needs in this asynchronous, online environment? How do I connect to them in a way that feels authentic?

I may have had more questions than answers, but these are things I’ve learned about teaching so far:

  • Take time to reflect on your practice.
  • Be prepared to learn and be prepared to change. Be humble.
  • Be respectful of others and their voices, their experiences.
  • Be available to make changes based on students’ feedback, but remember that you are the instructor – you are being paid to make hard decisions about course policies and procedures.
  • While students are responsible for their own learning, you have the responsibility to facilitate and foster that learning.

Ways of the leader

The students’ favorite part of the class was a series of “practitioner” interviews I conducted with colleagues. They hold many different roles, including deans and directors in research libraries, public libraries, IT, and organizational performance. I asked them to define management versus leadership, and to describe what they experience as opportunities and challenges in their practice as managers and leaders.

From these interviews, I learned some things too:

  • Good managers and leaders see the value of self-reflection and knowing yourself – your strengths and where you have challenges (and, of course, the willingness to work at those challenges).
  • Managers and leaders have a passionate desire to learn continuously.
  • Good managers are good listeners, cultivating and coaching their staff, oftentimes acknowledging strengths that were not recognized before.
  • Good managers are able to think outside their managerial “domain” to consider the needs and goals of the organization (and the organization’s parent institution) in a holistic way.
  • Good leaders are able to see a bigger, longer-term picture. Metaphors for the leaders’ view were expressed in terms of height, distance, and time: the “30,000-foot view”, the “long view”, the “5-10 year vision”.

Connecting to assessment practice

These activities surfaced for me many parallels between teaching and leadership. And of course good assessment practice incorporates many of the same maxims.  (Maybe these are life strategies as well,  but that’s a different blog!)

  • Curiosity: Always be asking questions. Never assume that the current way is the only way.
  • Learning: Always be seeking to improve.
  • Engagement: Always be curious about and engaged with the user experience. Users are a large part of what we’re about.
  • Self-reflection: Always be aware of your own biases. Be willing to listen to diverse voices. That diversity makes us stronger in thinking about solutions.
  • Vision: Be patient. Cultivating a culture of assessment takes time. Take the long view.

Reflecting on these connections provides me with a renewed sense of purpose as we begin the academic year. Building a culture of assessment here at Temple Libraries/Press isn’t just about ensuring we count reference transactions the same way. The practice is also one of teaching and leadership as we work with the organization, helping it to grow and support the University community in new ways.

Posted in organization culture and assessment | Leave a comment

Assessment in Uncertain Times

We assessment professionals are moving forward through uncertain territory as we adjust to new realities on our campuses, in our communities, and in our libraries. I had the privilege of talking with my colleagues about these issues by convening the ACRL Assessment Discussion Group last week as part of the online ALA annual conference.

One of the advantages of the virtual discussion, other than hosting from the comfort of my kitchen, was participation across the gamut of geography and library types. Attendance was triple that of the average in-person meeting. But there are new kinds of risks: technology failures, time zone confusion, and the challenge of facilitating a participatory discussion with a large group.

I set the stage with these prompts:

The last year has occupied us all with the many changes needed to ensure ongoing library operations during the COVID pandemic.

  • How are libraries best adapting to these new campus realities, which require us to reconsider how we provide services, how we offer access to resources, and how we manage our physical spaces?
  • What new assessment opportunities does this provide us?
  • Have circumstances changed our planned assessment projects in terms of method, access or questions to be asked? 
  • Alternatively, has our assessment program had to take a temporary back seat as we deal with other, more immediate concerns?

In six small groups we first shared with one another the current state of affairs at our libraries. As expected, this continues to be a mix, with varying degrees of open or reduced hours, service models for delivery of physical materials, and staff work-from-home policies. Some libraries continue to enforce a masking policy, others do not. In many libraries, mandates against masking policies are set externally, often by state or local governments. Many libraries have removed signage, social distancing policies, and physical barriers (like Plexiglass) at service desks. Others continue to provide more limited seating or reservation-only study space, although most libraries enforcing restricted access anticipate fewer restrictions in the fall.

Library staff expressed prevalent concerns that any expectation of a “return to normal” in the fall is unrealistic. Perhaps at the extreme end of these concerns is the worry that libraries have “lost” a generation of students — many have come to rely on services provided remotely rather than the physical spaces and in-person services provided by the library.

We discussed how the pandemic impacted our organizations and staff. In addition to changes in remote work policies and hiring freezes brought on by budgetary uncertainty, we reported an increase in early retirements and resignations.  

Of course, the pandemic has had a significant impact on transaction numbers, sometimes in surprising ways. For instance, not all libraries are seeing the increase in use of electronic resources they might have expected.  While it’s important that we don’t assume all anomalies to be caused by the pandemic, many of us have questions as to how these dips will be explained in our trend analyses — how surveys like ACRL and ARL will take this exceptional period into account. Jeannette Pierce, on the editorial board of the ACRL Academic Library Trends and Statistics survey, filled us in on adjustments made to that instrument. 
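One small way to keep pandemic-era dips from silently skewing trend analyses is to flag years with outsized year-over-year change so they can be annotated in reports. The following is a minimal sketch with hypothetical annual totals and a hypothetical 25% threshold; it is an illustration of the idea, not a method used by the ACRL or ARL surveys.

```python
# Flag years whose change from the prior year exceeds a threshold,
# so they can be annotated in trend reports rather than treated as
# ordinary data points. All figures here are hypothetical.

def flag_anomalies(counts, threshold=0.25):
    """counts: dict mapping year -> annual total.
    Returns (year, fractional_change) pairs exceeding the threshold."""
    years = sorted(counts)
    flagged = []
    for prev, curr in zip(years, years[1:]):
        change = (counts[curr] - counts[prev]) / counts[prev]
        if abs(change) > threshold:
            flagged.append((curr, round(change, 2)))
    return flagged

# Hypothetical annual transaction counts with a pandemic-year dip
counts = {2017: 90000, 2018: 95000, 2019: 93000, 2020: 31000, 2021: 52000}
print(flag_anomalies(counts))  # 2020's drop and 2021's partial rebound stand out
```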

These transitions also provide us with opportunities and challenges.

The pandemic has changed some approaches we use to conduct assessment. At the same time that we want to know more about student satisfaction with pandemic-driven changes, we have observed “survey fatigue” in both students and faculty. Our current efforts may need to be focused more on outreach and communication than “true”  assessment. We may be seeking user feedback in more ad hoc ways. In practice, Zoom has proved useful for user testing. 

The pandemic prompted many libraries to employ counters for measuring space occupancy. Software used for this purpose includes:

  • SMS Store Traffic
  • SenSource SafeSpace
  • Occuspace 

These tools also allow for public dashboards that are useful to students in locating available study space. It was noted that requiring space to be reserved in advance at the library was a barrier to some students, in this case commuter students, who think of the library as community space, a place to “be” in between classes.
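The group did not discuss how these products work internally, but the core arithmetic behind a people-counting dashboard is simple: running occupancy is cumulative entries minus cumulative exits, reported against capacity. Here is a minimal sketch with hypothetical sensor events and a hypothetical 200-seat capacity; real products such as Occuspace or SenSource expose their own APIs.

```python
# Minimal sketch of the arithmetic behind an occupancy dashboard.
# Sensor events and capacity are hypothetical.

def occupancy_series(events, capacity):
    """events: (timestamp, delta) pairs, delta = +1 for an entry,
    -1 for an exit. Returns (timestamp, occupancy, percent_full)."""
    occupancy = 0
    series = []
    for timestamp, delta in sorted(events):
        occupancy = max(0, occupancy + delta)  # clamp at zero: sensors miss events
        series.append((timestamp, occupancy, round(100 * occupancy / capacity)))
    return series

# A hypothetical morning on a 200-seat study floor
events = [("09:00", +1), ("09:05", +1), ("09:10", -1), ("09:30", +1)]
for ts, occ, pct in occupancy_series(events, capacity=200):
    print(f"{ts}: {occ} present ({pct}% full)")
```

The clamp at zero matters in practice: gate sensors drop events, so a naive running total can drift negative overnight, which is why commercial tools also reset or recalibrate periodically.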

We are closely evaluating our collections, considering the impact of shifting dollars spent for print collections into electronic format. In some cases this has led to a push for decreasing our print collection footprint. As noted above, the expected increase in usage of those e-formats has not happened across the board — is this, perhaps,  a consequence of “electronic interaction burnout”? 

The pandemic has provided multiple opportunities for us to assess the effectiveness of online instruction as well as the use of online research guides. We are monitoring the use of social media and the web site, but recognize the limitations to these counts as measures of effectiveness.  

We concluded on a positive note: In many ways, the pandemic allowed libraries to “show their stuff” —  re-working service models to accommodate safety protocols in agile ways, rapidly transitioning to electronic course reserves, digitizing special collections materials at a new pace. As we assess user acceptance of these changes, many adaptations will carry over into our regular work when we return to a “more like normal” workspace in the fall. 

Posted in conference reports | Leave a comment

LibGuide Assessment from the Ground Up

Librarian Rick Lezenby authors many LibGuides. In this guest post, Rick shares some insights about assessment and the value of listening to users as we collaborate on tools that support their instruction.

LibGuides at Temple Libraries are guides to library resources and related information skills built on the web-authoring platform from Springshare. They are used mainly as subject guides introducing degree programs and as guides created for specific courses. After a number of years of a laissez-faire approach to their look and use, the Libraries in 2017 developed detailed standards for a uniform look and purpose, based on usability testing conducted at Temple University Libraries and other institutions. Guides also go through a review process to avoid duplication with similar guides, and then a checklist review of required usability and format standards once the guide is submitted for publication. Beyond that, the content of guides continues to be left to the discretion of subject-specialist librarians.

We in the Libraries have not yet developed a good way to assess the level of satisfaction users have with these guides. Getting detailed feedback has been hard. For years, I have been creating LibGuides for subjects, topics, and courses with little feedback from users or faculty beyond “Thanks!” or “Great!” when asked directly. The daily hit counts provided by Springshare do indicate how often a guide is accessed and how many of its sub-pages are viewed, if at all. There is no tracking of where users go next. It has always been a bit of a guessing game as to what should go into a guide beyond a standard list of likely tools and general advice.

Over the summer of 2020, I had the pleasant surprise of receiving two full plates of unsolicited recommendations, one from the faculty in the Global Studies department and the other from the Political Science faculty. Both were lengthy documents full of the titles that were important to them. They also gave insight into which library resources the faculty were aware of.

In the case of Global Studies, I had created a subject guide when the department was first formed. I was now given a chance to compare my original guide, built from an outsider’s perspective, with what faculty thought should be included, independently of what I had created. Global Studies at Temple strives to be an interdisciplinary program that ranges across the Arts & Humanities and Social Sciences, using the areas of global security, economy, and cultures as touchstones. The senior capstone projects could be on just about anything situated in a global, or at least multinational, context.

Global Studies was first headed in the mid-2010s by a faculty member out of the Political Science department (my subject librarian area), with whom I had had a good working relationship for a number of years prior. In 2020, the new chair came out of the Sociology department, which had not been one of my areas at the time. A group headed by the new chair sent me a document for a proposed research guide, with specifics on each section of the guide.

Goals:  The guide should provide:

  • Resources for students (touches on all three tracks: culture, economy, security)
  • Highlights issues/themes of human security, human development, gender, race, language, cultures, terrorism, environmental concerns, international trade, international financial institutions
  • Mainstream Global South
  • Highlight source type/variety
  • Feature access to primary sources
  • Perhaps a guide on citations


  • Dictionaries, Encyclopedias 
    • Provide an explanation of “using reference sources” 
  • Handbooks/encyclopedias
    • For example, The Oxford Handbook of Global Studies

The faculty member took time to review other guides in Temple’s system and pointed out to me those that might serve as “models.” They were specific about how resources should be organized, using examples from other guides.

The advice had me looking at sources in a completely different way. Faculty in Global Studies think about the tracks in that program: culture, economy, security. They expect their students will identify the best resources in that way. They preferred listing resources as specific titles with links to the library’s catalog entries. The suggested Articles and Databases showed awareness of some resources but also a lack of knowledge of databases that could serve some purposes better than others. And the unorganized list of Great Sources of Data presented me with the challenge of organizing it.

Seeing what they liked about guides and what they wanted was probably a unique experience, almost impossible to replicate in this detail for other departments. It was their own motivation driving it. But it does suggest a framework for getting feedback from other departments.

In a similarly department-motivated effort, in the summer of 2020 I received a list from the Political Science department Chair with the title: Books TU Polisci Faculty think Undergraduates Should Read – May 2020. It was created by faculty in the midst of the pandemic, amid intensifying street protests, when there was some uncertainty about what the university would be doing going forward.

Political Science Reading List Guide

The list ran to 12 pages, mainly of political classics important to faculty, along with a section on race. At the time, the library was closed to all, so I offered to organize the list and turn it into a LibGuide with links to ebooks where possible. The titles were listed under each professor’s name, so the exercise became much like the soon-to-be-notorious practice of analyzing the bookshelves behind Zoom participants. My goal was to include a brief description of each book from available web sources. The process of putting all these titles together on a LibGuide with links was a bit mundane, but it did force me to attend in great detail to the titles and summaries of their content.

In collaborating with faculty on these guides, I acquired significant insight into how professors direct students, in a way I would not otherwise be privy to, and a way to think about how well my guides reflected that and the department overall. It made me aware that, as a librarian, my interest has been in providing resources primarily to assist with immediate projects. Faculty have longer-range goals in mind, aiming to develop students beyond the assigned essay or term paper, not necessarily tied to a semester course. Finding a way to link the two approaches in a guide requires much more communication between us.


Posted in instruction and student learning, research work practice, service assessment | Leave a comment

Discovering sources in Library Search: key takeaways from remote user interviews with history students

As a follow-up to last year’s Browse Prototyping project, Rebecca Lloyd and I conducted remote user interviews with upper-level history students in December 2020, just as the fall semester was wrapping up.

Using a semi-structured interview technique, we talked to four students to find out how they discover and use sources generally, and how they use Library Search, including how they use filters, and how they use metadata such as call number, author, and subject. All of the students were in the midst of substantial capstone projects that required finding and using at least two books. We asked them to describe their projects and to tell us about specific search strategies and tools. After the interview, we asked the students to review the Library Search interface. We were particularly interested in their use of search facets, including the new Library of Congress classification filter.

A comprehensive overview of our findings and recommendations can be found in our full report, presentation, or the recording of April’s Assessment Community of Practice. For this post, we’re focusing on a few of the observations that we found most interesting and most critical for consideration as the Libraries continue to develop discovery features in Library Search.

Students do “browse” in Library Search now, just not in the way library staff may think of browsing.

All of the students we talked to reported using simple keyword searches when looking for books on their topics. These searches usually resulted in a lot of hits, but long lists of results were not a deterrent. Rather than using filters or more specific keywords to narrow a search, the students usually scrolled through long lists of search results to find the books that were most relevant to their topic. They evaluated sources quickly; most focused on scanning titles to determine whether a source met their needs. Some mentioned looking at chapter lists or other metadata to get a sense of the book’s contents or usefulness, but title and author seemed to be the most useful indicators of whether something was worth further reading.

Library Search was only one tool used to evaluate and select relevant sources.

To find sources, the students we talked to, unsurprisingly, relied on resources beyond Library Search. More surprising was that recommendations from faculty and librarians were one of the primary ways that all of the students identified key sources, especially early in their research process. One student even reported checking with their faculty advisor about a book before deciding to use it. They also relied heavily on bibliographies from past research projects as well as their previous knowledge of key authors who had written about their topics.

Most of the students preferred print books to electronic, and found browsing the shelves a valuable experience.

Being able to access electronic sources was critical during the COVID pandemic. However, most told us that they preferred using print books in general. Three of the four told us they preferred print materials for reading, one sharing that, “I feel like I’m doing, like, professional research when I’m actually looking at a [print] book. Whereas … if I’m doing it through my screen, I often feel like I’m just doing, like, busy work for classes.”

The students also told us that they liked requesting materials from the Bookbot. While they appreciated the convenience of using an ASRS, all of the students shared stories of browsing the stacks in Paley or the fourth floor of Charles. They recognized that materials physically co-located were topically similar and felt compelled to browse nearby items when they visited the stacks.

One student shared that they only came to see the need and value for browsing the stacks once they were in more advanced history courses and doing more self-directed research. They didn’t do much shelf browsing in Paley but have found the open stacks in Charles to be very useful. “It’s underrated for me because I’m a history major, the stacks on the fourth floor [are] nice to us.” Especially when doing a comprehensive research project like a capstone, the opportunity to go to the shelf to retrieve a book and then, as one student said, “look around and [see] if anything like the other title, like, on the spine caught my interest” is valuable to students.

Conducting interviews over Zoom worked really well. 

Finally, we wanted to include some thoughts about conducting student interviews over Zoom (thank you to Katie Westbrook for her question about remote interviews during the Community of Practice!). Surprisingly, Zoom turned out to be a perfect tool for user interviews. Logistical tasks like securing a private interview location, providing directions, and setting up a laptop and recording software were suddenly unnecessary. Zoom made it easy to capture everything, including audio and video recordings and transcripts, in one place. Transcript cleanup was time-consuming, but far less so than if we’d transcribed the interviews ourselves.

From our perspective, conducting the interviews remotely mitigated the feeling of unnaturalness that can come with doing user research in a formal space. The students could talk with us from their own locations and use their own devices to show us how they used Library Search. Most noticeably, the uncomfortable feeling of watching and being watched that accompanies user testing was absent; the students shared their screens with us as they explored the Library Search interface, and we were able to easily see their screen interactions without looking over their shoulders.

Posted in instruction and student learning, qualitative research, technology use, usability, user experience | Comments Off on Discovering sources in Library Search: key takeaways from remote user interviews with history students