Students Asking Students, “How Do You Use the Library?”  

The Student Library Advisory Board (SLAB) program provides a natural opportunity for library staff to learn from students and to gather feedback about their experience with the libraries’ spaces and services. Board members, who are paid a stipend each semester, are regular library users and, through tours and meetings with staff, have above-average knowledge of library operations. For spring 2025, this year’s SLAB took on an independent project: interviewing fellow students, friends, and roommates about their experiences with the library. We anticipated that this peer-to-peer approach would yield responses that were different, and perhaps more honest, than the feedback library staff hear directly. The exercise proved useful to the students as well, giving them an expanded perspective on the library and its use.

Image from Adobe Stock

How Did We Organize the Project?

Students were given a set of detailed instructions for conducting semi-structured interviews, including a “script.” We provided tips on how to remain neutral in asking questions as well as suggestions for probes. Students were each tasked with interviewing three peers and documenting responses on a shared slide deck. Our meetings for the spring were structured around this project – with test interviews, debriefs, and opportunities to gather and share findings together. Our aim was to identify themes and, from these, develop potential recommendations to library administration for improving the student experience with library spaces and services.

Five of the six SLAB members participated, conducting a total of 16 interviews. Thanks go to these students:

  • Joseph Lendacky, Fox School of Business (Marketing)
  • Lana Lerner, College of Science and Technology (Computer Science)
  • Katelyn Man, Freshman, College of Science and Technology (Biochemistry)
  • Deyana Tabatabaei, College of Science and Technology (Genomic Medicine)
  • Sehaj Venugopal Varma, College of Liberal Arts (Neuroscience) 

What Did We Learn about Library Space Use?

A few representative responses from the interviews:

  • Goes to the library to print stuff out and check out any textbooks needed for class
  • Does not use the library and doesn’t feel like going
  • Uses the library almost every day and goes in between classes

To summarize the findings about space use:

The library is valued for its quiet spaces for study, both for individual work and as part of a group. Study rooms and individual carrels are the most popular. One student expressed discomfort with large tables that require them to sit with people they don’t know. Students also report socializing at the library. Some use the space every day; others have no interest in the library at all. The library is also used for practical necessities – the printers are popular, as are the charging stations and items placed on course reserve.

Students asked their peers about other favorite places to do school work. Dorms were cited, particularly if a printer is available, as well as Alter Hall, the J&H basement, and the Tech Center. Study spaces in classroom buildings, where students in similar classes can gather, may have appeal. The Founder’s Garden was mentioned twice, in addition to the rooftop of Mazur Hall, indicating that students enjoy studying outdoors when the weather is fine. Coffee shops and the cafeteria provide a “change of scenery” and a positive “vibe.”

As noted above, study rooms at Charles and the 24/7 space are cited as facilitating group work, with whiteboards mentioned as a useful room amenity.

What Did We Learn about Collections Use?

A few representative responses:

  • Found a paper through Google Scholar and then logged in through Temple, which redirected to the Temple library website.
  • Primarily uses Google and isn’t aware of library resources.
  • The first step they usually take is to type keywords into the library catalog and then go to sources cited in good papers. Problem: sometimes papers aren’t actually available through the catalog.
  • Mainly uses online resources; finds physical resources time-consuming and at times difficult.

Students asked their peers about their use of the library catalog and the library website, and how they located materials when doing research. To summarize:

Google or Google Scholar is the typical starting place for research. Respondents did not seem to distinguish between the library catalog and the library website more generally. There is wide variability in awareness of the library’s resources. One student described following a citation trail but expressed frustration when the full-text article was unavailable through the library catalog. Another seemed oblivious to the availability of online resources entirely.

Students are divided on format: some express a preference for print, while others enjoy the accessibility of online resources. Some find it easy to locate materials through the catalog; others find it frustrating to locate physical materials.

Instructors frequently initiate the use of library materials by providing a direct link to the catalog for course materials or by placing an item on reserve.

What Did We Learn about Library Assessment?

We did our best to ensure quality data by providing students with clear instructions and a script to guide them in conducting interviews. But our own vested interest in the project idea and its potential for wider data collection may have clouded our expectations regarding the depth of the interviews, the documentation, and students’ enthusiasm for sharing. We imagined a lively final meeting with our students, excited to discuss their findings and potential recommendations to library administration.

We learned, of course, that students are not assessment librarians. And not all data gathering projects need to be tightly controlled assessment projects, rigorous in method and analysis. We, as library staff, know more than we knew before. We have some interesting insights to consider and we have learned from our own experience with the project and how students engaged with it. And that can be enough.

Thanks to co-leads for SLAB 24-25, Justin Hill and Evan Weinstein.


Partnering with ChatGPT to Analyze User Feedback

As regular readers of this blog know, I like to explore the use of generative AI as a collaborator in my assessment work – today I experimented with using ChatGPT to categorize a set of 100 open-ended responses providing feedback on our libraries’ public programs. The analysis includes raw textual data categorization, the production of a graphic, and a “chain of thought” exchange around how ChatGPT decided what responses belonged where. “Decided” may not be the most accurate word – it implies a consciousness that AI tools do not have. But still…

The feedback form is sent via email to every registrant to our programs – not all attendees register, and the number of responses is only about 5% of actual attendance. Of those, 48% are from community members, 15% from staff, 13% from alumni, 15% from students, and 9% from faculty. It’s interesting that friends and community members provide the most feedback on our programs, even though our attendance favors undergraduate students.

The identity question is a fixed-option response and easily analyzed with Excel. The additional feedback we receive is free text, and that analysis is a little more complicated. We ask, “How did you hear about the program?”

This is where I called upon ChatGPT to assist:

Prompt: These are responses to a question about how people found out about a program. What are themes here?

Class
email
The Charles Library
Library Newsletter
From the artist
E-mail
Email
e-mail blast
Colleague

[edited]

Within seconds, ChatGPT sorted the 100 responses into categories and offered to create a visual:

How attendees learn about programs
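If we ever wanted to repeat this categorization on a larger export rather than pasting responses into the chat window, the same prompt could be scripted against the API. Below is a minimal sketch using the OpenAI Python client; the model name, file name, column header, and candidate category labels are my own assumptions for illustration, not part of the analysis described here.

```python
# Sketch only: script the same categorization step instead of using the chat UI.
# The file name, column name, category labels, and model are assumptions.
import csv
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical export of the feedback form responses.
with open("program_feedback.csv", newline="", encoding="utf-8") as f:
    responses = [row["how_heard"].strip()
                 for row in csv.DictReader(f) if row["how_heard"].strip()]

prompt = (
    "These are responses to a question about how people found out about a program. "
    "Group each response under one theme (for example, Email Communication, "
    "Academic/Classroom Channels, Word of Mouth, Other) and list the themes with "
    "the responses assigned to each.\n\n" + "\n".join(responses)
)

reply = client.chat.completions.create(
    model="gpt-4o",  # assumption; any capable chat model would work
    messages=[{"role": "user", "content": prompt}],
)
print(reply.choices[0].message.content)
```

Scripting the step would also make it easier to rerun the categorization as new feedback arrives and to compare the output against a hand-coded sample.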

How do we evaluate this output? On the surface (and with a small amount of data) it looks good. And so fast! ChatGPT also shows us how it populated the categories, allowing us to assess the scheme. Digging into the biggest category, Email Communication (42.4%), ChatGPT explains:

1. Email Communication

This is by far the most prominent theme, appearing in many forms.

  • General email: Email, Via Email, E-mail, e-mail blast, Email from Temple
  • Specific sources: Temple Libraries email, Library email, Temple email, Becca’s email, Email from Charles L. Blockson Afro-American Collection staff
  • Organizational lists: Temple List Serv, Library newsletter / AAR newsletter, OVPR, Forecast

To get additional clarification, I asked,

“Explain why e-blast is categorized as Email Communication and not Academic Communication.” ChatGPT responds:

Great question! Here’s the reasoning behind categorizing e-blast under Email Communication rather than Academic/Classroom Communication:

Why “e-blast” is considered Email Communication:

  • Definition of an e-blast: An “e-blast” typically refers to a mass email sent out to a broad audience, often through mailing list software or institutional communication tools.
  • Primary characteristic: The defining feature of an e-blast is its delivery method—email—not the content or sender’s academic affiliation.
  • Content-neutral: While it may be sent by an academic department or contain academic information, the categorization is based on how people received the information, not who authored it.
  • Consistency: Other responses like “Email from Temple,” “Temple Libraries email,” or “Library Newsletter” also fall under this category, regardless of their academic origin, because they share the same communication channel.

How Academic Communication differs:

“Academic/Classroom Channels” refers more narrowly to:

  • Learning environments (e.g., in-class announcements)
  • Direct teacher or instructor communication (e.g., “Voice teacher,” “Professor,” “Class”)
  • Departmental word-of-mouth or instruction-related messages, where the context is a classroom or academic engagement—not generalized outreach

So, even if a professor sends out an e-blast, it still fits best under Email Communication because it functions more as broad outreach than individual academic instruction or class-based communication.

Would you like a brief footnote or caption explaining this added to your chart or presentation slide?


I was impressed with this output on several fronts. If I had categorized these responses myself, it would have taken quite some time, and I probably could not have articulated my reasoning about what fit where as clearly. The distinction ChatGPT makes between Email Communication and Academic Communication – based on how the information is received – seems rather nuanced to me, and not a division I would necessarily have made. Understanding this may lead to improved branding of our communication and outreach. Perhaps our strategies for reaching our different communities should be evaluated. And here we have evidence that personal networks are effective in getting the word out.

ChatGPT never has the last word. There is opportunity to refine the prompts and extend the dialog, and the request to explain through “chain of thought reasoning” is informative. But there are takeaways from this modest example. The analysis starts a conversation with colleagues around these data, what they might mean, and potential actions we might take in response. The exercise is also a learning experience around AI tools and their increasing sophistication in analyzing text, calculating numbers, and visualizing the results in powerfully convincing ways.


Quick Poll Leads to Rapid Improvement

The University’s Office of Institutional Research and Assessment launched a new assessment tool last fall, and the Libraries was one of the first departments to try it out. It’s called a Quick Poll, and we used it to gather data on the use of study rooms at Charles.

These polls are short 1-2 question surveys that appear when students, faculty, or staff log onto their Temple portal. The visibility and ease of completion mean that participation rates are very high. Our poll was fielded between November 8 and November 22, 2024, and in that brief period we received 4954 responses! As those responses roll in, they are available immediately to the research team (myself, Caitlin Shanley, Jackie Sipes, Steven Bell, Justin Hill, and Evan Weinstein).

The expectation for quick polls is that we take quick action too – our team has already put together some next steps to improve the student experience with study rooms.

Our poll was just three questions. We first asked whether or not students use the study rooms at Charles. 

Of the 4954 students responding to the poll, about 2/3 do use the study rooms at Charles. That’s good news.  

Do you use the study rooms?  Count % 
Yes 3246 65.52% 
No 1708 34.48% 
Grand Total 4954 100.00% 

For those who do not use the study rooms, we asked why not. This was a fixed-response question, and it provides some insight into improvements we might make.

Why not? Count % 
Do not use the library 393 28.46% 
Other study spaces are sufficient for my needs 374 27.08% 
Did not know about them 217 15.71% 
Other 199 14.41% 
The reservation system is too confusing 198 14.34% 
Grand Total 1381 100.00% 

For instance, over 15% of the students responding say they don’t use the rooms because they don’t know about them. There’s a messaging opportunity. About 14% don’t use the study rooms because the reservation system is too confusing. Here’s an opportunity for Jackie Sipes, our User Experience Librarian, to collaborate with Justin Hill in Access Services to streamline the process – to the extent possible in Springshare’s reservation software.

Our third question had students tell us how they used the rooms. This information aligns with what we observe. About 43% use the rooms as part of a group, and a large portion also use the spaces for quiet study.  

Activity in Study Room (Up to 3) Count % 
Studying alone in a quiet space 1745 27.65% 
Working on class assignments with others 1496 23.70% 
Working independently alongside friends (co-working) 1211 19.19% 
Using special features in the rooms (for example, whiteboard, large monitor, electrical outlet) 794 12.58% 
Participating in online classes or meetings (for example, via Zoom) 644 10.20% 
Meeting with a student organization 207 3.28% 
Eating 191 3.03% 
Other 23 0.36% 
Grand Total 6311 100.00% 
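Because students could select up to three activities, the percentages above are shares of all 6,311 selections rather than of the 4,954 respondents. Here is a quick sketch of that arithmetic, using the counts from the table; treating the two collaborative rows together as “group” use is my own reading of the data.

```python
# Arithmetic check of the activity table. Students could select up to three
# activities, so percentages are shares of all 6,311 selections, not of the
# 4,954 respondents. Counts are copied from the table above.
activity_counts = {
    "Studying alone in a quiet space": 1745,
    "Working on class assignments with others": 1496,
    "Working independently alongside friends (co-working)": 1211,
    "Using special features in the rooms": 794,
    "Participating in online classes or meetings": 644,
    "Meeting with a student organization": 207,
    "Eating": 191,
    "Other": 23,
}

total_selections = sum(activity_counts.values())  # 6311
for activity, count in activity_counts.items():
    print(f"{activity}: {count / total_selections:.2%}")

# "Group" use combines the two collaborative rows -- about 43% of selections.
group = (activity_counts["Working on class assignments with others"]
         + activity_counts["Working independently alongside friends (co-working)"])
print(f"Group use: {group / total_selections:.2%}")
```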

As noted, the poll has provided us with solid numbers to guide us in tweaking our reservation system. In addition to improving communication and clarifying the reservation process, we will tighten up the system by reminding students of upcoming room reservations and by adding QR codes in individual study rooms to make booking easy.

These are concrete changes that should maximize our space utilization and enhance the student experience at the same time.


What Care Looks Like

I have just returned from the 2024 Library Assessment Conference in Portland, Oregon.  I came away from that meeting with a true appreciation for what care can look like when we work together as colleagues and with our communities to provide engaging learning experiences.  As I head back to my day job, I am reflecting on what it takes to cultivate such a caring space and how that can be applied to our work with students, the library organization and our broader community at Temple.    

Full Disclosure – I served as co-chair of the conference with an amazing team: Jackie Belanger (University of Washington), ARL staff Jaymey Butler and Angela Pappalardo, and the planning committee membership.  

Physical Space

Conference planners are more attuned than ever to the diverse needs that attendees may have in accessing conference spaces and programs. Many conference-goers were traveling on election day, so we anticipated the additional attention required to provide spaces (both physical and non-physical) that felt safe and supportive. Providing for the physical is straightforward. We established:

  • A Quiet Room, designated for those desiring some respite from the fray.
  • A Sensory Station, a welcome attraction with fidgets, coloring books, and other tactile experiences to support individual sensory processing needs.
  • Meeting rooms with clearly designated accessible seats set aside, with a volunteer room manager to ensure that all attendees were comfortable.
  • Screen texting of all sessions, so attendees could receive real-time text on their personal devices.
  • Readily available slides for each presentation through the conference app.
  • And something new to me: in the larger rooms, special tables designated as “Low Engagement.” These details were noticed. Not every attendee wants to be networking all of the time.

The ARL guidelines for co-creating a Community of Care are a useful model for us all.

Planning and Collaboration 

Tremendous planning contributes to the experience of participants feeling “cared for” in other ways. We started our planning work in January of 2023 (shortly after the 2022 conference) – selecting a venue that was accessible, affordable, and in a state supportive of our professional values. From the selection and structuring of a planning committee, delegation of tasks to subcommittees, program development and review, sponsorship, budgeting, and registration – the myriad details were immense. Our “run of show” for plenary sessions alone was 26 pages long (thank you, Jaymey)! Every step of the way, attention to detail and intentional design contributed to a positive user experience. Equally important was the willingness to flex and change course when things did not go as planned.

Our planning team became a well-oiled machine. Not a single email was sent to attendees, nor a webpage published, without some level of consultation across the leadership team. Doing this effectively requires a good deal of trust and goodwill, as we were continually suggesting rephrasing of our messaging. We counted on one another to be responsive and timely with these suggestions – another essential aspect of working together.

Lessons Learned

In planning our physical spaces here in the libraries, we do lots to support student wellbeing. Our study rooms provide space for quiet work, as well as gathering. We are generous in our support of students during potentially stressful times, providing puzzles, snacks and therapy dogs. Through programs like the Student Library Advisory Board and user experience activity, we build mechanisms for connecting and listening to students about their needs.

At the organizational level, how do we develop that community of care? What I learned in planning and implementing this conference: creating a community of care is not just a policy, words on a webpage, or space accommodations. It is a practice that takes tremendous work, planning, attention to (often tedious) detail, and trust in collaboration across a team. It’s a value I hope to practice even more now here at home.

These flowers welcomed me upon arrival at my hotel in Portland.


On Citizen Science and Library Assessment

I first got hooked on watching birds when living on Peaks Island, Maine. One morning, as I sat at my kitchen table gazing out a window to the back yard, a flock of cedar waxwings settled onto a tree. I never saw those birds again, but I was captivated by the moment and its natural beauty.

My back yard in Syracuse butted up against a waterway too – the Erie Canal. We installed multiple bird feeders, allowing me to participate in my first “citizen science” project out of Cornell’s Lab of Ornithology. For an hour or so a week, volunteers count every bird that visits the feeders, submitting the identifications and numbers to the project. The instructions are easy to follow but strict: “Report the highest number of birds seen at a single time. Don’t count a bird twice.” Contributors submit data from all over the country. Not only is this vast corpus of sighting data a treasure for ornithologists, it is also a great way to engage many people in science, in thinking differently about climate, and in paying attention to the natural world.

Bird watching in the city is quite different. I’ve started up again here and signed up to participate in Cornell’s eBird project. Again, the instructions for submitting checklists (birding experiences) are quite strict, with detailed guidance on how to report bird sightings. And all data submitted is subject to review for accuracy, checked to make sure that the identifications, time of year, and location make sense.

What is the connection with library assessment? We collect data too – the more the better. But it needs to be thorough and accurate.

Last week Olivia Given Castello and I provided training to staff in the Blockson Collection and the Special Collections Research Center on the use of the data entry form for tracking reference transactions. As part of the training, we reviewed the definitions for “reference”, “consultation” and “referral.” Josue Hurtado and I will monitor the data entry for the first month or so, ensuring that we have a common understanding of the definitions and documentation process.

A big thank you to all the staff at Blockson and SCRC for contributing to this work. We are THRILLED to have their data standardized and available as part of the larger LibInsight dataset. Like the eBird and Feeder Watch projects, having reliable data from a broad set of contributors improves the opportunity for analysis of trends in reference. The data will be put to good use by Temple libraries as well as other libraries and organizations (ARL and ACRL) to better understand patterns and changes in reference – a core library service.

Ezra likes bird watching too.

We’re Getting Social

Reaching out to students where they are is a critical part of our communication and outreach program, and our new assistant director for editorial, Ella Lathan, is an expert at creatively using social media platforms to connect with our community. She and student worker Nika Svirinavichyus are really paying attention to our numbers too. I asked Ella to share with me how they are using this data to assess the effectiveness of our social media campaigns.

Nancy: Tell us about the social media landscape here at TULUP (Temple University Libraries and University Press). What are the platforms we post to?

Ella: We currently maintain an active presence on several popular social media platforms, including Facebook, Instagram, LinkedIn, X (formerly known as Twitter), and we have recently launched a TikTok account. Our strategic approach involves leveraging a diverse range of social media channels to effectively disseminate information and engage with both students and professionals within our community. 

Nancy: I know that you recently compiled the usage data from these. Can you tell us something about what you learned? What trends are you seeing?  What is popular and what does not get so much attention? 

Ella: The recent compilation of usage data from these platforms has provided valuable insights into emerging trends and audience preferences. One notable observation is the increasing popularity of video content, particularly on platforms like TikTok. The most successful posts on our social media accounts tend to be collaborative efforts with other Temple University accounts, such as @Templeuniv, @templedss, and @TUsustainability. Additionally, our audience demonstrates a strong affinity for digital archive photos, especially those featuring nostalgic or historically significant events, such as an eclipse that occurred in Philadelphia in the 1920s. Interestingly, posts featuring pets also garner significant engagement on our Instagram account. Conversely, posts promoting specific events tend to receive comparatively less attention, which is unfortunate given the potential interest in these workshops. 

Solar eclipse Instagram post, 128 likes

Nancy: What are the challenges in analyzing this data? For instance, is it difficult to collect, or hard to interpret? 

Ella: Most social media platforms have extensive data tracking tools that go well beyond likes and reposts! Facebook and Instagram offer multiple metrics, such as a post’s reach and the number of clicks from the post to our profile. Our main challenge in analyzing this data is understanding the algorithm and our followers. Posts are promoted by the algorithm depending on the time of the post and other factors, and it can be tricky to understand how the algorithm works and ensure our posts have reach. We also wish there were more information on what our followers enjoy, to make sure we are providing relevant and helpful content for them.

Nancy: This is such an ever-changing space. How do you keep up with what’s going on, and most importantly, where students are in the space and how we can be most effective in connecting with them there?   

Ella: In this rapidly evolving digital landscape, staying abreast of the latest developments and trends is crucial. We diligently monitor the platforms where our students are most active and strive to establish an effective presence there. Collaborating with other Temple University accounts has proven to be an effective strategy for expanding our reach and connecting with students.

Climate series X post, 2385 views

Furthermore, we continuously research and analyze the interests and preferences of our target audience, as well as the content strategies employed by other successful accounts, to inform and refine our approach. 

Nancy: It’s a challenge, particularly as our target audience is bombarded with so much information, as well as very cute puppy pictures. Thanks very much for these insights, and for all the work that you and Nika do to make sure students are aware of our resources, wherever they are.

Cute dog Instagram post, 131 likes

An Assessment Librarian Converses with ChatGPT

Inspired by Hyangeun (Jenny) Ji’s workshop out of the Scholars Studio yesterday on How to Talk with AI: Prompt Engineering, I continued my exploration of how generative AI tools might support my work in data analysis and assessment. Jenny’s lesson #1: The best way to learn about AI is to experiment, and her tips and guidance provided just the motivation to do so.

We need to understand trends in the use of the Charles Library. We see thousands of students come through the door each day. But our knowledge of day, time, and demographics can be anecdotal. We do get weekly gate reports, .csv files that include date, time, college, and student level. In this experiment, I asked ChatGPT to analyze two gate reports from a busy week in February (February 19-25, 2024 and February 20-26, 2023). My starting instructions: “Analyze these two csv files and then I will ask questions of you to compare the data in each.” 

ChatGPT proceeds to output a data overview of each file, detailing each column name, such as date, time, academic level and college, and entrance used. It’s rather dry and technical, just raw numbers. But kindly, ChatGPT suggests,

“If you have specific questions or analyses you’d like to perform on these datasets, please let me know! “

What a nice invitation! I did have some questions. I asked ChatGPT to compare traffic by day of the week, by time of day, and also by college. Since I like visualizations, I asked for this data to be shown as a bar graph. ChatGPT complied:

I’m impressed. But the analysis is not perfect. For instance, there is no school of Theology at Temple, nor is there a Creative Arts school. ChatGPT is using a two-letter code and predicting what it might stand for. But still… 
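One way to keep ChatGPT honest is to run the same tallies ourselves. Here is a minimal pandas sketch of that cross-check; the file names and column names (Date, College) are assumptions about the gate report export, not the actual layout of our reports.

```python
# Sketch only: a local cross-check of ChatGPT's comparison using pandas.
# File names and column names are assumptions about the gate report format.
import pandas as pd

feb_2023 = pd.read_csv("gate_report_2023-02-20.csv", parse_dates=["Date"])
feb_2024 = pd.read_csv("gate_report_2024-02-19.csv", parse_dates=["Date"])

def summarize(df: pd.DataFrame, label: str) -> None:
    """Tally entrances by day of week and by college for one week of data."""
    print(f"--- {label} ---")
    print(df["Date"].dt.day_name().value_counts())   # traffic by day of week
    print(df["College"].value_counts().head(10))      # top colleges by entries

summarize(feb_2023, "Feb 20-26, 2023")
summarize(feb_2024, "Feb 19-25, 2024")
```

Comparing these raw counts against ChatGPT’s bar graph is a simple way to spot mislabeled categories, like the invented Theology school.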

ChatGPT as Consultant

What would it look like to have an external data “consultant” review this data and provide their perspective? I asked ChatGPT to take on the consultant “persona” and provide me, an academic library administrator, with some insights.  Here’s the result:

  • Resource Allocation:

The clear peak usage times around midday (10 AM to 2 PM) suggest a need for optimal staffing and resource allocation during these hours to ensure efficient operations and user satisfaction. Consider scheduling more staff or offering additional support services during these peak times.   

  • Space Utilization:

With weekdays experiencing significantly higher traffic than weekends, consider evaluating the effectiveness of the space and resources available during these peak days. This could also be an opportunity to promote the library as a study space during less busy times, like weekends, to even out usage.   

  • Targeted Programs and Services

The data shows varying access patterns among different colleges. Tailoring programs or workshops to the needs and schedules of students from colleges with the highest foot traffic (e.g., Science and Technology, Liberal Arts) could improve engagement and utilization of library resources.  

For colleges with increasing access year over year, such as Engineering and Medicine, consider collaborating with faculty to offer specialized resources or services that support these disciplines.

  • Security and Access Control:

The analysis did not directly address security, but consistent monitoring of access patterns can help in identifying any irregularities or security concerns. Ensuring that access control systems are up to date and conducting regular reviews of access logs can enhance security measures.

  • Observations:

The significant access by the Science and Technology college, particularly its increase from 2023 to 2024, suggests a high engagement level that might correlate with specific resource needs or study patterns. This could warrant a closer look to ensure resources meet demand.

The decrease in accesses for the Health Professions and Liberal Arts colleges might reflect changes in enrollment, curriculum delivery methods (e.g., more online resources), or shifts in study habits. It would be useful to investigate these trends further to adapt library services accordingly.

The notable increase in Medicine and slight increase in University College suggest growing areas of academic focus or expansion. These trends could guide future resource acquisitions or service expansions.

Human Reflections

The analysis provided by ChatGPT is quite thorough, even insightful, given the limited quantity of data provided. Imagine what we might learn with more numbers, or the efficiencies we might gain by refining these queries. Of course we need to be concerned about replicating our analysis, and hallucinations can be problematic. In this case, ChatGPT (4) allows me to peek under the hood by viewing the Python code used for the analysis. I can also share the query and response with others through a QR code or link. But I like how ChatGPT presented its analysis in an organized and usable style that is easily shared with others, like colleagues in access services and facilities. What a wonderful way to initiate a conversation with human stakeholders.

I incorporated another hint from Jenny, one based on research. She suggests we use polite, respectful and encouraging language when conversing with ChatGPT. While this can actually provide better responses, it also created a relationship that felt like a partnership as we worked through the analysis together. The conversation became a collaboration, and that tight division between human and AI seemed less obvious to me.

Thank you to Hyangeun and to all the library staff who provide guidance (through workshops and consultations) on best practices as we explore and understand AI tools in our work.


A Better Understanding of Library Work Effort

Happy New Year! What a year we had in 2023. Tumultuous. I predict that 2024 will be equally filled with change and uncertainty. Before we move on to those new challenges, a brief update on an important assessment effort conducted last year, the Strategic Resources Review.

The Review was undertaken to get a clearer sense of how staff, in the aggregate, are spending time towards providing library services.

  • How many staff are engaged in work related to collection management?
  • Preparing for instruction?  
  • What percentage of our time is spent with administrative work?
  • Are there ways in which we should be shifting our efforts towards a different future of work? For instance, do we need more time to spend on licensing electronic resources? More time developing workshops for students and faculty on the use of artificial intelligence tools?  

Our aim was to systematically document how time was being spent and on what types of activities. This descriptive picture has the potential to benchmark current work effort as we identify emerging areas of service to guide future investment.

The approach  

Understanding how staff spend time is complicated. We might have asked everyone to document each hour of each day’s work for a year! Instead, we asked each department and unit head to provide a list of all activities and services provided by their unit and the percentage of time (over a year) that staff, in the aggregate, spend on this work. For each department, the total work effort should add up to 100%.

All this was compiled into a large, shared spreadsheet (293 rows), with percentages multiplied by the number of staff in each unit – amounting to a rough calculation of how much effort we expend, as an organization, on services. To strengthen the analysis, we asked about service audience (internal, external) and whether work in that area was increasing or decreasing. We also asked how trends were documented (for instance, by tracking transactions).
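To make the weighting concrete: a hypothetical unit of six staff reporting 25% of its time on cataloging contributes 1.5 FTE-equivalents of cataloging effort to the organization-wide total. Here is a small sketch of that roll-up, with unit names and figures invented purely for illustration.

```python
# Sketch only: each unit's reported percentage of time on an activity is
# multiplied by its headcount to give FTE-equivalents, then summed across
# units. Unit names and figures are invented for illustration.
unit_reports = [
    {"unit": "Unit A", "staff": 6, "activity": "Cataloging", "pct": 0.25},
    {"unit": "Unit A", "staff": 6, "activity": "Administration", "pct": 0.15},
    {"unit": "Unit B", "staff": 4, "activity": "Cataloging", "pct": 0.10},
    {"unit": "Unit B", "staff": 4, "activity": "Instruction", "pct": 0.40},
]

effort_by_activity: dict[str, float] = {}
for row in unit_reports:
    fte = row["staff"] * row["pct"]  # e.g., 6 staff spending 25% of time = 1.5 FTE
    effort_by_activity[row["activity"]] = effort_by_activity.get(row["activity"], 0.0) + fte

for activity, fte in sorted(effort_by_activity.items(), key=lambda item: -item[1]):
    print(f"{activity}: {fte:.1f} FTE-equivalents")
```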

We designated this descriptive phase of the review our Services and Work Effort Catalog. Admittedly subjective and broad, this kind of analysis allows us to see, at a high level, how we use our time.

Early findings 

Findings from the effort inventory were not too surprising but describe in a new way how we are currently investing in categories of service at the libraries.

For instance,  

  • 1/3 of our activity is devoted to “Resource Access” – this includes circulation, cataloging, acquisition of new materials and stacks management. 
  • Almost 1/5 relates to Administration – this might be HR functions, committee and professional service activity, or time spent with internal communications. 
  • We learned that for many services, we’re just not tracking our time investment in a consistent, systematic way. This makes sense. It’s easy to count our instruction sessions, less straightforward to calculate time spent preparing that instruction.

While this descriptive exercise was interesting, there are bigger, unanswered questions. Does this allocation of effort reflect our values as an organization? Is the effort aligned with the impact we want to have in support of the work of students, faculty, and the broader community? And how do we measure that value and impact? Perhaps our biggest takeaway is the difficulty of quantifying our work in this way – that “return on investment” and true productivity and value are more than numbers calculated with a formula.



Asking for help from ChatGPT

I’ve been exploring prompt engineering this week in my efforts to learn more about the uses of AI – for personal interests and professional work. It connects to our thinking about the use of AI tools for instruction and by students – a recent study reported in Inside Higher Ed indicated that college students are far more likely to use AI than their instructors. Temple is ahead of the game in developing policies for student use of AI. But what does “use” mean exactly?

ChatGPT can do far more than write homework assignments. My early learning in prompt “engineering” through Coursera gets into these complexities and possibilities. I highly recommend it.

In learning about the Chain of Thought prompt, I asked ChatGPT to “Provide steps for a novice viola player to take in practicing a part in a chamber ensemble. Break out the practice routine into simple steps for 30 minutes of daily practice.”

The prompt yielded this output:

Practicing a part for a chamber ensemble on the viola requires a structured and focused routine. Here’s a step-by-step practice routine broken down into simple 30-minute sessions for a novice viola player: 

Day 1: Establishing the Foundation (30 minutes) 

  • Warm-up (5 minutes): Start with some basic warm-up exercises to get your fingers and muscles ready. Practice scales or simple finger exercises to improve dexterity. 
  • Review the Sheet Music (5 minutes): Spend a few minutes looking over the sheet music for your part. Familiarize yourself with the key signature, time signature, and any challenging passages. 
  • Basic Technique (10 minutes): Focus on the fundamental techniques such as bowing, finger placement, and intonation. Play through the section slowly, paying attention to correct fingerings and bowing direction. 
  • Rhythm and Timing (5 minutes): Work on the rhythm of your part. Use a metronome to ensure you’re playing in time. Start at a slow tempo and gradually increase it as you become more comfortable. 
  • Play Through (5 minutes): Play through the entire part at a slow tempo, concentrating on maintaining proper technique and rhythm. 

Day 2-4: Refining Technique (30 minutes each day) 

  • Warm-up (5 minutes): Begin with warm-up exercises to prepare your fingers and muscles. 
  • Isolated Practice (15 minutes): Break your part into smaller sections or phrases. Work on one section at a time, paying close attention to intonation, bowing, and dynamics. Repeat each section until you’re comfortable. 
  • Integrate with Metronome (5 minutes): Combine the sections you’ve practiced with a metronome to improve your overall timing and coordination. 
  • Play Through (5 minutes): Play through the entire part at a slow to moderate tempo, focusing on smooth transitions between sections and maintaining correct technique. 

Day 5-6: Musical Expression (30 minutes each day) 

  • Warm-up (5 minutes): Start with your usual warm-up routine. 
  • Expression and Dynamics (15 minutes): Pay attention to the musicality of your part. Experiment with dynamics, phrasing, and articulation to make your playing more expressive. Try different bowing techniques to achieve the desired musical effect. 
  • Slow Practice (5 minutes): Play your part at a slower tempo, concentrating on expressive playing. Use vibrato and bow control to add depth to your performance. 
  • Play Through (5 minutes): Play through the entire part, focusing on musical expression and dynamics. Try to convey the emotions and nuances of the music. 

Day 7: Putting It All Together (30 minutes) 

  • Warm-up (5 minutes): Begin with your warm-up exercises. 
  • Full Tempo (15 minutes): Play your part at or close to the intended performance tempo. Focus on maintaining all the techniques and expressions you’ve practiced. 
  • Play Through (10 minutes): Play through the entire part multiple times, gradually increasing your confidence and fluency. 

Remember to take short breaks between practice sessions to prevent fatigue. Consistent daily practice and attention to detail will help you prepare your viola part for a chamber ensemble effectively. Adjust the practice routine as needed to accommodate your progress and specific challenges in the music. 

Citation: ChatGPT 3.5, November 1, 2023

This program made a lot of sense to me – I’d love to hear from other musicians about it.

In addition to being a novice viola player, I enjoy photography. I took lots of pictures when traveling in Ireland, and I particularly liked this one:

I asked ChatGPT, “How could I make this photograph more impactful?” and again, the bot’s suggestions are worth considering:

The photograph you’ve shared captures a serene urban scene with beautiful natural reflections on the water. Here are some suggestions to make the photograph more impactful: 

  • Cropping: Consider a tighter crop to eliminate unnecessary distractions. For example, cropping out some of the sky can place more emphasis on the reflections in the water. 
  • Enhancing Colors: Increase the saturation slightly to make the colors pop, especially the golden hues from the sunlight and the greens from the trees. 
  • Contrast and Brightness: Adjusting the contrast can help to make the image look more dynamic. Brightening specific areas can also help draw the viewer’s attention to key elements. 
  • Highlighting Reflections: Emphasize the reflections in the water by increasing clarity or sharpness. This will add depth to the photo. 
  • Leading Lines: The railing in the foreground is a strong leading line. You might want to ensure it leads the viewer’s eyes towards the main subjects, such as the bridge or the line of buildings. 
  • Sky Enhancements: If the sky appears too bland, consider using gradient filters to darken the top part, bringing out more details in the clouds. 
  • Vignetting: Add a slight vignette to the edges of the photo. This darkens the corners slightly and can help draw the viewer’s eye to the center of the image. 
  • Remove Distractions: Consider removing any small distractions using a tool like the clone stamp. For example, any litter on the pavement or small objects floating on the water. 
  • Experiment with Black and White: Sometimes, turning a photo black and white can give it a timeless, classic feel, emphasizing shapes, lines, and contrasts. 
  • Add a Focal Point: While the scene is beautiful, adding a focal point such as a solitary person on the bridge or a boat on the water might create a story or theme for the viewer to latch onto. 

Remember, the best edits will often depend on the mood or emotion you want to convey with the photograph. Always keep your intended audience and the purpose of the photograph in mind when making edits. 

Citation: ChatGPT 3.5, November 1, 2023

Finally, I asked ChatGPT, “What famous paintings are similar to this picture in terms of color and content?” While I’m not familiar with all these paintings, the suggestions seemed quite useful, providing a helpful starting place for further exploration.

The photograph you’ve shared showcases an urban scene with buildings, a bridge, calm water with reflections, and a rich golden hue, reminiscent of the warm lighting found during the golden hour. Here are some famous paintings that share similarities in terms of color palette and content: 

  • “The Bridge at Argenteuil” by Claude Monet: Monet painted many scenes featuring reflections on water, bridges, and buildings. The colors and tranquility of this specific painting resemble the mood captured in your photograph. 
  • “Boulevard Montmartre at Sunset” by Camille Pissarro: The warm colors and urban setting of this painting align with the mood and content of your photograph. 
  • “Houses of Parliament, Sunset” by Claude Monet: The golden and warm hues of the setting sun, along with the reflection on the water and the silhouette of a city, have similarities to the photo you’ve shared. 
  • “View of Delft” by Johannes Vermeer: Although the color palette is cooler, the focus on buildings, water, and reflection has parallels to your image. 
  • “The Starry Night Over the Rhone” by Vincent van Gogh: This painting has a night scene with reflections on water and buildings on the side. The colors are richer and darker but evoke a similar tranquil mood. 
  • “Evening on Karl Johan” by Edvard Munch: The urban setting, with a play of light and shadow, creates an atmosphere that can be related to your photograph. 

While these paintings might not be exact matches, they each share certain elements of color, content, or mood that resonate with your photograph. If you’re interested, you might want to explore these works further to see how artists have historically captured similar themes and scenes. 

Citation: ChatGPT 3.5, November 1, 2023

These examples are just for fun, but as we consider our policies related to the use of generative AI in higher education, it’s easy to see how challenging and complex the determination of “use” might be. For what purpose might we use AI, and at what stage of the research or creative process? We would never discourage a student from asking their human instructor for assistance with best approaches to practice, improvements to a draft or creative work, or recommendations for related content to explore. And what an opportunity for learning – to learn how best to mine large language models, ask better questions, and critique the results.


Designing Libraries: Finding New Adjacencies to Improve Experience

The ways in which we use libraries are changing. How we conduct research. How we use (and don’t use) print collections to seek information. How we design our physical spaces to welcome students and create a sense of belonging for diverse communities. A theme throughout the Designing Libraries conference last week was how situating services and spaces together and creating unexpected aesthetic experiences provide both convenience and a welcoming environment for students today. Browsing the book stacks is no longer what drives traffic to the library.

The conference was held at the University of Arizona’s sprawling Tucson campus. Imagine a Student Success District comprising three adjacent buildings over nine acres! (Well, space is different in the West.)

Designing the exterior landscape that connects those buildings presented many engineering and aesthetic challenges, according to keynote speaker Christine Ten Eyck. In Tucson, providing shade is necessary, and the plants must be able to thrive in the hot, arid climate – with little maintenance. The landscape design is inspired by the city’s surrounding arroyos – meandering paths for water flow – here providing an urban oasis that attracts both students and wildlife. Ten Eyck’s themes of aesthetics and nature ran throughout the conference. Others spoke to how outdoor landscaping and indoor plants contribute to students’ sense of well-being.

Another example of situating aesthetics in new ways to support student wellness and belonging is the Artist in Residence Program at Georgia Tech. This is a school that is predominantly graduate-level STEM (Science, Technology, Engineering & Math) – disciplines not frequently associated with the arts. Artist residents are asked to engage the community in an artistic and aesthetic exploration of the role of libraries in society. The project, now in its third year, supports artists and, as importantly, engages students in an experience that encourages them to think about art in new ways.

A centerpiece for the conference was the University’s Student Success District. The main library is part of the district alongside technology help (and lending), career services, academic support and tutoring, study rooms, a data visualization lab and a space for makers.

CATalyst Maker Space at the Student Success District
Technology tools to borrow

While the luxury of space provided by three buildings is not something we all can afford, the Success District also demonstrates creative thinking and attention to student needs made possible by collaboration among multiple administrative organizations. These services are not all under the control of the library! But the shared goal of providing a welcoming space that supports student learning in a variety of ways – that can be replicated by all of us.

The convenience and ready access to a variety of services is not limited to books, technology, or academic support. Food is at the ready too, with these darling robots that travel autonomously throughout campus to deliver food and drink – a service that works better when there is sunshine 350 days out of the year.

Student pets robot. Credit: Chris Richards/University of Arizona
