Gathering Feedback on the Library’s Public Programs

The latest Assessment in the Real World discussion focused on the feedback survey that goes out to attendees at the Libraries’ Beyond the Page public programs. Eleven library staff members joined Nicole Restaino (Manager, Library Communications & Public Programming) to discuss the survey, its results, and the challenges of developing an assessment tool that collects feedback for actionable, data-driven decision-making.

Nicole wants to learn several things about who attends the programs:

  • Who are they? Undergraduate or graduate student? Faculty? Staff? Community members?
  • Why do they come? Do they receive class credit? Is it a topic that interests them or have they heard of the speaker?
  • What time of day works best for their schedule?
  • How do they learn of the program? From their instructor? A listserv? A poster at the Library?

This last question is important, since it helps Nicole decide where to post announcements, which formats work for which audiences, and how to optimize her limited “marketing” budget. If students hear about a program on the radio, for example, then it makes sense to pay for spots there.

The current data (1,000+ responses going back to Fall 2012) indicates that 62% of our attendees are undergraduates, that 31% learn about a library program from their instructor, and that 39% attend because they receive extra credit or are required to attend for class. This is not unexpected, since Nicole works with faculty, particularly in Gen Ed and at Tyler, to develop programs that support the curriculum.


Getting a good return rate on the survey is a challenge, and we talked about ways to increase the response rate. The personal “touch” is effective at Blockson, where the short survey is passed out by hand. That space is also more confined, and the audience is typically more community-based, factors that might also lead to a better feedback response. What if we added an online form as a way of gathering feedback? The response rate after instruction workshops is excellent, although those students are in a classroom environment and may feel more incentive to complete the evaluation. Would we get a better response rate if the form were shorter?

We talked about which questions work best and which ones don’t lead to actionable data. And since learning is an important goal for these programs, we’ve added a question about what attendees learned.

Compiling and categorizing the paper surveys is time-consuming, but it provides us with a story to tell that’s backed by numbers. Of the 1,000+ respondents, over 96% would attend another program at Temple University Libraries. That’s popular programming.

