Developing an organization-wide culture of assessment is a slow, incremental process. I get frustrated by this fact at times, particularly when I imagine a perfect world of systematic, routine plans put into place for assessing and demonstrating the value of all we do here in the library and press – supporting faculty, students, and community. But there is more going on at Temple than first meets the eye: examples of assessment and program evaluation that, taken together, are pushing us toward a culture of continuous improvement.
For example, we’ve been using a paper form for collecting feedback on our public programs for years. This week I met with Kaitlyn Semborski, Sara Wilson, and Steven Bell to discuss other ways of gathering feedback: creating an online form and tweaking our questions to get a better sense of how well the programs meet the needs of our many patron types, from students to friends in the community. We also talked about “tracking back” via email with instructors who send their students to these programs, to learn how effective the programs are for their curricular needs. This will be essential information as we shape the programs going forward.
I met with Ginsburg Instruction Librarian Natalie Tagge last week to brainstorm the assessment of instruction provided to first-year medical students. The information literacy component uses a flipped classroom, video, and a pre-test, and the Dean was enthusiastic about last year’s efforts. This year Natalie intends to build on that success by using a rubric to evaluate student presentations as students (we hope) demonstrate their grasp of the material. It’s a good method because it allows us to assess learning outcomes unobtrusively, placing no undue burden on faculty or students.
RIS librarians are also having conversations about more systematically documenting the work they do with instruction and research consultations: tracking back to faculty, recording repeat customers, and ensuring that we capture more detailed information about research consultations. We know that numbers do not tell the whole story. Anne Harlow points out that real “effectiveness” is when students are able to locate quality information on their own, without the intervention of library staff. This means that we may be supporting students even more, yet our transaction numbers may go down. But engaging in the conversation about meaningful metrics is a good start.
A final example: at Temple we collaborate across departments in many ways, and planning the orientation of student workers for public services desks is no exception. Staff from Access & Media Services, Library Administration, Reference & Instruction, and Library Technology Services, coordinated by Katerina Montaniel, worked hard to organize this year’s orientation. The effort was strengthened even more by the inclusion of three students in the planning. As it turned out, the student-produced video offering “tips” for working a desk was a highlight – it was informative, clever, and funny. Another change to orientation this year was a simple assessment: an evaluation form entitled “Please Help Us Continue to Improve.” On a scale of 1 to 5 (with 5 being the best), 30 students weighed in with an average score of 4.48. Well done!
Including students in the planning of the orientation made a huge difference in developing a sound, but also fun, training session for our student workers. What else did we learn? Many students didn’t realize they have ready access to a panic button when working a public desk – important information. And going forward, staff will know even more about which topics to emphasize to engage student workers in the essential aspects of work at the public service desks.
So although much of the work of assessment flies under the radar, it’s actually happening in a big way. Even more importantly, a growing number of staff members from throughout the organization are thinking about assessment and engaging in the process. Thanks for letting me brag about it!