There’s a Blizzard but Library Assessment Slogs On

It’s good to be back home from the ALA Midwinter conference. Chicago was not only windy last weekend but also snowy and awfully cold! Well, it is winter…and librarians cannot be stopped.


The meetings are always informative, and it’s great to catch up with what other libraries are thinking about and doing in the field of library assessment.

ARL Library Assessment Forum

At the ARL-hosted Library Assessment Forum there was lots of discussion, and some fretting, about the new procedure for submitting library statistics annually as part of the university-wide IPEDS (Integrated Postsecondary Education Data System) statistics program. The concern is over instructions for library data that don’t fully align with the standardized definitions used elsewhere (ARL, for example, or COUNTER for e-usage). There was positive discussion and practical advice about how best to handle this situation (thanks to David Larsen and Elizabeth Edwards at the University of Chicago and Bob Dugan at the University of West Florida). Over time, with input from librarians to IPEDS, we can help provide clearer definitions and more meaningful data for analyzing trends with this widely used source of college data. This is one way the Library Assessment Forum supports information sharing among professionals in the library data world.

So what should we be counting?

Related to this topic of collecting meaningful statistics, Martha Kyrillidou updated us on several current ARL initiatives. After conducting an extensive listening tour with library directors, new ARL head Elliott Shore proposed that “libraries shift their assessment focus from description to prediction, from inputs to outputs, from quantity to quality.” Library directors suggested some interesting new measures that would support the case they make to their institutions for funding. How about a:

  • Collaboration index
  • Enterprise fit index
  • Cost-avoidance index (Temple Libraries’ open educational resources (OER) program would fit in nicely here.)

Library Interest in Qualitative Methods of Assessment

To balance out the numbers-oriented approach to assessment, I also attended (and convened) the ACRL Assessment Discussion Group. There is currently a good deal of interest in the use of the personas method for understanding user needs. Personas can be a way of putting a face on a user type (Ken, the tenured economics professor, or Stacy, the undergraduate art major). Grounded in real data, personas may be developed through focus groups or interviews with users; that research is compiled into a set of “archetypes,” or library user types. They can help the Library explore the user experience from multiple perspectives.

  • What path would Ken take when looking for a journal article on the library’s web site?
  • What path would Stacy take when searching for a book on architecture at the Library?

Libraries are using the persona method to develop new services and to tell compelling stories about how the Library is used. Cornell was one of the first libraries to use this method in designing its web site (http://ecommons.cornell.edu/bitstream/1813/8302/2/cul_personas_final2.pdf), but it has also been used by the University of Washington, BYU, and DePaul. Exciting.

Related to wayfinding, Florida State recently gave students ProCams to document their search for materials located in the stacks. The recordings (both video and audio) quickly exposed problems students had with navigation. For staff, it was eye-opening to see for themselves the (sometimes) utter confusion students experienced between catalog and shelf. Recognizing a problem is the first step toward making improvements.

For more information on any of these items, do not hesitate to ask!


