Assessment in the Real World

This post was made possible by the excellent notes and input from Laura Chance, temporary art librarian at Paley. Thanks, Laura!

Last Friday, 12 librarians gathered to talk about feedback received from students participating in the Analytical Reading and Writing library workshops. Our “text” for analysis consisted of several thousand comments in response to questions, including:

  • What is one new thing you learned in today’s workshop?
  • Which part(s) of the research process are you still nervous about?

The session was the first in a series of “real world assessment” meetings for library staff, facilitated by Caitlin Shanley and Jackie Sipes and designed to give staff practice with a hands-on assessment exercise in a group setting.

The AR&W workshops provide multiple opportunities for learning assessment, from the clicker responses, to the worksheets students complete during the session, to the online feedback form students submit at the end of two instruction sessions. The data generated via this form was the topic of the Real World Assessment workshop with librarians.

At the beginning of the year, Caitlin and Jackie tasked themselves with evaluating the existing feedback form, with an eye towards getting beyond the “Did you like us?” questions and probing the more complex question of what learning takes place through the instruction workshops. They considered the essential learning outcome for the class – “How do you evaluate the credibility of a source?” Ideally, the assessment process should help us understand whether students are grasping that concept.

The discussion touched on the challenges of learning assessment.

  • Is it meaningful when students can essentially regurgitate information shared with them during the session immediately prior?
  • Could there be a pre- and post- test?

Recognizing that the assessment of learning is difficult to do immediately after a session, we asked:

  • How do we measure the impact of our workshops outside the workshop itself?
  • What if we could look at the bibliographies they include as part of their papers? A challenge is that instructors evaluate the papers differently, depending on their own vision for the goals of the class. The first-year adjuncts grade papers as part of a peer process – what if we could sit in on these sessions as they talk about normalizing the paper grades?
  • What are the ethics (privacy, sampling, providing differential services) for students who participate in studies like these?

Designing the Assessment Instrument

It also became clear that the assessment instrument dictates the kind of analysis that can be conducted. The feedback consisted of quantifiable data (limited-option questions) and more qualitative, free-text data. Analyzing free text can be more difficult, and quantifying text with a tool like Voyant can be interesting but may not be meaningful since, again, students tend to repeat back the language provided by the instructor.

Our session generated more questions than answers, but the discussion brought home important issues for those of us engaged in designing an assessment of learning. We learned, in a practical way, that:

1. Assessment is an iterative process.

2. It can be hard to know how best to approach it until you gather the data and look at what you get.

3. Defining what you want to learn is essential before you begin, but that definition may differ depending on your role and/or how you plan to use the data you collect.

One way or the other, it was a fun and useful meeting – and a great inaugural workshop for the Assessment in the Real World series.

Posted in instruction and student learning

There’s a Blizzard but Library Assessment Slogs On

It’s good to be back home from the ALA Midwinter conference. Chicago was not only windy last weekend but also snowy and awfully cold! Well, it is winter…and librarians cannot be stopped.


The meetings are always informative and it’s great to catch up with what other libraries are thinking about and doing in the field of library assessment.

ARL Library Assessment Forum

At the ARL-hosted Library Assessment Forum there was lots of discussion, and some fretting, about the new procedure for submitting library statistics annually as part of the university-wide IPEDS (Integrated Postsecondary Education Data System) statistics program. The concern is over instructions for library data that don’t fully align with the standardized definitions used elsewhere (ARL, for example, or COUNTER for e-usage). There was positive discussion and practical advice about how best to handle this situation (thanks to David Larsen and Elizabeth Edwards at the University of Chicago and Bob Dugan at the University of West Florida). Over time, and with input from librarians, IPEDS should develop clearer definitions and yield more meaningful data for analyzing trends with this widely used source of college data. This is one way the Library Assessment Forum provides for information sharing among professionals in the library data world.

So what should we be counting?

Related to this topic of collecting meaningful statistics, Martha Kyrillidou updated us on several current ARL initiatives. After conducting an extensive listening tour with library directors, new ARL head Elliott Shore proposed that “libraries shift their assessment focus from description to prediction, from inputs to outputs, from quantity to quality.” Library directors suggested some interesting new measures that would support the case they make to their institutions for funding. How about a:

  • Collaboration index
  • Enterprise fit index
  • Cost-avoidance index (Temple Libraries’ open educational resources (OER) program would fit in nicely here.)

Library Interest in Qualitative Methods of Assessment

To balance out the numbers-oriented approach to assessment, I also attended (and convened) the ACRL Assessment Discussion Group. There is currently a good deal of interest in the use of the personas method for understanding user needs. Personas are a way of putting a face on a user type (Ken, the tenured economics professor, or Stacy, the undergraduate art major). Grounded in real data, personas may be developed through focus groups or interviews with users; that research is compiled into a set of “archetypes,” or library user types. They can help the Library explore the user experience from multiple perspectives.

  • What path would Ken take when looking for a journal article on the library’s web site?
  • What path would Stacy take when searching for a book on architecture at the Library?

Libraries are using the persona method to develop new services and to tell compelling stories about how the Library is used. Cornell was one of the first libraries to use this method (http://ecommons.cornell.edu/bitstream/1813/8302/2/cul_personas_final2.pdf) in designing its web site, but it’s been used as well by the University of Washington, BYU, and DePaul. Exciting.

Related to wayfinding, Florida State recently gave students ProCams to document their search for materials located in the stacks. The recordings (both video and audio) pretty quickly exposed problems students had with navigation. For staff, it was eye-opening to see for themselves the (sometimes) utter confusion students experienced between the catalog and the shelf. That recognition of a problem is the first step in making improvements.

For more information on any of these items, do not hesitate to ask!

Posted in conference reports

Faculty Seeking Course Content – A Qualitative Research Project

Many Temple Library staff, particularly those in RIS, are already familiar with the qualitative research project that Jenifer Baldwin, Anne Harlow and Rick Lezenby have been working on for the last two years. (Yes, qualitative research, at this depth, takes time!) They have already presented their work several times at Temple Libraries and conducted a workshop on their approach to interviewing at the Maryland Library Association conference. Currently, they are wrapping up the data collection phase, which included interviews with faculty and co-viewing sessions with peer librarians. Using a grounded theory approach, they’ve been analyzing their data all along. Now they’re ready to talk about it.

NT: What was your question?

JB, AH, RL: Our first question related to how faculty used Blackboard to provide course content to students. As we interviewed participants, our question evolved into a more general one: exploring how faculty choose and share course content.

This content is predominantly course readings, from published articles or book chapters to lecture and lab notes, or reflections. We became interested in how faculty understand their own expertise and how they model that to their students.

NT: What method did you use and why?

JB, AH, RL: We were inspired by the work of Nancy Foster at the University of Rochester – I (Jenifer) took a CLIR workshop on the ethnographic method at MIT along with Peter Hanley from Instructional Support. Then we hosted a CLIR workshop here at Temple, so Anne and Rick were able to take advantage of this training as well. We already had lots of quantitative information here at Temple; we wanted to see what the qualitative research process would look like. We wanted to use in-depth, semi-structured interviews with faculty about their work practice. We had a method and we needed to find a question to apply it to!

Once we developed our questions, we went through the Institutional Review Board for approval to conduct the research. In the interview we asked the faculty member to show some recent examples of materials they used for a class. We would ask them to talk through the process of deciding on what content to use, and to demonstrate how they located the materials and organized them. We asked them what they expected the outcome would be for their students. Each interview was videotaped, and we then conducted co-viewing sessions with librarians.

We had the recordings transcribed and coded them (using Atlas.ti) for themes that emerged from the interview texts. We employed a grounded theory approach, in which the theory emerges from the research rather than the research being fit to a pre-formed theory. The co-viewing process became part of the data and analysis as well. For instance, one of the products was a spreadsheet of suggestions for initiatives and projects that might be developed out of needs expressed in the interviews.

NT: Tell me about your results. What did you learn?

JB, AH, RL: Several themes emerged. One is the tacit knowledge that scholars bring to their work. There is a lot that they know about their discipline that they might have trouble articulating to a student new to that discipline. When asked how they knew to include a particular article, they might say, “Well, I just know.” Often they find out about something in what seems to them to be a serendipitous way, but in fact they are predisposed to the literature – part of an invisible college, attuned to the environment that relates to their work.

So they might say to a student, “just go and browse the shelves” – a behavior that works for them because they know what to look out for. But a student might have a very different, less successful, experience with this kind of “browsing.” Faculty try many ways of modeling these “expert” behaviors for their students.

We heard lots and lots of stories about faculty experiencing serendipitous discovery. The use of video and popular culture is pretty ubiquitous, so faculty might get ideas for class by watching cartoons or movies, at a social event, wandering a bookstore, or reading the newspaper or magazines.

We learned some interesting things about expectations for reading. The differences between the kind of reading expected of undergraduates and of graduate students were not what you’d intuit: undergraduates are expected to read in a more transformative and analytical way, while graduate students need to read more broadly. If something is really important, it might be distributed on paper in class, perhaps even read together.

NT: Is there anything you’ll change based on your findings?

JB, AH, RL: We hope to have a full discussion of implications and ideas for service initiatives and outreach this spring, as one of several products of the project. The co-viewing process provided us with practical ideas for outreach and push notifications, for example. Most faculty talked about new book announcements that had an impact on them. This led us to ask what other kinds of things we could push.

Faculty consult with their peers for ideas about content. If librarians have well-grounded relationships with faculty, faculty will take our suggestions seriously as well.

Another example – we saw a need to create easily and rapidly accessible resources for students who are also practitioners in the field – resources that they can readily access when they themselves are in a classroom or other practitioner setting. We think we might be able to help out in this area.

NT: If you did this again, what would you do differently?

JB, AH, RL: Atlas.ti, for textual analysis and coding, was cumbersome to use, particularly in a networked environment. We’d like to interview a broader group of faculty, a less “self-selected” group of participants. And currently, our library staff is small and stretched, so we don’t always have the time to focus on projects such as these. But the “rewards are worth it.”

Posted in qualitative research

Gathering Patron Feedback at the Charles L. Blockson Afro-American Collection

This month I met with Diane Turner, Curator of the Charles L. Blockson Afro-American Collection at Temple University Libraries. This post illustrates the idea that assessment doesn’t have to be complicated to be useful, and it doesn’t need to take a lot of time. It can serve as a gauge of program success and audience engagement, as well as demonstrate learning and provide feedback for future planning.

The Blockson Collection hosted a two-day symposium as part of the city-wide festival of the Underground Railroad in Philadelphia held this October. The sessions at Blockson included lectures, panel discussions and musical performances.

Prior to the symposium, I met with Diane to talk about ways she might assess the effectiveness of the program in terms of one of the Blockson Collection’s key goals: “To contribute to the education of the Temple University community and general public about African-American history and culture, particularly the Black experience in Philadelphia and Pennsylvania.” As curator of this significant collection, Diane also wanted to get feedback and suggestions for other types of programs and topics that would be of interest.

Assessing learning outcomes can be tricky, often requiring pre- and post-tests. We did something a little less complicated: Diane designed a simple half-sheet feedback form to distribute to program participants and asked them directly about what they learned. We won’t be sending attendees a follow-up quiz, but their responses to this question provide excellent documentation of the key takeaways – what surprised participants and what they’d like to learn more about. It was clear from the enthusiastic survey responses that attendees gained new knowledge, and the program inspired many to learn more – as evidenced by these responses to the question, “What did you learn?”

“Dr. Blockson taught me a lot about the various people who were involved in the Underground Railroad but aren’t mentioned much in history.”

“Have to review my 10 pages of notes to answer this.”

“More than I can write. I have so much reading to do”

Participants had many suggestions for what they’d like to see in future programming. Workshops on genealogy came up more than once, as well as themes related to Dr. Blockson’s talk – the lesser-known history of African Americans, particularly early American history (18th-19th century) and Philadelphia’s role in the Underground Railroad.

The feedback tool was simple and straightforward, and because participants were particularly engaged with the program, 73 of 104 participants returned surveys – an excellent 70% return rate.

Still, we are always learning better ways to phrase questions. In this case, the question, “Where are you from?” did not yield the expected responses, i.e., institutional affiliation. The lesson learned is to be specific about what type of information you’re asking for. If geographic location is important (as it might be in a survey like this), asking for a zip code provides more useful information for understanding reach into the community.

Posted in surveys

So What’s Up with that Assessment Committee?

The Assessment Committee at Temple University Libraries has been active now for almost six months. I’d like to update you on what the Committee has been doing since Joe Lucia charged this group with “providing advice, support, and the development of projects and initiatives related to measuring, evaluation, and the demonstration of value across all areas of the TUL enterprise.” It’s a big charge and a big group, with members from all units of the Libraries including Law, Ambler, Blockson, SCRC, Paley, SEL and Health Sciences/Podiatry.

One of our first orders of business was to conduct a “data audit” – identifying the (mostly) quantitative data that is currently being collected on a routine basis throughout the Libraries. The great thing is that we’re all collecting data, and lots of it. From statistics on electronic journal usage to interlibrary loans to use of library computer workstations to gate counts to feedback from instruction sessions to library web site traffic – the list goes on and on.

But unsurprisingly, these data sets are stored in different areas, neither centralized nor widely accessible. Similar data, like reference transactions, are collected in various ways – data entry via the web, manual record keeping, spreadsheets. The Assessment Committee is reviewing this environment with an eye towards making data collection, storage and access more standardized, systematic and easy to do.

We’ve engaged in several brainstorming or “visioning” sessions in which we work in small groups to address big questions:

  • If we had no restrictions on money or time, what would a library data repository look like and what would it do?
  • If we could know anything about our patrons and their use of the library, what would we want to know?

We’ve also explored the question of what metrics and assessment methods need to be in place to evaluate our effectiveness as an organization – using the strategic actions document as a starting place. This is a tough one. It’s easy to count circulation, but much harder to measure our impact on faculty awareness of new methods for scholarly communication.

As for actual assessment taking place, we’re piloting two patron surveys this month. A customer satisfaction survey is helping us learn how patrons perceive our service at public desks. The second survey is a follow-up to research consultations, in which patrons are sent an online survey one week after meeting with a librarian. This is to learn if students are retaining the research skills used during the consultation.

Upcoming blog posts will profile additional assessment underway here at Temple, including:

  • SEL’s focus group session with engineering students to learn how the Library contributes to their success here at Temple
  • Caitlin Shanley’s ACRL Assessment in Action project on the effectiveness of library instruction towards student learning
  • RIS’s qualitative research project on faculty research assignments
  • DLI’s assessment of tools for improved discoverability of digital library resources

The Assessment Committee has been instrumental in cultivating a culture of assessment here at TUL. Just as important is the participation and engagement of all staff in our efforts. If you have an idea, a comment or a question, please do contact me or one of our members!

Nancy Turner, Chair

Jenifer Baldwin, RIS

Steven Bell, RIS

Lauri Fennell, HSL

Leanne Finnigan, CAMS

Eugene Hsue, Law

Doreva Belfiore, DLI

Jessica Lydon, SCRC

Nicole Restaino, Communication/Events

Brian Schoolar, Collections

Cynthia Schwarz, LTS/HSL

Caitlin Shanley, RIS

Gretchen Sneff, SEL

Diane Turner, Blockson Collection

Sandi Thompson, Ambler

John Oram, Access

Posted in organization culture and assessment

Cynthia Schwarz explores the use of technology in libraries

Cynthia Schwarz is the Senior Systems & Technology Librarian at the Health Sciences Library. She recently moved into this position after 6 years on the Main Campus. In both of these jobs, Cynthia has used her technical expertise to address questions about how students use technology and how we can optimize our offerings to support their work. She sat down with Nancy Turner, Assessment Librarian, to talk about her “investigations.”

NT: So what were the questions that you had?

CS: Here at the Health Sciences Library we have several public Mac computers located in the open stairway area – a pretty high-traffic area – and yet they didn’t seem to be getting a lot of use. As we make decisions about investing in new computer workstations, I wanted more data about how frequently the different computers in the Library were being used, which types were preferred, who was using them, and what software was being used.

NT: So how did you go about gathering this information?

CS: The timing was good, because we now have access to an analytics tool called LabStats. The TECH Center has been using this tool for a while to learn about what’s going on there. We combine data from LabStats with data from our student database, Banner, so that we can get the department and status of the patrons who use our computers.

[Chart: Use of Computers by Student School/College]

I have a map of where each workstation is located in the Library, and each computer has a unique name. I can see how many times each workstation was logged into, by how many unique users, and at what times and on what days. I collected the use data for the spring semester. There is a separate report on what type of software is used, but that’s a little more difficult to manage: the reports take time to run, and because they’re based only on application names, the results are less accurate and take more time to analyze.

NT: What did you learn?

CS: Well, the data showed that these Mac computers do indeed get used. I was also surprised by the relatively high use of our computers by students in schools outside of the health sciences. And a relatively high number of alumni are using the computers.

NT: Why do you think that is?

CS: My theory is that the Library is quiet and it’s a pleasant place to work. We don’t limit who uses the library space to only health science students. So students who live nearby, or who are taking a class outside their major, or maybe students with jobs nearby can come and use our library.

NT: Any next steps?

CS: My goal is to monitor computer use through next year. We want to understand whether there are preferences for Macs over PCs. And using the Millennium reporting capabilities, we can run reports on the use of other types of technology: laptops, iPads, remote controls and keys to the group study rooms. For instance, we’ve got 65 laptops available for borrowing; I’d like to explore how frequently these get used. This information will help us make decisions about future investments in technology here at the Library.

Posted in data-driven decision making, technology use

Temple University Libraries Overhead Survey Enters Its 4th Month

Temple Library staff members may have noticed yellow paper surveys floating about the building or seen Jim Bongiovanni’s e-mail messages about the web survey launched each month for users of proxied library resources. These are part of the “Overhead” or Cost Analysis Survey. We will be conducting the survey at random hours in Paley, Science and Engineering, and Ginsburg Health Sciences Libraries through June 2015. The survey is designed to establish the percentage of library costs that support “sponsored” research – for instance, grants from the National Science Foundation or the National Institutes of Health.

In the first three months, we’ve distributed 1620 surveys and had over 800 returned. And although it’s too early to have truly representative results, I’d like to share some user data gathered so far, with the caveat that only at the end of June will we have a full picture.

The responses are evenly split between undergraduate and graduate students, but we have a good number from faculty, staff and community members as well.

[Chart: survey responses by patron type]

Each of the libraries serves a different mix of patron types. Paley serves undergraduates primarily (58% of Paley responses are from undergraduates), and HSL serves graduate students (44%). SEL patrons are primarily undergraduates from the School of Engineering. And where are the faculty? They’re coming in from the Web. 74.4% of web responses were from faculty.

[Chart: top ten schools and colleges represented in survey responses]

The chart above shows the top ten schools represented in the survey responses so far, but all the schools and colleges make use of the Libraries. Based on survey responses, HSL patrons are primarily students from Medicine, Podiatry, Pharmacy and Dentistry. And while mostly serving engineering students, SEL also supports students from Science and Technology and even Theatre. Paley has the widest “mix” of students.

The web surveys provide us with information about who’s using library resources, from where, and what resources those are. (This is anonymous data, of course). Of all patrons using online library resources (not including the catalog), 51% are off campus, 36.9% are on campus but not in the library, and 11.9% are actually in the library. That’s interesting, isn’t it? Perhaps an indication that students in the library space are here to work on papers, study, or attend a class – but not necessarily using the licensed resources. So far, the top databases and resources used via the Web are: PubMed, Refworks, PsycInfo, Academic Search Premier, JSTOR, and ISI’s Web of Knowledge.

These data are not too surprising, but as the year progresses, we’ll have more robust and interesting data – and, I hope, additional insights into use of the library and its resources.

I want to thank all of our students and staff who have helped out with the survey so far. They are: Tiffany Ellis and Lauri Fennell at HSL, Cody Smallwood at SEL, and from Paley, David Murray, Anthony Diamond, Lori Bradley, Jonathan LeBreton, and our invaluable students: Mariah Butler, DJ Daughtry, David Glover, Nadia Khatri, Roma Marcos, Kaitlin Mashack, and Steven Wei. Kate Lynch and Jim Bongiovanni have done a tremendous job with the web survey.

If you’d like to help out, please do let me know. There will be 9 more opportunities!

Posted in surveys

Survey of Teacher Education Faculty

Jackie Sipes is the Education Liaison and Emerging Technologies Librarian in Research and Instruction Services (Paley Library). Towards the end of the spring semester, Jackie conducted a survey of the faculty in the department of teacher education. Nancy Turner asked her about this assessment project.

NT: What were you trying to learn by conducting this assessment?

JS: I’ve been here for just under two years, and I wanted to learn more about the kinds of research-based assignments being used in the teacher education program. I felt I hadn’t yet made inroads into the program, either with faculty or with students. I wanted to find out about their needs in library instruction and student research support. This was also a way of letting faculty learn about me and what I can do for them. I also wanted to know about specific courses being taught that incorporated research assignments.

NT: How did you choose your assessment method?

JS: I chose to conduct a survey because it would reach the greatest number of faculty and could be brief and unobtrusive. I chose to focus just on faculty so that I could tailor the questions to get their impression of how they felt about their students’ skills in information literacy.

I was a little disappointed in the number of responses I got (9) but the information I gathered is very useful. Something is better than nothing!

NT: Right. You can’t necessarily generalize about all the faculty from these responses. So what did you learn from the feedback?

JS: Most of the respondents are aware of the library’s services, but they don’t feel they have time to incorporate a full-blown instruction session in their schedule. So I will be emphasizing to them the other ways I can connect with their classes and students, like a 5-minute “drop in.” While most of our instruction sessions teach students how to locate scholarly articles, I learned that teacher education students are often required to find other types of materials as well: lesson plans, curriculum documents, or general subject information with which they can build a lesson plan. So I’ll start incorporating those types of searches into my instruction. And I learned that there may be a different use of language in describing a research assignment. For instance, students might be required to do an assessment of a classroom intervention.

NT: Will you make any changes based on what you learned?

JS: Definitely. I’ll make sure that instructors know that I can come in for just a few minutes to introduce myself and be visible. I’ll focus my outreach efforts towards those classes that were identified as having a research component.

NT: If you did this process again, is there anything you’d do differently?

JS: I was pretty happy with the questions that I asked; they got me the information I was looking for. I think that I’ll try conducting the survey at the beginning of the semester next time. The survey could be part of my outreach e-mails, where I let faculty know about what the library can do for them.

Posted in instruction and student learning

David Murray’s Experience as an Embedded Librarian

David Murray (librarian for History, Latin America, Spanish & Portuguese) in the RIS department partnered with faculty member Ron Webb to teach The Legacy of Mesoamerica this spring. In this conversation with Assessment Librarian Nancy Turner, he describes his work as an “embedded librarian” and the assessment he conducted as part of that work, and considers the value of embedded librarians in contributing to student research success.

NT: Tell me about your project.

DM: I worked with Professor Webb for 9 to 12 months to put together a course and syllabus on The Legacy of Mesoamerica. The goal was to move beyond the standard vehicle for imparting ACRL’s information literacy standards – the so-called one-shot instruction session – and instead to embed information literacy into the curriculum. We went beyond the one-shot by having me, the librarian, convey course content as well as information literacy skills. We thought having the librarian deliver course content, in this case a series of lectures on Classic Maya art and architecture, would incline the students to take the information literacy component of the course more seriously.

There were three information literacy assignments within this course, and one “warm up” exercise that we can use as a “pre-test” of students’ skills. The primary assignment was an annotated bibliography, which was graded using a rubric provided to the students.

I had several questions.

  1. Are there indications of student learning, based on comparing the pre-test to the skills demonstrated in the annotated bibliography?
  2. Did the students feel that having a librarian in the classroom improved their understanding and skills with the research process?
  3. Did the improvements in student skills justify the time spent on the class and with the students?

For the first question I could compare the IL skills manifested in the “warm-up” assignment (pre-test) against those of the annotated bibliography. The two are not directly comparable, however, except in the sense that for both I asked students to describe the research process; in other words, the two assignments differ structurally. Yet I believe that, when properly compared, they can reflect changes in the students’ approach to finding, using, and evaluating information sources for academic work.

I used a survey to get at the second question. The survey was intended to measure perceptions and attitudes, not knowledge.

NT: Tell us about your results

DM: 7 of the 9 students responded to the survey; 2 were not in class that day.

The top three skills the students felt they learned were:

  • “Choosing the most appropriate method and tool for accessing information”
  • “Recognizing and employing “subject” searches in library databases and catalogs such as Diamond”
  • “Learning how to critically evaluate sources of information for college-level research”

Most students expressed some level of appreciation for having a librarian closely involved with the course. One student valued having a “second opinion” available on papers. Another stated, “… by seeing the librarian so much in class it created a closer relationship between him and the students that made it more likely to maintain communication,” an indication, perhaps, that at least one student perceives value in establishing a long-term connection with a librarian.

I used a rubric to grade the annotated bibliography. While rubrics are practical and help students to understand the expectations, I don’t want to have students just “parrot” back to me what they know I want to hear. I haven’t yet systematically compared the pre-test to the annotated bibliography, but a preliminary comparison suggests students learned to become more sophisticated in their approach to information. For example, one of the students – an excellent student, I might add – progressed significantly in her understanding of the use of primary and secondary sources in her research. Several students also demonstrated, pre- vs. post-test, a better understanding of which of the many library databases available to them would likely uncover the most relevant sources for their topics. Finally, I noticed that several students were able to demonstrate an improvement in their basic understanding of search syntax (or the way a search is expressed in the database).

NT: Is there anything you were surprised by?

DM: I was surprised that learning to use subject headings for searching was important to the students. Librarians debate the utility of teaching students what we, as insiders, refer to as “controlled vocabulary.” Reference and cataloging librarians know the importance of subject headings, but some in our field claim that students and other library patrons do not appreciate subject headings, and cannot learn to use them effectively. Given this internal debate, I was pleasantly surprised to discover that my students expressed an appreciation for learning how to recognize and employ subject headings and other controlled vocabulary. Strangely, only three students said they learned to understand “what constitutes plagiarism and acknowledging the use of sources through proper citation.” I say strange because we emphasized the importance of mastering American Anthropological Association (AAA)-style citation in class, in the course guide, and on the assignment prompts.

NT: What are your next steps?

DM: Students appreciated the help with both the research and the writing. This means that partnerships with the Writing Center might be useful. Either the time spent on citing sources was not effective (though students’ annotated bibliographies suggest it was), or, perhaps more likely, students felt they didn’t need it.

Overall, being truly embedded into the class is a lot of work for the benefit of a relatively small number of students. Because of this commitment, there are always questions about how sustainable “embedded librarian” initiatives such as this one really are in the world of academic librarianship. Ethically, it is important to consider the extent to which devoting so much time and effort to the needs of a small number of students might impact the hundreds of other history and Spanish students who fall into my portfolio. Ultimately, I feel that in terms of student learning all the hard work was and will continue to be well worth the effort. I would not have committed to such an endeavor if I felt my other students were getting short shrift, and I now have three years of experience teaching this course and can say with some confidence that, at least for me, such an initiative can be sustained.

Librarians are concerned, rightly, about democratizing access to our services; we always want to make sure we’re reaching as many students as possible! It’s all about maintaining a balance, and yet we can probably do the most good by focusing our time and attention on those faculty and students who are most receptive to what we have to offer. One thing is for sure: the teaching role of the academic librarian has increased exponentially over the last 10 to 20 years, a trend that at this point is hard to ignore. TULibraries is recognized for its instruction efforts. While my “embedded librarian” course is only one small part of those efforts, I hope to continue refining and improving the embedded course.

For more information on David’s work, contact him at dcm@temple.edu or visit his guide for this class at: http://guides.temple.edu/las2098

Posted in instruction and student learning

My Library Assessment Adventures

I have just returned from 5 days in Seattle, Washington, to attend the 2014 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment. This is a biennial gathering (now 600 attendees strong) sponsored by the Association of Research Libraries and the University of Washington.

The conference presents a fantastic opportunity to network with other professionals, learn about new tools, and keep up with trends in the library assessment and measurement community. It was especially fun for me to catch up with colleagues from past workplaces, including Syracuse, New Mexico State University and even the University of Louisville.

While I encourage you to explore the slides/presentations being posted at the conference site (http://libraryassessment.org/schedule/index~print.shtml), here are some of my personal highlights:

Using data analytics for prediction

Data is big. Several presenters encouraged librarians to pay attention to the higher education trend of using learning analytics to more proactively support student and faculty work and demonstrate value. Both Deb Gilchrist (Pierce College) and Margie Jantti (University of Wollongong) spoke to the need for libraries to be more assertive in collecting data. Jantti’s research findings indicate, for instance, that students who do not use electronic resources will, on average, fail.

Gilchrist reminds us that we can get mired in descriptive analytics, and our time may be better spent thinking about predictive analytics. We need to think within the institutional context and proactively consider trends in higher education. She suggests that we might need to give up some of our principles around privacy – citing the example of Minnesota’s use of its Assignment Calculator (research paper analytics) to intervene at key points. She goes on to ask, “What would intrusive reference look like?”

Data Tools

Based on several powerful demonstrations of Tableau for presenting library data, I can’t wait to re-acquaint myself with this tool for data visualization. Sarah Murphy was able to share some of her work on the public site at: https://carmenwiki.osu.edu/display/libraries/Dashboards. Rachel Lewellen (UMass Amherst) and Jeremy Buhler (University of British Columbia) also described best practices for using Tableau dashboards to answer specific questions, provide multiple views of a single data set, and generate new questions.

Space and Services

The University of Washington is a beautiful campus, and this conference shows it off to advantage – both the physical spaces and the work of library staff. I particularly loved the newly renovated Odegaard Undergraduate Library, with its open floor plan (which also accommodates quiet study) and its active learning spaces with writable glass walls that can be adjusted for class use or open lab.

[Photo: students at work in one of Odegaard’s new learning spaces]

This picture shows so much going on in one of the new spaces, with both analog and digital technologies in use. And I was excited by Lauren Ray’s (University of Washington) and Katharine Macy’s (Tulane) account, Assessment of Space Designed for Experimentation, of the Research/Writing Center, a partnership between the writing center and reference services. Students can walk in or make appointments to consult with librarians and tutors for support in writing and researching papers. There is a natural linkage between these two services, and providing support to students in a shared space benefits everyone. The Scholar’s Studio is a space that provides services for graduate students – they’ve experimented with lightning-round research presentation events and a CoLab event designed to foster interdisciplinary collaboration.

Strategic Direction and How We Measure Our Progress

A key theme throughout the conference was demonstrating value and connecting library strategic direction with institutional goals. Roger Schonfeld (Ithaka S+R) addressed this in Vision, Alignment, Impediments, Assessment: The Views of Library Directors, with reactions from Anne Cooper Moore (Southern Illinois) and Scott Walter (DePaul).

Ithaka S+R conducts triennial surveys of faculty on their views of the library’s role in their work. They recently conducted a survey of academic library directors, and Schonfeld presented some results from this research, suggesting a possible misalignment in how the role of the library is perceived by faculty and by library directors. While faculty value the library’s role as buyer, archive, and gateway to information, library directors see support for research and teaching – along with increasing support for information literacy – as of primary value. Moore and Walter provided additional context from the library directors’ vantage point: many schools are focusing on the needs of undergraduates, needs that may differ from those of faculty. These directors describe a library role that takes institutional missions into account, as well as one that serves the needs of faculty more directly.

By day 3, my head was exploding, but fortunately the slides are available for Stephen Town’s (University of York) provocative whirlwind presentation, Making a Value Measurement a Reality: Implementing the Value Scorecard.

He extends the balanced scorecard approach further to explore the measurement of library values in the dimensions of relational capital, library capital, library virtue and library momentum. For example, the library provides the university with relational capital that can be measured by its competitive position (reputation beyond institutional boundaries). In the dimension of virtue, the library’s contribution to research, learning and employability of students might be measured. According to his paper (with Martha Kyrillidou, Developing a Values Scorecard), the value scorecard “provides a framework and categorization for the types of evidence required to prove the value and impact of libraries.”

Lastly – the coffee! There wasn’t a lot of extra time, but I did my best to assess the options. My favorite: Cafe Solstice.

[Photo: latte at Cafe Solstice]

Posted in conference reports