“why can’t life just be easy !?”

One of our strongest educational partnerships here at Temple Libraries is our collaboration with the First Year Writing Program. Nearly all entering first-year students at Temple take at least one of the courses in this program, so it’s a great opportunity for library staff to reach a large population of undergraduates. Each section of the three largest courses (ENG 802, 812, and 902) is scheduled for two workshops with a library staff instructor. These workshops are designed to build on each other to support students as they work on increasingly complex research writing assignments throughout the semester.

One of the ways we assess our success in this program is by having students complete a brief, five-question survey after each workshop. In summary, the questions we ask on the survey are:

  • Who was your librarian?
  • What’s one new thing you learned today?
  • Rate your confidence BEFORE and AFTER this workshop.
  • What is one question you still have about using sources (after Workshop 1) / the research process (after Workshop 2)?

Though brief, these surveys are a rich source of data about our instruction. We tend to get a high response rate because students complete the surveys while they are still in the classroom. While these results could support a number of findings, in this post I will focus on the major trends across all instructors. The results I discuss are from the Fall 2018 and Spring 2019 semesters.

Student Confidence

When designing these surveys, our curriculum working group struggled to identify the best types of information to gather in this particular context. If we quiz students on concepts we discussed in the workshop just minutes before, are we sure that their responses mean they can apply and retain these concepts? One measure we believed students could effectively self-report on was their confidence about doing research. We hoped to learn whether students felt more confident after our workshops.

Our data suggest that students’ self-reported confidence did increase after our workshops. The blue lines on the chart below indicate confidence before each workshop, while the green lines indicate confidence after. The results are on a scale from one to four, with the exception of Workshop 2 in Spring 2019, where we updated the scale to run from one to five based on survey best practices. That’s why the very last measure is higher overall.
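
One practical wrinkle: because of that scale change, the raw before/after averages from Spring 2019’s Workshop 2 aren’t directly comparable to the earlier ones. Here is a minimal Python sketch of one way to handle this, rescaling each average onto a common zero-to-one range (the scores below are hypothetical, not our actual survey results):

    # Map a mean confidence score from its original scale onto [0, 1]
    # so that averages from the 1-4 and 1-5 scales can be compared.
    def rescale(score, low, high):
        return (score - low) / (high - low)

    # Hypothetical averages, for illustration only:
    fall_2018_before = rescale(2.5, 1, 4)    # 1-4 scale -> 0.50
    spring_2019_after = rescale(4.1, 1, 5)   # 1-5 scale -> 0.78
    print(f"{fall_2018_before:.2f} vs. {spring_2019_after:.2f}")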

What Students Learned

One of our survey questions asked students to report one new thing they learned in each session. Overall, student responses were relevant to the workshop content, suggesting that students are paying attention and leaving with some of the concepts we discuss in mind. Students recalled specific databases by name, especially JSTOR and Academic Search Complete. Students also recalled the BEAM method (Background, Exhibit, Argument, Method) that we discuss for thinking about how writers use sources, indicating that perhaps acronyms and other mnemonics help with learner recall.
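
(For anyone curious how open-ended responses like these could be tallied at scale, here is a rough sketch of the keyword counting involved; the response strings and keyword list are made up for illustration.)

    from collections import Counter

    # Made-up free-text responses standing in for real survey data:
    responses = [
        "I learned how to search JSTOR",
        "the BEAM method for using sources",
        "academic search complete has scholarly articles",
    ]
    keywords = ["jstor", "academic search complete", "beam"]

    # Count how many responses mention each keyword (case-insensitive):
    counts = Counter(
        kw for r in responses for kw in keywords if kw in r.lower()
    )
    print(counts.most_common())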

Sometimes, though, student responses suggested a broader understanding of research than what we cover in the workshops. For instance, students often claimed that they left knowing “everything” about how to use the library, databases, and/or the library website, when in fact we cover just a fraction of the available resources in each workshop. Students also sometimes misinterpreted concepts like BEAM, suggesting that while they remembered the name, they may not have learned how it applies to their research process.

Student Questions

In my opinion, the survey question asking students to share any remaining questions is one of our richest sources of data. I picked up on three common themes when reading through student responses.

Straightforward answers

Students want specific numbers when it comes to finding sources for their research papers. Examples of these questions include:

  • How many years is too old for an article?
  • How many sources is too many (or too few)?
  • How much time do I need to commit to the research portion of the paper?

Value judgments

Students want librarians to tell them the difference between “good” and “bad” when it comes to their research. Examples of these questions include:

  • How do I tell if a source is bad?
  • How do I tell if I have a good topic / thesis / article / database?
  • Why are databases better than the internet?
  • Is it okay to [fill in the blank with various research practices]?

Genuine curiosity

For the most part, students took these evaluations seriously and answered sincerely (though there were occasional humorous answers – this blog post’s title being one of them). In addition to reflecting on what they had learned, students demonstrated interest in the library beyond what we covered in the workshops. Examples of these questions include:

  • How many types of librarians are there?
  • Do we use the Dewey Decimal system?
  • How many books can we take out?
  • Are librarians available on the weekends?

Conclusions

It’s difficult to immediately apply these results to our instruction practice (e.g., should we be giving students value judgments on source types? I think most of us would say no!). There are definitely limitations to using a survey to assess student learning – we don’t get to see whether students actually apply the skills they learn in their writing, and we can’t ask follow-up questions when we get a confusing response. But these evaluations are an important part of the broader picture of student learning assessment that we do here at the Libraries. I hope that by sharing these results with teaching librarians, we can continue to shift our practice to meet changing student needs and to improve our instruction overall.

I look forward to continuing conversations with my colleagues about the following questions:

  • How might these survey responses help us improve our teaching?
  • How could we be using the data we have to demonstrate our impact?
  • What other data would be helpful? How might we gather it?