Strategic or Operational? That is the Question

Photo courtesy Geof Wilson


Or is it? This last year we’ve had many lively conversations at the Libraries/Press about distinguishing the strategic work from the operational work we do. Those conversations, coupled with this morning’s early yoga class, have me reflecting on how strategy and operations need to work side by side.  

How do we balance these two ways of describing our work? The conversation often goes, “Well, there is just so much time for strategic (perhaps code for “new”) initiatives when we have our operational work to do.” That makes sense.

But what is that operational work? And if we describe it as operational, does that provide it some immunity from scrutiny or assessment? Will we continue to purchase books using the same procedures, just because it’s “operational” or “what we do”? The reality is that we are continually changing up our methods, procedures and operations in acquisitions and collection development – to save money, to meet new needs, to save staff time, to explore new access models. It’s continuous improvement, and it can be labeled strategic as readily as the purchase of a 3-D printer for the Digital Scholarship Center.

Another great example is the mapping collections project in the Special Collections Research Center. With use data, the staff is making strategic decisions about where collections should reside, at what level they need description, and what collections might be digitized for wider accessibility. How we optimize our space, staff skills, and staff time while providing for improved access and/or preservation – those are strategic moves.

Perhaps it’s my own rosy lens on the Libraries/Press, but I’d like to call all of our work strategic, in that it has intention and direction – the work is continually changing (some aspects more quickly than others) to meet new needs of the organization, institution and community.

Rather than being separate, the strategic and the operational are balancing forces that depend on one another. To grow and meet changing needs, we need to strategize our operational work. Likewise, we must consider how our strategic work can be operationalized, with goals, objectives and measures of success. The yin and yang that keeps us both grounded and moving forward.



Springtime Refresh at Ambler

Continuous improvement is a kind of assessment that we don’t usually think of as assessment per se – there are no statistics, there isn’t a formal plan for data collection, and our efforts may not result in a report to stakeholders. But the work that Jasmine Clark, our resident librarian, has done with staff at Ambler definitely falls into the category of assessment towards workflow improvement. In this blog post I’ll use two components of assessment – needs analysis and measuring success – to frame Jasmine’s work with staff, which has created efficiencies, standardized workflows, and fostered change at the Library. I sat with Jasmine and Sandi Thompson, Head of the Ambler Campus Library, this week to talk about the project.

Photo credit: Darryl Sanford


NT: Tell me about the details of your stint at Ambler.

JC: I needed to pick a rotation and I was interested in higher-level decision making, how organizations are run. Sandi & Andrea [Goldstein] expressed a desire to take a more comprehensive approach to the existing documentation at Ambler.

ST: For many years much of the information on policies and procedures was kept in a physical notebook at the public service desk. We were having trouble keeping it up to date, particularly with the migration to Alma. 

JC: I was able to bring my past experience with creating documentation to bear on this project, which involved collecting policies and procedures, moving documents to an online environment, standardizing workflow, and training users in using the system.

I looked at various technologies – Google Drive, Slack, JIRA, and Confluence. We didn’t want to get too fancy, and needed to take into account the current skills of staff, their interest in technology, as well as the amount and nature of the data we’d be working with. So I decided to use Confluence, linking out to Google documents when necessary. As it turned out, Confluence is a perfect solution for our current needs.

At Ambler, print was the primary location for documentation. This made it hard to access, and hard to maintain. Now the Confluence site, Ambler Campus Library, is primary, and if we need a printed copy, it’s exported as a PDF, printed, and placed in the reference binder.

NT: How do you know if you’ve been successful in accomplishing this change – which is about technology, but also about the organization and how it shares information?

ST: Moving to the Confluence environment has had a multitude of benefits. When a student has a request, we’re not dependent on a particular individual to provide that service. Having everything in a centralized location online allows for other staff to comfortably fill in when someone is absent.

We’ve drastically improved the accessibility and the sharing of our knowledge and awareness – from notifying everyone that a student will be late for work to how to process an interlibrary loan. We are less siloed in our work, and this has led to a lot of “cross-training”.

JC: Yes, and I’ve seen staff who have taken real ownership of the site. They go beyond using it in a passive way and contribute to its accuracy – making corrections, interacting with the documents, updating on their own.

Training is an important part of the process, of course. Our workflow is realistic and based on everyone’s level of comfort and pace. I provide support as they are learning. I let staff know, you will not be “looked over” even when you are not familiar with technology.

NT: Are there other success indicators?

ST: There is a social media function, so we see an uptick in commenting on blog posts. And there’s the fact that people are using it for everything – having a place to go where everything is current has made it the “go to” place for information.

Reviewing the documentation has forced us to look at procedures in a different way, with “fresh eyes” looking at the work we do and how it might be changed. This was an unexpected result.

JC:  Something like this changes workplace culture. It’s become the norm to share information. It’s started discussions about new problems to solve.

NT: If you could describe the benefit of the project in one word, what would it be?

JC: Efficiency!

ST: Collaboration!

NT: Taken together, those two things really do speak to using process improvement work as an approach to building a team: working together to create a shared knowledge base, and really improving our service to users. Thanks for sharing this with us.


An Agile Approach to Assessment

The Epitome of Agile. Photo by Philippe Rouzet

Last week I had the pleasure of talking with Emily Toner, our Technology Projects Librarian, about how she works with developers in Library Technology Development to conduct ongoing assessment of their work on the Blacklight project. (Blacklight is the software that will be used for the upcoming enhanced version of the Library Search tool, due for general release in June 2018.)  In her role as project manager, Emily coordinates between what the developers are working on and what users (librarians, patrons) need and expect in a discovery tool.

The programmers/developers work in an “agile” framework – an approach that incorporates the principles of iterative, flexible development and continuous improvement. The project work is divided into short “sprints” – concentrated effort on a specific feature or collaboration opportunity. The team’s typical sprint lasts 2 weeks. When the sprint is wrapped up,  Emily facilitates a “retrospective” session with the team – they reflect together on what is working and what isn’t working so well.

In the world of agile development, there’s a whole tool box of techniques for doing retrospectives. One of the group’s favorites is the “Four L’s”. The exercise was developed by Mary Gorman and Ellen Gottesdiener and works like this: together, the group asks of the sprint:

  • What did we Like?
  • Learn?
  • Lack?
  • Long For?

The exercise can be done with big sheets of paper mounted on the wall, but since TUL developer David Kinzer works remotely, they do the brainstorming entirely online now. Emily leads the team in a brainstorming session to generate feedback on the positive and negative things that happened during the sprint.

  • What was productive during the sprint?  This is a LIKE.
  • Did we figure out how to resolve a problem, or learn about a new technology? This goes into the LEARNED category.

Likes and Learns serve to highlight the positive – to bolster the energy of the team and to appreciate the good work getting done. They also serve to build the team as a cohesive group.

  • But there are also breakdowns in communication, flaws in process, occasional lack of support for developing a certain feature – these are LACKS.
  • And finally, LONG FORS – identifying items that are absent from the current project.

In conducting the exercise, team members take time to write down their personal thoughts; then these are exchanged and talked through, and the group identifies common themes.

The retrospectives serve a couple of purposes. One is to identify which project outcomes were successful and which were not. A retrospective also provides an opportunity for the team to think about how it is performing as a team. How is the team communicating? Too much? Too little? Is our work as a team effective? How can we make it better?

And how is this assessment? The process is all about continuous improvement – the principle that we can always reflect on our work, on what’s working about it, and on what can be improved. And the retrospective serves a practical purpose – putting that reflection into next steps for making the team’s work better and more effective. It’s a kind of process improvement, but not just about efficiency; it’s about effectiveness.

Getting in the habit of regular self-reflection on our work – both celebrating the positive and recognizing what is challenging – leads to team building, trust and creative innovation.

Thanks, Emily!

For more about the Agile Retrospective process, check out:


Improving Temple Libraries’ System for Systematic Reviews

Stephanie Roth, Biomedical and Research Services Librarian at the Health Sciences Libraries, doesn’t always think of herself as doing “assessment.” But my conversation with her about the evolving service and education she’s doing with Systematic Reviews is a good example of how librarians at Temple are being strategic and continuously improving how we support faculty and students, even when we don’t think about it in those terms.

What is a Systematic Review, exactly?

It’s like a literature review on steroids. It’s the most comprehensive type of literature review, using a very specific protocol. The work must be transparent and reproducible. Anyone who claims to do a systematic review needs background and knowledge of the seven stages to be followed. Take a single question and try to figure out what the evidence is; then critically analyze the evidence and form a consensus at the end on the current state of evidence. The author creates their own analysis based on what is published; it’s important that the process does not introduce any personal bias.

It sounds like a really good opportunity for librarians to support faculty research. So how did you get involved in this work?

I’ve been here for almost three years. I had some gaps when I came. I’d been out of health care for a couple of years, and although I had conducted systematic reviews (from nuts to bolts), at the co-author level, the technology had changed by the time I got to Temple. For instance, there are many more options for citation managers, and Temple also has lots of journals and databases. I had had no formal training in Systematic Reviews, and there is so much to know. So I had a huge learning curve.

In your time here, what have you learned about the specific needs of Temple faculty and students?

Graduate students are often assigned systematic reviews but not given information about them. They may understand this to be an ordinary literature review. We spend a lot of time with researchers (faculty, students, hospital staff) so we understand the kind of review that they actually need for their work.

The word about how we can help is getting out, informally through word of mouth but also through guest lectures and workshops. I see these consultations as educational – I used to be a school teacher. We are trying to move from “this is a service we can do for you” to “this is what you need to ask yourself to know what approach to take.”

I’ve done a lot of work with the other HSL librarians as well. For instance, on how to know when a question is more than it first appears. A patron might arrive at the desk asking for help finding articles, but upon some probing, we learn that it’s not for a single course assignment, but is more like a systematic review. The librarian can help them to select the appropriate review type.

As these services become better known, how do you manage the additional work?

A couple of things. We have a new protocol form that faculty need to fill out before we begin working with them. In the past, faculty were requesting a full systematic search but they hadn’t done much in the way of preparation. Our request form requires them to think carefully about their question and their inclusion and exclusion criteria. Completing this work streamlines the process, and also demonstrates a level of investment on their part. Having the search “protocol” developed before the process begins is important to eliminating bias. I also ask up front for co-authorship.

We’ve also implemented team training. This was something that Barbara Kuchan (Director, Health Science Libraries) suggested, and it’s making a lot of sense. I’ve created a formal model for this process that can be replicated by other librarians. I also use the Open Science Framework to share the model more widely.

In addition to creating a more sustainable process for supporting Systematic Reviews at Temple, I wanted to model what “open access” might look like. I used to be worried that I would attract critique, or my work would be stolen, but now I think about it as a benefit to other librarians. And I think that overall, it will improve the quality of systematic reviews published by Temple authors. 

So you are taking more of a collaborative role with faculty and students for this type of research, you are developing and training a team to support the work, AND you are using an open access platform to share your work. 

It’s a great example of how we are expanding our roles as librarians. Thanks for sharing your experience.



An Interview with Gina Calzaferri: Temple’s Director of Assessment & Evaluation

Assessment at Temple University Libraries is part of a robust institutional culture of assessment, fostered by offices like Institutional Research and Assessment. I visited the IRA office last week to talk with Gina Calzaferri, recently promoted to Director of Assessment & Evaluation. This office is essential to many University efforts, from providing centralized and authoritative data for external needs, to helping drive decision-making by University administration.

Institutional Research & Assessment comprises two organizational areas, and in my job I get to work with both. Data Reporting & Analysis serves to collect centrally the statistics required by agencies like IPEDS (Integrated Post-Secondary Education Data System) as well as the many ranking publications like U.S. News & World Report. They produce the At a Glance brochure each year, a useful tool to find accurate figures on enrollment (34,349) or the high school GPA of incoming freshmen (3.54).

Gina directs Assessment & Evaluation, providing the infrastructure for program reviews and the support for the assessment of student learning and continuous improvement towards institutional effectiveness. Her commitment to this mission relates closely to her personal story. She herself was a first-generation college student from rural Pennsylvania, and her dissertation research (for an EdD in Higher Education from Penn) concerned the kinds of access issues that rural students face in earning a college degree. When the position in Assessment & Evaluation opened at Temple, it was a perfect fit for Gina. She’s been here for over four years.

Assessment & Evaluation coordinates multiple surveys each year, including the Temple New Student Questionnaire and the National Survey of Student Engagement.  The results of these surveys are freely available on the web site, and may be of interest to library staff in understanding our student body. For instance, the New Student Questionnaire asks why students choose to go to college, and why they choose Temple.

Gina chairs Temple’s Assessment Planning Committee, with representatives from all schools and colleges (as well as Student Affairs and the Libraries). This is the committee that helps the IRA office to roll out initiatives, to communicate about upcoming events and training. Members coordinate the tracking of student learning outcomes assessment within each academic program.

Of course Gina’s office is also engaged with the accreditation process. While some might assume accreditation is the driver of assessment, she disagrees.

When it comes to assessment, “The point we are trying to drive home is that [accreditation] should really be the last thing on our minds. Assessment that is ongoing and systematic is for program improvement. It is to understand what you are doing well and not doing so well. We need to be accountable to our stakeholders – students, parents, government officials – and assessment helps us to do that, whether it is assessment of student learning or institutional effectiveness as a whole.”

Gina continues. “Assessment helps us to tell our story; this is what we do very well here. It lets people know why they should come to Temple.”

Institutional Research & Assessment provides resources and support for all of us to show how planning, assessment and resource allocation are connected. We want to be able to say, “Here are the numbers that demonstrate that we need another faculty or staff member.” Assessment supports that demonstration of need.

I couldn’t have said it better myself. Thanks, Gina!



New Year’s Reflections and Resolutions

The beginning of a new year is a somewhat artificial, but useful time to reflect on the past and look toward the future. Reviewing the 2017 archives of this blog reminds me of all that has happened this year.  One big change is how assessment activity is organized –  evolving from a  rather traditional Assessment Committee to a growing Community of Practice. And it is growing; the last assessment meeting attracted 22  staff members to engage with Cynthia Schwarz and others about the recent library website user survey.

Cultivating this assessment community includes drawing upon staff here at Temple and beyond to contribute to this blog, from our own colleagues’ experiences at conferences to Swarthmore’s Mary Marissen relating her experience as an assessment librarian.

Some of my favorite posts are those that speak to broad issues: How do we develop a culture of assessment in an organization?  If for nothing but the title, I liked this one, “Grounded or Toppling Over: The 3-Legged Stool of Assessment Culture”.

But enough reflection, and on to the future. The ARL Assessment Program Visioning Task Force recommendations were just released (December 4, 2017), and have particular relevance for what we are also accomplishing locally.

“Research libraries need to define the values by which they want to be measured, rather than trying to manifest values out of the data that they have.”

Rather than holding fast to those important, though conventional, library metrics (e.g., number of volumes, number of participants in programs), we need to take the lead on developing metrics (and telling stories) that serve to better describe our value. Not just in relation to other libraries (i.e., rankings), but value to the institution and its strategic priorities.

So as I consider my own resolutions for the upcoming year: my mantra will be impact. I plan to think more, and talk more with staff and external colleagues, about how we can best use metrics to understand impact.

With the Administrative Council I’d like to explore together how we can better demonstrate impact as we report out and share our activities, and how we connect those activities to the strategic contexts.  Can we ask ourselves:

  • What library work will have the greatest impact? On student success? On faculty research productivity? On support for faculty instruction?
  • How do we prioritize the work we do to make our collections most accessible to the widest range of users (geographically, for instance) and with the most effective metadata?
  • How do we organize our own time in ways that have the most impact? That may involve pausing in our routine work to learn something new, to allow ourselves the time for brainstorming and creative thinking.

Perhaps considering impact can serve to ensure that we’re collecting the data (both quantitative and qualitative) we need to demonstrate value.

Here’s to keeping those resolutions in 2018, and continuing to appreciate all that we’ve accomplished already. Thanks for a great year. 


Learning about Our Users: The Website Use Survey

Cynthia Schwarz, project manager for the Libraries’ Web Environment Redesign Project, contributed this post. She reports on the results and next steps of the team’s user survey conducted last month, which yielded a healthy return of 460 user responses.

What did you set out to learn with this survey?

The Website Redesign and Blacklight Project Team wanted to gather feedback early on in the project about what is important to our patrons with regards to the website. The online survey contained these questions:

  • When you last visited the library website, what did you do while you were there?
  • What library resources or collections are valuable to you?
  • Please rate in order of importance: Library Search; Borrowing from another library; Library Events; Individual and group study space; Printing and Computing; Library Hours; My Library Account; Information; Blogs, News and Social Media; Contacting a Librarian.
  • What other tasks do you visit the library website to complete?

Can you tell us about your survey results? 

Library search is clearly of primary importance, as indicated by an average ranking of 4.18 on a scale of 1 to 5. In the “heat map” below, green indicates high importance to users, while red indicates lesser importance (Blogs, news and social media averaged 2.10).
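
For readers curious about the arithmetic behind these figures: each respondent rates every category on the 1–5 scale, and each category’s score is simply the mean of its ratings. A minimal sketch in Python, with invented responses and only two of the survey’s categories (the real analysis used the full survey export):

```python
# Minimal sketch of the averaging behind the rankings above.
# The responses here are invented; category names follow the survey.
from statistics import mean

# Each response maps a website feature to an importance rating (1-5).
responses = [
    {"Library Search": 5, "Blogs, News and Social Media": 2},
    {"Library Search": 4, "Blogs, News and Social Media": 3},
    {"Library Search": 4, "Blogs, News and Social Media": 1},
]

averages = {
    category: mean(r[category] for r in responses)
    for category in responses[0]
}

# Sort from most to least important -- the ordering the heat map visualizes.
for category, avg in sorted(averages.items(), key=lambda kv: -kv[1]):
    print(f"{category}: {avg:.2f}")
```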

The results demonstrate what library staff already suspected: that patrons primarily come to the website to access books, articles, journals and databases. These are all discovered primarily through the library search interface on the website’s homepage. The results also demonstrate that the computers, printers and study spaces are used and of high value to respondents. While these services are not directly related to the website, the information may be useful as we plan the physical space in the new library building. Additionally, we’ll want to make sure that information about our computing, printing and study spaces is highlighted on the new website.

Were there any challenges in terms of interpreting the results of the survey?

Well, it seemed clear that not all respondents read the questions carefully, since many cite printing and computing as very important – although they don’t do that through the website. Since most of our surveys were launched through a “pop-up” on library computers, the majority of our responses (over 55%) came from undergraduates. And we don’t know if this response pool is representative – whether undergraduates are also the primary users of our website. The results show that faculty and graduate students use the website to locate articles first, then books. Undergraduates say that books, research, databases and homework/study are most important.

What do you plan to do next?

This survey is one of several methods we’ll use to understand how our website is used – and what resources and services are most important to our community. We may use this information to highlight events and collections in a different way. We may offer different navigation paths to different user types. But as importantly,  it assures us that our work in developing a robust discovery tool is what will be most valued by our users.

Thanks, Cynthia. Great project. Thanks for sharing it with us. 



Notes from the Field: Collections and Collective Action

Last week I participated in a symposium on the changing mission of academic libraries. While not directly related to assessment or strategic planning, those practices are implicit as we explored together “the mission of academic and research libraries in the 21st century information environment.”

The event was sponsored by the New York chapter of the Association of College and Research Libraries (ACRL). The organizers suggested that the library’s “traditional” mission – stewardship and guidance in the exploration of recorded knowledge and experience valuable for higher learning – may be due for some adaptation as research, teaching, publishing and mass communication are changing.

So how are we changing? A theme throughout the day was collaboration and collective action. Libraries are rethinking the traditional model of collection development, where the research library needs to own everything, maybe in duplicate. Eloquently speaking to this shift was David Magier, Princeton’s Associate University Librarian of Collection Development. Magier’s work includes the coordination of (yes!) 55 subject specialists, all engaged with collection development (and supported by a healthy collections budget). In his talk, “Collecting, Collaborating, Facilitating: New Dynamics in the Role of Content in the Research Library’s Evolving Mission,” Magier described changes from his perspective (with a nod to Lorcan Dempsey’s notion of a facilitated collection).

For example, Princeton is collaborating with Columbia and the New York Public Library to share storage space. They started with a “condominium” model – where libraries share common processing but collections are maintained separately. This model led to lots of duplication. The libraries are re-conceptualizing the approach to create a more “collective” collection. In the emerging universe of shared print archiving, libraries commit to retaining a title, allowing other libraries to let it go. It’s not easy. The model involves MOUs (Memoranda of Understanding) about retention, as well as borrowing agreements between the institutions. But, for the end user, access to content is greatly expanded.

For those users, it does not matter how the content gets into their hands. Our job is to reduce the “friction” patrons experience in getting to the resource they discover. This friction can be more or less. To click on a title in the library catalog and be able to read it in its entirety – that’s no friction. To access distantly held materials by acquiring a travel grant to visit an archive – that’s a lot of friction. Magier thinks this balance can be expressed as a mathematical metaphor, with content being all the resources, and friction the pain of accessing that content:

Content ÷ Friction = Happy Spot

How does this relate to assessment? Or strategic planning? In order to do any of this well, the library needs to consider its core communities and their needs, both in the short and long term. The majority of attendees at this symposium were not from Princeton, but rather from city branch libraries or community colleges. Their patrons and the collection support they require are quite different. Their collections strategies look different as well.

Figuring out the best balance of friction and access is also an assessment issue. What are the expectations of our users? What is the user experience we need to provide to them? How can we make access to content as seamless as possible?

And to understand the possibilities and costs, we need to collect and analyze the data. At what point should we be subscribing to a journal rather than using interlibrary loan? At what point are we better off buying the e-book rather than paying the usage fee? How few simultaneous-user licenses can we get away with to best serve our patrons without frustrating them?

These questions require collaboration within our library organization (collections, technology, access, subject specialists, administration) as well as without (our user community and partnering institutions).  It is exciting and inspiring to hear how other libraries are taking collective action.

Sunrise over Secaucus before a day of meetings



The Pain of Numbers


I have just completed the Libraries’ submission of annual statistics to Temple University’s Office of Institutional Research & Assessment. I’ve been doing this kind of work for years, and it still pains me. This in spite of tremendous support from colleagues.

Why is it challenging? Well, this year we were in the midst of migrating data from Millennium to Alma at the end of the fiscal year (June 30, 2017). Mark Darby had to get his statistics in quickly (volume and title count, number of e-books). Extracting collections expenditures data is not too difficult, but Temple’s multiple fiscal close dates mean that we may not have final numbers until months after the actual end of the fiscal year. I count on the expertise of Christine Jones and Brian Schoolar for that.

Reference and instruction statistics are pulled from multiple sources: department annual reports, LibAnalytics, various Google sheets. From the outside it seems inefficient (and it is), but Special Collections Research Center, Research & Instruction, Blockson, Digital Scholarship Center, Ginsburg, Public Programs – they all have somewhat different needs for documenting what they do. So we accommodate that and standardize the numbers as best we can.

I use Google Analytics for statistics related to use of the library’s web site and digital collections. This is relatively straightforward, except that digital collections are spread out over several domains. These need to be aggregated.

Counting staff members seems a straightforward metric, but we need to decide who should be counted as a “professional” and who should not. Archivists, information technology staff, editors? The library has many, many roles that are outside traditional MLS-degreed librarianship. And each library counts it differently. ARL (Association of Research Libraries) has us count professionals. ACRL (Association of College and Research Libraries) has us separate professionals into two groups: librarians and non-librarians. The definition of a librarian reads, “Staff members doing work that requires professional education (the master’s degree or its equivalent) in the theoretical and scientific aspects of librarianship.” That’s kind of vague!

Circulation of physical materials should be straightforward, but ARL is only interested in books and media – no reserves, no computers. Our coding systems don’t always make it easy for us to pull those kinds of materials out of the reports. Study keys – those numbers are  important  to us locally, but not to ARL or ACRL.

The electronic resources statistics. This is a huge effort, mainly on the part of Darina Skuba and Karen Kohn – who go to each vendor site to pull statistics on e-book usage, article downloads and database searches. Then we combine two sets of files (the COUNTER standard is based on the calendar year, but we report for the fiscal year) to determine a total number.
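
The calendar-to-fiscal conversion itself is just a matter of picking the right months from each file: FY2017 is July–December of the 2016 report plus January–June of the 2017 report. Here’s a minimal sketch of that merge, with hypothetical file names and column labels (real COUNTER exports vary by vendor and version):

```python
# Minimal sketch of combining two calendar-year COUNTER files into one
# fiscal-year total. File names and column labels are hypothetical.
import csv

SECOND_HALF_2016 = ["Jul-2016", "Aug-2016", "Sep-2016",
                    "Oct-2016", "Nov-2016", "Dec-2016"]
FIRST_HALF_2017 = ["Jan-2017", "Feb-2017", "Mar-2017",
                   "Apr-2017", "May-2017", "Jun-2017"]

def month_total(path, months):
    """Sum the selected month columns across every row in one export."""
    total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Treat blank cells as zero usage.
            total += sum(int(row[m] or 0) for m in months)
    return total

# FY2017 = second half of calendar 2016 + first half of calendar 2017.
fy2017 = (month_total("downloads_cy2016.csv", SECOND_HALF_2016)
          + month_total("downloads_cy2017.csv", FIRST_HALF_2017))
print(f"FY2017 total downloads: {fy2017}")
```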

Finally, I rely on Royce Sargent (HSL)  and Carla Wale (Law), to provide me with the same sets of numbers for Health Sciences and Law.

Why do we do all of this? Several reasons: ARL and ACRL provide useful metrics for benchmarking our activity and institutional support compared to our peers. The numbers can provide important information about trends in academic libraries – what’s going up (e-resource usage, collections expenditures), what’s going down (circulation of physical materials, reference). Finally, the University relies on these numbers as they contribute to rankings, ultimately attracting the best faculty and students (and alumni support) to Temple. So I endure the pain and promise myself it will get easier next year.



Assessment Community of Practice Focus: Faculty Survey on Undergraduate Instruction

Last week’s Assessment Community of Practice meeting focused on findings from the Ithaka S+R faculty survey, particularly those related to Undergraduate Instruction. Twelve librarians participated, and we were lucky to be joined by research project team members (Rebecca Lloyd, Annie Johnson, Fred Rowland and Kristina De Voe). The full summary report is linked here: Ithaka Survey Undergraduate Instruction_Final

Finding: A high percentage of faculty (72%) show video in their classroom, and almost half (47%) assign the creation of audiovisual or digital media projects.

Multimedia is used a good deal in classrooms, and increasingly, student assignments include a multimedia component. This finding prompts some further questions:

  • How do instructors find out about media offerings at the Libraries? Are they included in regular notifications of new library materials? 
  • Is it easy for instructors to embed library-licensed video or other media into course management systems? As easy as incorporating YouTube?
  • Do we, as librarians, need to be more savvy about the kinds of functionality available for use of multimedia in the classroom? Copyright restrictions? Technical barriers? 

Students are also required to create  “multimedia” projects as part of their course work.

  • What do these “audiovisual or digital media projects” look like? Are they PowerPoint presentations or podcasts?
  • Does the library provide sufficient support (technology-equipped spaces and services) for students working on these projects?
  • Do we have the information we need to make appropriate referrals to other services on campus that do provide support, and how might we partner with them?
  • What kinds of services and support will there be in the new library building for students as they develop digital technology projects? 

Finding: Materials that are openly available on the web are of interest for course content. 79% of faculty often or occasionally “give preference to assigning course text or materials that are freely available”.

This is a finding of real interest to the Libraries as we promote Open Educational Resources.

  • How do faculty understand “freely available”? Might this mean YouTube videos, or open access scholarly journals?
  • Are faculty aware of the Library’s Textbook Affordability Project?
  • How should we encourage faculty to use librarians to help them locate quality resources for use in their courses?

Finding: A good percentage (50%) of faculty are strongly positive about the adoption of “new pedagogies or instructional approaches that take advantage of opportunities offered by digital technology.” For instructional support when introducing these new pedagogies, they rely most on their own ideas (72%), then on scholars in their personal network (52%), and then on the library (40%).

Faculty tend to rely on their own devices when learning about new technologies to incorporate into the classroom, and, in spite of technology challenges in the physical classrooms, many are interested in adopting new approaches.

  • What support is currently available to faculty interested in learning about digital technologies for use in the classroom?
  • Are there technologies available to faculty through their schools that we should be aware of?
  • How might the Libraries provide more support for faculty in need of instructional support in these areas?

These Assessment Community of Practice meetings generate many good questions, and allow us to consider how putting research into practice can be challenging. Surveys, no matter how well designed, do not always lead to clear answers or implications for library services. Did the respondent understand the question in the way that we, as survey designers, meant it? How do we best write up our findings in a compelling, yet neutral and balanced way? As the research team can attest, turning raw numbers into a meaningful and compelling story took hard work and a willingness to creatively translate findings into action steps.

