What the Users Want: Guessing vs. Knowing

At some point someone must have asked Henry Ford whether he conducted focus groups, surveys or ethnographic studies to find out what types of cars and unique features his customers wanted. I say that because the statement “If I asked my customers what they wanted, they would have asked for a faster horse” is a quote attributed to Ford that I’ve now heard used in multiple presentations and multiple blog posts. An internet search of the phrase will bring up dozens of occurrences, yet no one actually knows if Ford said this, when he said it, or in response to what sort of question. The essence of the quote is that it’s pointless to simply ask your customers what they want, because they either don’t know what they really want, or what they think they want isn’t what they would really want if something much better were offered – like a car instead of a faster horse.

Those who use the quote will often point to the success of Apple, a company that promotes the idea of figuring out what users would like to have or do that they currently can’t, and uses that approach to improve on existing technologies or create experiences where none exist. This all tends to conflict with the idea of using techniques such as surveys and focus groups to better understand user reactions to existing products and services, as well as their wants and needs. What risks do organizations take if they ask these questions and then develop services or make changes based on what they learned from users? This is particularly critical when planning new buildings or renovations. Do we add dozens of additional electrical outlets because users tell us they need them, or because we observe them sitting on the floor next to a scarce outlet? Or do we take a risk on a new technology that can power devices wirelessly because we think they’ll want that even more – even if they don’t have it now?

So what do we do? Do we make educated guesses about these things in an attempt to pleasantly surprise the user with something new and unanticipated, or do we always try to make sure we know what the users want by taking the time to ask the right questions and listen carefully? Or do we use anthropological and ethnographic methods that offer some mix of strategies? For example, if you watch the Deep Dive video you’ll see members of the IDEO shopping cart project team going out to supermarkets to talk to the people who use the carts. They learn that carts get stolen for the value of the metal, that carts can damage cars when blown by the wind, that fast shoppers leave their carts at the end of the aisle and walk to the products rather than taking the carts up and down the aisles, and that parents take multiple approaches to putting kids in the carts. Some of this information is gathered by asking shoppers what they do, while some comes from direct observation. In a previous post I pointed to the importance of learning about users by listening to and observing them; I related the story of a company that learned from observation that men used its body soap products in a very different way – quite different from what it had learned when questions were asked in focus groups.

It seems that more libraries are catching on to the use of the anthropological methods pioneered at the University of Rochester Library. I have heard of several libraries that are exploring this approach, and more will no doubt be employing it as librarians attend workshops on how to use the technique in their own libraries. Just recently the Library at California State University, Fresno issued a report on its findings, which will no doubt add to the popularity of conducting anthropological studies of library user communities. So what might we learn from this study, which examined student behavior across multiple dimensions of library use? First off, the study team involved faculty. That seems to be developing into an accepted model for conducting these studies: if your campus has anthropologists, seek them out to collaborate with you on the project. The report provides good insight into the many techniques available for better understanding student work practices, and anyone seeking to replicate this type of study will find good ideas in it. Reading the recommendations and conclusions, I find few that are particularly innovative, and some mirror what we already know about student work practices and space preferences, but the report is a reminder that creating a better user experience is not necessarily about concocting some cool new service. It’s about understanding your students and the things that give them a memorable library experience.

Can the library community benefit from more of these studies? As the authors of the Fresno study make clear in the introduction to their report, there are significant differences between their library and others, such as the University of Rochester, that have conducted work practice studies and shared the results. Given the uniqueness of each library user community, one library’s findings about its students and faculty are quite likely to differ from another’s. Similar trends may be found across different communities, such as student procrastination or the desire for technology-outfitted study rooms, but the differences in demographics, size, resources and other factors suggest that each library could benefit by delivering a unique user experience. So expect more of these studies. Each will add to our knowledge of how to design a better library.

Do Library Staff Know What The Users Want?

Perhaps the most basic premise for delivering a great library user experience is knowing what members of the user community want from the library and being able to articulate their service expectations. Then, using that knowledge, the librarian’s responsibility is to design an experience that delivers on those expectations and exceeds them when possible. If successful, we should be able to create a loyal base of community members who will support the library, use it repeatedly, and recommend that their friends do so as well.

Much depends on our ability to identify and develop services that meet user expectations. But how well do we know what those expectations are? According to a recent research article, not well enough. The article’s findings should be a cause of concern for librarians hoping to design a better experience for their users. The bottom line: the priorities of library staff and library users are poorly aligned. This is based on a study of Association of Research Libraries (ARL) member libraries that participated in the 2006 LibQUAL+ library quality survey. The authors, Damon Jaggars, Shanna Smith Jaggars and Jocelyn Duffy, in their article titled “Comparing Service Priorities Between Staff and Users in ARL Member Libraries,” found that a disconnect existed between library staff and their users. [See portal: Libraries and the Academy, Vol. 9, No. 4, 2009, pp. 441-452.] For library staff, the highest priority was “affect of service”, but for all user groups (undergraduates, graduate students and faculty) the highest priority was “information control”.

For those less familiar with LibQUAL+, “affect of service” relates to service interactions between library staff and users; survey participants are asked whether library employees instill confidence, give individual attention, understand user needs and have the knowledge to answer questions. “Information control” refers to the materials and collections the library makes available to its users; respondents are asked about their access to printed and electronic materials, navigation of the library website and ease-of-use factors associated with finding information provided by the library. We may have a serious problem when what library staff think is most important is not what users think is most important. If I think that good food is the most important component of a dining-out experience, but the restaurant staff have as their highest priority something entirely different, such as comfortable seating, that may spell disaster for the quality of the overall experience.

But the more I thought about the findings, the less alarmed I was by them than the article’s authors seemed to be. While this disconnect does exist, the good news from my perspective is that the staff of the ARL libraries included in the study believe that providing high-quality service is a priority. Even if that was not the top priority for the users they serve, my expectation is that those ARL libraries where staff see affect of service as the highest priority are well positioned to deliver good service. And while we can acknowledge that faculty, graduate students and undergraduates may care less about the affect of service and more about the content, that should not diminish our desire to create a better user experience for them. I would encourage those who read the article to take from it an understanding that ARL libraries must always deliver high-quality content for researchers, but that a priority is to build the kind of relationships with the user community that will encourage its members to see that the academic library is more than books, articles and media. The irony is that it is the library’s people who acquire and make accessible the very content that is the users’ priority. Now how do we get the users to feel the same way about the people?