There are few things more important in the librarian profession today than proving to funders that the library is relevant and valuable to its community. You can prove that far better by going beyond “count” numbers – circulation count, door count, program count, etc. – and using qualitative data as your measures of success.
Last April I posted Library Strategic Planning Process Overview in which I made the claim that,
… the library director should have a good working knowledge of the Strategic Planning Process in order to steer the Library Board toward accomplishing the process and assisting in developing a useful Plan that the Director and library staff can then follow through their daily activities. The basic components of a Strategic Plan include the following….
VII. Measures and Outcomes – The performance of an Activity that can be determined over time using quantitative data, and the actual impact, benefits and/or changes resulting from performance of the Activity. [Emphasis added.]
Quantitative data, and the actual impact, benefits and/or changes effected by an Activity, are descriptors too broad to work with effectively. Let’s take a closer look at each of these elements.
Quantitative data is easy. It’s the “count” numbers – attendance at a program, the increase or decrease in ‘whatever’ count, and any other numbers associated with an Activity. Paired with some benchmark of what “success” means, they can be used as straightforward quantitative measures of “success”.
The actual impact, benefits and/or changes effected by an Activity are the total opposite of quantitative, because now you’re getting into the qualitative measure of “success” – and that becomes more subjective, and the data needed to establish that success become much harder to collect.
OK, so what is the point of restating the obvious? Of course qualitative data are much harder to collect than quantitative data.
Last August I posted Outcomes, Outcomes, Outcomes, in which I re-cited the exceptionally thought-provoking article by Scott Corwin, Elisabeth Hartley & Harry Hawkes, “The Library Rebooted” (it’s hard to overstate the value of that article). One of their seven propositions is:
6. Expand the metrics.
…it will be important … for the measurements to move beyond the strictly countable … into attitudinal areas like level of engagement and customer satisfaction. … [I]n the bigger context of changes, this resistance to [measure staff performance] should be easy to surmount. Institutions that proactively measure performance, embrace change, and look for ways to serve users will have an easier time getting financial support in an era of reduced public resources and private donations. [Emphasis added.]
Using qualitative data is a sure-fire way to demonstrate your library’s value and relevance to the community.
I do not mean anecdotal data. Forget that concept in connection with qualitative data. Anecdotal data will always be viewed as just that – anecdotal. “My son Johnny used to be inattentive and disruptive, but now he is a good listener thanks to the library story time.” Seriously? Would you expect that to persuade city council members to give the library more money instead of the police and fire departments? However, if you told the city council that “80% of parents reported that their children’s attention and listening skills had improved as a direct result of attending the library’s story time,” and “90% of parents reported that their children’s reading skills had improved as a direct result of the library’s story time,” THAT would have much more impact and prove the value of a library service.
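Turning individual survey responses into percentage statements like those takes nothing more than a tally. A minimal sketch in Python, using hypothetical yes/no story-time survey responses (the field names and data are invented for illustration):

```python
# Hypothetical survey responses from parents attending story time.
responses = (
    [{"attention_improved": True, "reading_improved": True}] * 8
    + [{"attention_improved": False, "reading_improved": True}]
    + [{"attention_improved": False, "reading_improved": False}]
)

def percent_reporting(responses, field):
    """Share of respondents answering 'yes' to a question, as a rounded percentage."""
    yes = sum(1 for r in responses if r[field])
    return round(100 * yes / len(responses))

print(percent_reporting(responses, "attention_improved"))  # -> 80
print(percent_reporting(responses, "reading_improved"))    # -> 90
```

The output is exactly the kind of statement – “80% of parents reported…” – that carries weight with a council.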
Qualitative data regarding outcomes means the degree of impact, benefits and/or changes effected by a service/program. Qualitative data are not intended to be predictive, and therefore do not require the same scientific rigor that quantitative data often require in research. In research terms, qualitative data are considered ‘historical’ data.
The only way to collect these data is to get up close and personal with your customers, constituents, partners, and stakeholders. You should be doing that already, but in order to prove your value to the community, it is essential.
Taking techniques from research, let’s look at what is termed “naturalistic” or “humanistic” research, which is more interested in the historical perspective – what happened – than in the predictive – what will happen. In Qualitative versus Quantitative Research: Key Points in a Classic Debate, author James Neill wrote about the nature of qualitative data compared to quantitative data. Qualitative data are:
• Subjective – individuals’ interpretation of events is important, therefore, uses participant observation, in-depth interviews, etc.,
• More ‘rich’, time consuming, and less able to be generalized,
• Researcher tends to become subjectively immersed in the subject matter.
While quantitative – “count” – data can tell us that your library circulated 2.5 items per capita, they cannot tell us the impact of that level of circulation. Only by interacting with the customers who actually checked out those items, and learning how the items affected their lives, can we tell the real story of our library’s value and relevance to the community.
According to Guba & Lincoln, “It is most useful in all forms of inquiry, whether research, evaluation, or policy analysis, to organize the human instrumentation into teams. … Teams have at least the following advantages:”
• Teams can accommodate multiple roles; …
• Teams can represent a variety of value perspectives; …
• Teams can represent multiple disciplines; …
• Teams can pursue multiple strategies; …
• Teams can reflect both substantive and methodological expertise; …
• Teams can be organized so as to provide for internal checks on rigor; …
• Teams can provide mutual support…
[Guba, E. & Lincoln, Y., 1981. Effective evaluation. San Francisco: Jossey-Bass]
They also explain that “The human instrument [the interviewer] operating in an indeterminate situation (not knowing what is not known) falls back on techniques such as interview, observation, unobtrusive measures, document and record analysis, and nonverbal cues.”
Interview customers whom you see frequently, as well as customers you’ve never seen before. If nothing more specific comes to mind, ask them: “What value do you find in using the library?” Collect the data and analyze it for commonalities.
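Finding commonalities in interview answers can be as simple as coding each free-text answer into one or more themes and counting them. A minimal sketch, assuming staff have already coded some hypothetical answers to the question above (the themes are invented for illustration):

```python
from collections import Counter

# Hypothetical interview answers, each coded by staff into themes.
coded_answers = [
    ["children's programs", "quiet study space"],
    ["internet access"],
    ["children's programs", "book selection"],
    ["quiet study space", "children's programs"],
]

# Tally how often each theme appears across all interviews.
theme_counts = Counter(theme for answer in coded_answers for theme in answer)

# The most frequently mentioned themes are the strongest evidence of value.
for theme, count in theme_counts.most_common(2):
    print(f"{theme}: {count}")
```

The manual coding step is where staff judgment comes in; the counting itself is mechanical.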
Observe the behavior of your customers as to what they do, who they talk to, what sections of the library they frequent, what services are most used and when. Collect the data and analyze it for commonalities.
Unobtrusively measure various segments of your customers by age, gender, ethnicity, culture, and other observable factors and their related behaviors in the library. Collect the data and analyze it for commonalities, and assess whether the data are representative of your community.
Document and record analysis includes customer survey responses, program feedback, training feedback, customer suggestions, and, most of all, circulation data. Analyzing what circulates and what does not, in the aggregate, does not intrude on customer privacy. Analyze the items that circulate in your collection for commonalities and trends.
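A sketch of that kind of circulation analysis, using hypothetical anonymized checkout records (subject and year only – no patron identifiers are needed):

```python
from collections import Counter

# Hypothetical anonymized circulation records: (subject, year checked out).
checkouts = [
    ("cooking", 2023), ("cooking", 2023), ("mystery", 2023),
    ("mystery", 2024), ("mystery", 2024), ("mystery", 2024),
    ("job search", 2024), ("job search", 2024),
]

# Commonalities: which subjects circulate most overall.
by_subject = Counter(subject for subject, _ in checkouts)

# Trends: how each subject circulates year over year.
by_subject_year = Counter(checkouts)

print(by_subject.most_common(1))               # top subject overall
print(by_subject_year[("job search", 2024)])   # a subject emerging in the latest year
```

A subject that appears only in the most recent year, like the “job search” example here, is exactly the kind of trend worth reporting to funders.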
In order to ‘prove’ the value and relevance of your library to your community, conduct naturalistic inquiry using all of your staff: get close to your customers, collect data about how they value your library, and analyze the data to determine where the library’s value lies. How can that approach fail to get results with your funders?