The Cost of Library Value

The inability of libraries to quantify what they do is a continuing challenge in the face of substantial cost pressures.  As libraries move away from services that support metrics and into wholly intangible operations, we risk making it harder for funding bodies to understand what we do.

When you buy a product, you pay a price.  The assumption in that sentence is that you pay a price equal to or less than the value you think the product will give you.  No-one wants to pay more for something than it is worth to them.  This is problematic in libraries of any stripe, because the end product of library work is often intangible.  A lawyer might find a successful outcome for a client, someone might get hired after updating their resume, someone else might learn to read.  This is not a new topic, but a couple of recent articles show the tension continues.

Libraries:  Counting v. Valuing

Case in point:  Plymouth (UK) has drafted a plan to eliminate about half of its public library branches.  The remaining branches account for the vast majority of activity:  80% of visits, 75% of books loaned.  These are the fundamental metrics for any library:  foot traffic and books used.  There is no good way to measure the value of that output, but the funding authority can determine how much it cost for each visit or each book loaned.  And, more importantly, it can measure the cost of the lower-performing libraries.
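The arithmetic a funding authority runs is simple division.  A minimal sketch in Python, using made-up figures rather than Plymouth's actual budget numbers:

```python
def cost_per_transaction(annual_cost: float, transactions: int) -> float:
    """Unit cost a funding body can compute for any branch:
    budget divided by visits (or loans)."""
    return annual_cost / transactions

# Hypothetical branches, illustrative figures only:
busy = cost_per_transaction(annual_cost=400_000, transactions=120_000)
quiet = cost_per_transaction(annual_cost=150_000, transactions=9_000)
print(f"busy branch:  £{busy:.2f} per visit")   # £3.33
print(f"quiet branch: £{quiet:.2f} per visit")  # £16.67
```

The same division works for cost per book loaned, which is why lower-performing branches are so exposed: the numerator barely moves while the denominator shrinks.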

Another example.  The public libraries around Minneapolis, MN, have seen all sorts of change, but some see the changes as distracting from the core.  An unhelpful star ranking system raises questions about the work each library does.  And they do a lot, but it includes lots of things that are hard to actually measure or value.  The state agency managing libraries finds that foot traffic and reference questions are down, that computers in libraries are used less, and so on.  The library managers note that (a) it’s difficult to measure traditional libraries and (b) libraries are now providing a lot of non-traditional services.

We have created a history of benchmarking ourselves on our core activities:  managing access to information (still heavily physical collections) and comparing usage year to year.  Our funding bodies still focus on those numbers because (a) we said they’re important and (b) they are data, which is hard to get from other aspects of library work.

How to Measure Value?

Libraries have struggled with this.  I’ve seen libraries attempt to use a balanced scorecard approach to invoke commercial concepts like key performance indicators (KPI).  Libraries are businesses and we have an orientation that should be similar to businesses selling a product.  At heart, if no-one is buying what we are selling, no-one is going to pay for it either.  That’s what library cuts are; they’re a statement that the perceived value is less than the cost, rightly or wrongly.  The hundreds of closed libraries in the UK reflect that the value and cost relationship isn’t static; as money becomes more scarce, the expected value goes up, because the dollars are needed in more places.  Douglas County (OR) is closing its libraries because the taxpayers wouldn’t bear the cost any longer.

American public libraries are looking at outcome measurement and using surveys to try to quantify qualitative experiences in Project Outcome.  This looks interesting and is at least a whole lot better than the Library Value calculators that have been around for a couple of years.  The calculators remind me of legal publisher electronic subscriptions, which often have a default value attributed to a transaction:  $10 for a search, $25 for a form, $5 for a database browse, and so on.  It makes library promoters feel good but I don’t think people who use the calculators would necessarily value the activities at the same level as libraries do.

Project Outcome gets at the very difficult question that businesses struggle with all the time.  How do you measure the value someone got from the exchange?  Libraries appear to offer free transactions, and that may be an obstacle in gathering this kind of feedback.  One of the challenges is how to scale this kind of solution.  When you look at the raw data for the project [WARNING:  Microsoft Excel], most libraries are managing a couple of surveys.  There’s staff time involved in creating, running, and analysing the data.

Some Ideas

It will continue to be necessary to count transactions in libraries.  This is the meaningful measure that funding bodies can understand.  Failure to count will create battles over anecdotes and, with other financial pressures, libraries will likely lose out.  But value measures are, I think, possible if they are individualized and automated.  For example, a library might:

  • insert a screen in the self-checkout process, BEFORE the final confirmation and completion, that asks a single question:  satisfied/unsatisfied, found what you wanted/didn’t find what you wanted, or a 5-star rating.  Similarly, in libraries that are adopting self-checkin, ask for feedback about whether the book was used or useful;
  • send an automated response on receipt of an e-mail request for reference or document delivery warning the sender that the response will come with a one-question survey, then send the survey link in the response e-mail, and set up a one- or two-e-mail nag in the few days after the person received a response;
  • like Amazon, ask whether catalog results match what the person expected they’d receive, and then look at ways to increase the number of positive responses.
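The first idea above can be sketched in a few lines.  The `record_response` hook is hypothetical; in practice it would be wired into whatever self-checkout software the library runs:

```python
from collections import Counter

# Running tally of one-question checkout survey answers.
responses = Counter()

def record_response(found_what_you_wanted: bool) -> None:
    """Called from the self-checkout screen, before final confirmation."""
    responses["found" if found_what_you_wanted else "not_found"] += 1

# Simulate a handful of checkouts:
for answer in [True, True, False, True]:
    record_response(answer)

satisfaction = responses["found"] / sum(responses.values())
print(f"{satisfaction:.0%} found what they wanted")  # 75%
```

Because the question rides on a transaction that is being counted anyway, the data accumulates automatically rather than requiring staff to run a separate survey.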

All of these would work in the digital environment that libraries are struggling to measure.  And they would all generate regular data that could complement the more traditional metrics.  Depending on the library, there might be other options to surface the actual cost of the free-to-them resources:

  • at self-checkout, include a receipt that shows a zero balance but that lists the ACTUAL COST of the items (which is probably, or could be, in the catalog) so that each person sees the actual benefit they’re getting, not some averaged, generic one;
  • on a database log-in screen – or, more likely, on the EZProxy or other pass-off page before a person hits the database – include information on the dollar value of using the database.  Whether we have COUNTER or not, we can generate an average cost per user of a database and let people see that at the time of use.
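Both ideas above reduce to the same per-use division.  A sketch, with a made-up database name and hypothetical subscription figures in place of real ERM or COUNTER numbers:

```python
def cost_per_use(annual_subscription: float, uses: int) -> float:
    """Average cost of one use: subscription price divided by use count."""
    return annual_subscription / uses

def passoff_message(db_name: str, annual_subscription: float, uses: int) -> str:
    """Text for the EZProxy (or similar) pass-off page shown before
    the person reaches the database."""
    per_use = cost_per_use(annual_subscription, uses)
    return f"Your library pays about ${per_use:.2f} each time someone uses {db_name}."

def receipt_line(title: str, list_price: float) -> str:
    """Zero-balance receipt line showing the actual cost of a borrowed item."""
    return f"{title:<30} value ${list_price:.2f}   you paid $0.00"

print(passoff_message("LegalSource", annual_subscription=25_000, uses=5_000))
print(receipt_line("Bleak House", 18.95))
```

Neither figure is a measure of value, but both put the real dollar cost in front of the person at the moment of use.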

Electronic resource management tools are already common in business, corporate, and law libraries, where they attempt to tie transactions to dollars.  But there are opportunities in many libraries to be more assertive in sharing the cost of the activity with the person who doesn’t realize they’re paying for it with taxes or membership dues and who thinks that it is free.

David Whelan

I improve information access and lead information teams. My books on finding information and managing it and practicing law using cloud computing reflect my interest in information management, technology, law practice, and legal research. I've been a library director in Canada and the US, as well as directing the American Bar Association's Legal Technology Resource Center. I speak and write frequently on information, technology, law library, and law practice issues.


  1. This post hits on many of the issues we’re struggling with. The ideas you propose are interesting, but I’m not sure they’re what we really need – they focus on things we’re already measuring fairly well (circulation, web catalogue use, licensed online content use).

    I can’t speak for other libraries, but for us the challenging areas where we need to do a better job of capturing and demonstrating value are programming and interactions with staff.

    1. That wasn’t my intention; I don’t think that what libraries are measuring now provides any indicator of value. The question I’d intended to raise was whether we can insert an actual assessment of value into our activity or usage measurements. Library context will matter, but extrapolating value based on usage, or applying it uniformly, misses the reality that value differs by recipient. FWIW, I spin this one on a bit here:
