v25 #3 Biz of Acq

Jul 10, 2013

Gathering Data:  How Two USMAI Libraries are Using eBook Statistics

by Randall Lowe  (Collection Development, Acquisitions & Serials Librarian, Lewis J. Ort Library, Frostburg State University, 1 Stadium Dr., Frostburg, MD, 21532;  Phone: 301-687-4313;  Fax: 301-687-7069)

and Lynda Aldana  (Head of Technical Services, Albin O. Kuhn Library, University of Maryland Baltimore County, 1000 Hilltop Circle, Baltimore, MD 21250;  Phone: 410-455-3468;  Fax: 410-455-1598)

and by Column Editor:  Michelle Flinchbaugh  (Acquisitions Librarian, Albin O. Kuhn Library & Gallery, University of Maryland Baltimore County, 1000 Hilltop Circle, Baltimore, MD 21250;  Phone: 410-455-6754)

Background

It is probably safe to say that most librarians now consider ownership of and access to eBooks a required, if not essential, part of their collections, and developing eBook collections a required, if not essential, part of their collection development plans.  eBooks help librarians meet the information-seeking needs of users wherever they are:  in campus dorms, in off-campus apartments, studying abroad, etc.

Just as with print collections, libraries offer a variety of eBooks using a variety of purchasing models and working with multiple vendors.  The methods of acquisition vary as well:  eBooks can be “acquired” as single-user or multi-user purchases, rented via short-term loans, or leased.  In other words, not all eBooks, eBook packages, or eBook vendors are created equal, and as a result, gathering statistical data that covers all aspects of eBook use can present challenges.

Frostburg State University and the University of Maryland, Baltimore County (UMBC) are both part of the University System of Maryland and Affiliated Institutions (USMAI) library consortium.  The schools that are part of USMAI have, as a part of their collective eBook holdings, a large legacy eBook collection.  Individually, several USMAI schools also have their own eBook collections as well as demand-driven acquisitions programs.  Additionally, USMAI is working to implement a demand-driven acquisitions (DDA) pilot with EBL at the consortial level, which will provide access to a group of eBooks for all users within the consortium regardless of campus affiliation.

Just as the eBooks come in many different “packages,” so do the statistics.  Initially, it can appear to be a case of comparing apples to oranges, and it is not an undertaking for the faint of heart.  Eventually, for Frostburg and UMBC, not only will we be gathering and analyzing statistics at the local level, but we will also be analyzing statistics for our consortial DDA titles.

However, before we examine what statistics we are gathering, we need to be mindful of why we are interested in this data.  The reasons to gather and analyze eBook statistics are just as varied as the platforms, packages, and vendors.

Why Gather Statistics

Just as with our print collections, we use statistics to:

•    Inform collection management decisions for new eBook purchases.

•    Ascertain the effectiveness and appropriateness of particular purchasing methods (e.g., DDA vs. acquiring subject collections).

•    Justify purchasing eBooks using materials/book budgets or funds that have traditionally been used to purchase print materials.  Comparing use of print books and eBook titles can be useful.

•    Evaluate eBook purchases or selections to see if these continue to meet ongoing research and information needs of the faculty, staff, and students.  Perhaps a program has changed or a course of study has been added.  Are the items being acquired being used?  Are these resources effectively meeting the needs of the institution?

•    Share with university administrations or funding agencies how valuable an eBook program is and how it aligns with institutional priorities and informational needs.

Statistics, when presented with clear definitions of what is being measured, are useful.  However, those collecting and analyzing the data need to understand what is available and what the advantages and limitations might be for all of the sources of eBook statistics.

Definitions/Concepts

One source of eBook statistics for libraries is the group of reports provided by COUNTER (Counting Online Usage of Networked Electronic Resources).  COUNTER, the initiative to set standards for recording and reporting online usage statistics, provides a way for librarians to use statistics from vendors who adhere to this standard (http://www.projectcounter.org).  COUNTER was originally released in 2002.  Release 1 of the COUNTER Code of Practice for Books and Reference Works was published in March 2006, and Release 4 of the COUNTER Code of Practice for e-Resources was published in April 2012.  More and more vendors are providing COUNTER-compliant statistics.  COUNTER provides several reports, each with a particular focus, as can be seen from the work being done at UMBC and Frostburg.

A key part of becoming comfortable with using the COUNTER statistics is becoming familiar with the terminology.  For those who have worked with e-resources and COUNTER this is not an issue, but for those new to eBooks and this initiative there may be a learning curve.

Each release of COUNTER includes a glossary of relevant terms.  Some of the commonly used terms are:

Search — a specific intellectual query, typically equated to submitting the search form of the online service to the server.

Section — a subdivision of a book or reference work.

Session — a successful request of an online service.  It is one cycle of user activities that typically starts when a user connects to the service or database and ends by terminating activity that is either explicit (exiting or logging out) or implicit (timeout due to user inactivity).

Successful request — for Webserver logs, successful requests are those with specific return codes.

Turnaway (rejected session) — defined as an unsuccessful login to an electronic service due to exceeding the simultaneous user limit allowed by the license.

While it is easy to find and study the COUNTER-compliant definitions, the statistics that come directly from vendors can be more complicated.  In some cases, it has been harder to ascertain how terms are defined and to determine exactly what is being measured.  While some vendors are able to provide very granular data about how an item was used, this data is not as standardized as the COUNTER reports.  In other instances, vendor-supplied statistics can use the COUNTER terms differently.

One such example is the term “turnaway.”  As indicated above, a turnaway or rejected session, according to the COUNTER definition, consistently indicates “an unsuccessful login to an electronic service due to exceeding the simultaneous user limit allowed by the license.”  But at least one vendor uses “turnaways” to mean that users attempted to access a resource the institution is not licensed for.  In other words, users were turned away not because the limit of allowed users was exceeded but because the library was not paying for or subscribing to that title.  This particular vendor uses the turnaway data to show which additional resources a library might want to purchase.  It is important to keep the difference between these uses of the term in mind when having institutional-level conversations.

Another question related to data gathering that we have struggled with is how to draw the line between significant use (i.e., the book is checked out) and merely cursory use (i.e., the book is pulled from the shelf, browsed, and left to be reshelved).  For a DDA collection, the point at which a purchase or short-term loan is triggered may provide that “line.”  For purchased titles, it might not be that easy.  Another question that statistics may help answer is how a library can effectively compare uses of eBooks with circulations of print materials.

While there is a need for advocacy to press vendors on the data that is provided for statistical analysis, we can begin to use data that is available to inform our eBook discussions and decisions.  In the sections that follow, UMBC and Frostburg present how each school is working with COUNTER statistics as well as vendor-supplied statistics, the advantages and limitations of each, and the importance of gathering this information at each institution.

eBook Usage Statistics at UMBC

UMBC gathers COUNTER eBook usage statistics and attempts to standardize them in order to determine the relative value of eBook purchases across vendors, packages, and purchase methods in terms of price per use.  At UMBC, eBooks are purchased from seven vendors via a variety of methods including one-shot purchase, one-shot purchase with ongoing platform fees, demand-driven acquisitions, and subscription.  Six of the seven vendors have at least one COUNTER report available.  All are using COUNTER Books and Reference Works, Release 1, usage statistics, the 2006 release.  One vendor provides ICOLC statistics but is currently converting to COUNTER.  All of the vendor reports can be run in Excel or an Excel-compatible format.

UMBC maintains a separate spreadsheet for each vendor containing that vendor’s statistics.  All of the vendor COUNTER reports, or alternate vendor reports, are placed in individual worksheets in that vendor’s spreadsheet and labeled by year and type of report.  The first worksheet in each vendor’s spreadsheet contains general information, the number of volumes owned, totals for each report for the year, cost, and calculations of cost per volume, per request, per search, and per session (Figure 1).  Where no acceptable COUNTER report exists, we substitute comparable vendor reports where possible.  The spreadsheet is stored on a shared network drive, making all of the statistics available to library selectors and collection management librarians.

Figure 1

Each vendor spreadsheet contains a summary worksheet (Figure 1).  The per-use statistics are drawn from the vendor COUNTER report, or an alternate report, unless there is no source for a given statistic, in which case “n/a” is recorded in its place.  Annual numbers of section requests, turnaways, searches, and sessions are linked to the appropriate cell in the appropriate worksheet for the year and report.  If the vendor has a comparable non-COUNTER-compliant report, those statistics are added in additional, appropriately labeled columns.
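For libraries that would rather script this assembly than link cells by hand, the same summary row could be built programmatically.  The Python sketch below is only illustrative and assumes each report has already been exported and read into a pandas DataFrame; the report parameters and column names are hypothetical, not UMBC’s actual labels.

import pandas as pd

def annual_total(report, column):
    # Total a statistic for the year, or record "n/a" when no report supplies it.
    if report is None or column not in report.columns:
        return "n/a"
    return report[column].sum()

def vendor_summary(year, br2=None, br3=None, br6=None):
    # Build one summary row of the kind shown in Figure 1 (column names assumed).
    return {
        "Year": year,
        "Section Requests": annual_total(br2, "Requests"),
        "Turnaways": annual_total(br3, "Turnaways"),
        "Searches": annual_total(br6, "Searches"),
        "Sessions": annual_total(br6, "Sessions"),
    }

# Example with only a turnaway report available; the other statistics become "n/a".
br3 = pd.DataFrame({"Title": ["Title A", "Title B"], "Turnaways": [2, 5]})
print(vendor_summary(2012, br3=br3))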

Cost is input in the summary and used to calculate a variety of cost-per-use numbers: cost-per-title, cost-per-request, cost-per-search, and cost-per-session.  Some costs are partial estimates due to difficulties in locating information on how much was paid for eBooks purchased long ago.  When a cost was estimated, this was noted in the summary spreadsheet to ensure that everyone who uses the data is aware that it is not exact.

Because we want to end up with comparable statistics, we make costs under a variety of purchase models comparable by using an average of the payments to calculate the cost-per-use statistics.  In the case shown in Figure 2, where we paid a one-time fee of $4,006.95 in the first year and annual platform fees thereafter, we add all of the money paid to the vendor, divide by the number of years, and arrive at a cost-per-year of $1,245.49, which is then used to calculate all of the cost-per-use statistics.

Figure 2
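The averaging itself is simple arithmetic and could be scripted along the following lines.  This is only a sketch: the platform-fee amounts and the usage figure are placeholders, not the numbers behind Figure 2.

def average_annual_cost(payments):
    # Average everything paid to the vendor over the number of years paid.
    return sum(payments) / len(payments)

def cost_per_use(annual_cost, uses):
    # Return a cost-per-use figure, or None when no uses were recorded.
    return annual_cost / uses if uses else None

# One-time purchase fee followed by hypothetical annual platform fees.
payments = [4006.95, 250.00, 250.00, 250.00]
annual_cost = average_annual_cost(payments)       # 1189.24 with these placeholder fees

print(round(annual_cost, 2))                      # average cost-per-year
print(round(cost_per_use(annual_cost, 1200), 2))  # e.g., cost per section request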

The statistics ubiquitously available and comparable across all of the vendors were cost-per-title and cost-per-request.  That data was pulled into a summary spreadsheet for a side-by-side comparison of cost-per-title and cost-per-request among eBook vendors (see Figure 3).

Figure 3
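A comparison table of this kind is straightforward to assemble once each vendor’s summary has been reduced to those two figures.  The vendor names and amounts below are invented placeholders used only to show the shape of the comparison, not figures from Figure 3.

import pandas as pd

summaries = {
    "Vendor A": {"cost_per_title": 12.40, "cost_per_request": 1.85},
    "Vendor B": {"cost_per_title": 48.10, "cost_per_request": 0.92},
    "Vendor C": {"cost_per_title": 30.25, "cost_per_request": 3.10},
}

# One row per vendor, sorted so the lowest cost-per-request appears first.
comparison = pd.DataFrame.from_dict(summaries, orient="index")
print(comparison.sort_values("cost_per_request"))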

eBook Usage Statistics at Frostburg State University

Frostburg State University collects and analyzes eBook statistics in order to assess the effectiveness of these resources as demonstrated by student and faculty use as well as to inform future collection development decision-making.  Reliable, accurate, organized, and useful COUNTER-compliant usage statistics for eBooks are vital to libraries practicing sound fiscal management of their materials budgets while simultaneously attempting to meet institutional priorities and the needs of their constituents.  This is especially true at Frostburg where the Lewis J. Ort Library has served approximately 5,000 students and over 300 full- and part-time faculty while only expending an average of $35,421 on monographs annually since 2005.

eBooks were introduced to the Frostburg campus community in the early 2000s via a NetLibrary collection shared by the member libraries of USMAI.  Frostburg made a further commitment to incorporate eBooks as a regular component of its collection in 2011, after completing a migration of most of its periodicals collection to electronic format from 2008-2012.  Frostburg also introduced online MBA and Nursing programs, which, in addition to a growing number of online courses across the curriculum, increased the urgency to make more eBooks available to students and faculty.

Frostburg could not add many eBooks within the framework of its existing monograph budget, and after the careful migration of other parts of the collection to electronic format, no additional funds to support an adequate number of titles could be freed by shifting priorities within the materials budget, at least not without making painful cancellations.  There also was uncertainty concerning student and faculty adoption of an expanded eBook collection.  As a result, the library applied for and received a $45,000 grant from Frostburg’s Student Technology Funds in fiscal year 2012 (July 2011-June 2012) to pilot an expansion of the number of eBooks available to students and faculty; a second $45,000 was granted for fiscal year 2013.  Since Student Technology Funds cannot be used for ongoing subscriptions, the library decided to combine one-shot eBook purchases with a pilot Demand-Driven Acquisition (DDA) program.  Following extensive vendor comparison research, Frostburg decided to both purchase eBooks and establish a DDA account with EBSCO.  Since Frostburg could not absorb platform fees and the eBook titles from the previous USMAI NetLibrary shared collection were now hosted by EBSCO, it was determined that the library could best utilize the granted funds with a single vendor, thus being able to measure and assess the use of both new and legacy collections of eBooks while simultaneously evaluating the effectiveness of two different acquisitions models.

Given the uncertainty of future technology fund grants and the limitations of the library’s current operating budget, Frostburg needs to collect eBook statistics not only to assess general use and determine whether students and faculty are adopting the format, but also to determine which collections (shared legacy vs. new titles) and which acquisition models are most effective.  As a relatively small library with limited fiscal resources, we will not have many options available to us after the technology fund grants are expended.  The extent to which we commit to eBooks in the future and the acquisition models we pursue will be informed by the data we collect.

Frostburg measures the general use of its eBooks by employing COUNTER-compliant statistical reports made available by EBSCO.  This is a straightforward reporting process as Frostburg is only utilizing one vendor; standardization of multiple reports in the manner described by UMBC will be implemented as eBooks are obtained from additional vendors.  The number of title requests is the primary metric used to determine and demonstrate use of the collection.  FSU uses COUNTER Book Report 1 (Number of Successful Title Requests by Month and Title) to obtain this information, although monthly statistics are kept on a spreadsheet based on the fiscal year (July-June), which means that the vendor-supplied report cannot be used as produced since COUNTER statistics are based on a calendar year.  Statistics are also kept for the number of searches of the eBook collection using COUNTER Book Report 6 (Total Searches and Sessions by Month and Service), and turnaways are tracked using Book Report 3 (Turnaways by Month and Title) to identify titles for which the purchase of additional simultaneous users may be considered.
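Reassembling July-June totals from two calendar-year Book Report 1 files is mostly a matter of picking the right month columns.  A minimal sketch follows, assuming each report has been exported with one row per title and month columns labeled like “Jul-2011”; the column labels are an assumption about the export, not EBSCO’s exact format.

import pandas as pd

def fiscal_year_requests(calendar_reports, fy_end):
    # Sum monthly title requests for the July-June fiscal year ending in fy_end.
    # calendar_reports maps calendar year -> DataFrame indexed by title,
    # with month columns labeled like "Jul-2011".
    months = ([f"{m}-{fy_end - 1}" for m in ["Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]] +
              [f"{m}-{fy_end}" for m in ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]])
    totals = pd.Series(dtype="float64")
    for report in calendar_reports.values():
        columns = [c for c in months if c in report.columns]
        if columns:
            totals = totals.add(report[columns].sum(axis=1), fill_value=0)
    return totals.sort_values(ascending=False)

# Example: combine the 2011 and 2012 calendar-year reports into FY 2012 totals.
# fy2012 = fiscal_year_requests({2011: br1_2011, 2012: br1_2012}, fy_end=2012)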

Like many libraries, Frostburg considers cost-per-use data to be of the utmost importance in setting budget priorities as well as making collection development decisions.  Since neither COUNTER nor unique vendor reports from EBSCO include cost data, annual eBook expense information with the vendor is applied to the use statistics spreadsheet to obtain cost-per-use data for the year.  As described above, Frostburg’s most pressing need is to measure the use of its eBooks by acquisition type (one-shot purchase vs. DDA) in order to determine which model is more effective and thus most likely to be pursued after special eBook funding is expended.

While collecting and reporting use and cost-per-use information from one vendor is not particularly challenging, obtaining the desired information requires assembling the data from disparate reports from two different vendor administrative tools.  Frostburg uses COUNTER Book Report 1 use statistics combined with EBSCO’s “My Owned Titles” report in its EBSCOhost Collection Manager (ECM) system to obtain the desired information.  The latter report, in addition to title, author and imprint, includes the publication date, ISBN, e-ISBN, subject headings, method of acquisition, simultaneous user (check out) limit, and quantity (number of copies held) information.  Figure 4 illustrates the report as downloaded to a spreadsheet and edited by Frostburg to delineate titles by acquisition model type.  Titles from the USMAI legacy shared collection have a TRUE designation in the Shared column with a FALSE label in the DDA-Triggered and Owned columns.  One-shot purchases by Frostburg are labeled TRUE in the Owned column and FALSE in the DDA-Triggered and Shared columns.  Those titles triggered for purchase by Frostburg users have a TRUE designation in both the DDA-Triggered and Owned columns.

Figure 4: Contract Publisher, ISBN, eISBN, BISAC/Library of Congress Subject Heading, Library of Congress Call Number, and Format information is included in the full EBSCO report

COUNTER Book Report 1 is the smaller of the two reports since it only includes titles actually used.  Therefore, acquisition type information from the ECM owned titles report is manually added to the COUNTER report in order to obtain the information we desire.  The final combined report (Figure 5) includes eBook title, publisher, retrievals, publication year, and acquisition type information (DDA, DDA not purchased, purchased, shared).  An acquisition status of “Purchased” is a one-shot purchase.

Figure 5

An interesting issue arose when titles appeared on the COUNTER report but not on the ECM report.  Upon examination of some of these eBooks, it was found that the titles were available to users through the DDA program but had not yet been triggered for purchase.  As a result, use for all such eBooks was credited to the DDA acquisition type (listed as “DDA Not Purchased” in Figure 5), although having a designation in the vendor-supplied reports for eBooks with this status would be helpful in verifying this assumption.
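For libraries that want to script the merge instead of assembling it by hand, the logic described above can be expressed in a few lines.  The sketch below is a rough illustration only: the file names and the column headers (“Title,” “Retrievals,” “Shared,” “DDA-Triggered,” “Owned”) follow the description in this article, but the exact export layout of the COUNTER and ECM reports is an assumption.

import pandas as pd

def flag(row, column):
    # Interpret a TRUE/FALSE column that may arrive as a bool or as text.
    value = row.get(column)
    if pd.isna(value):
        return None
    return value is True or str(value).strip().upper() == "TRUE"

def acquisition_type(row):
    # Classify a title using the flags from the ECM "My Owned Titles" report.
    if flag(row, "Owned") is None:       # on the COUNTER report but not in ECM
        return "DDA Not Purchased"        # available via DDA, not yet triggered
    if flag(row, "Shared"):
        return "Shared"                   # USMAI legacy shared collection
    if flag(row, "DDA-Triggered"):
        return "DDA"                      # purchase triggered by Frostburg users
    return "Purchased"                    # one-shot purchase

br1 = pd.read_csv("counter_br1.csv")        # title-level use ("Retrievals")
ecm = pd.read_csv("ecm_owned_titles.csv")   # Shared / DDA-Triggered / Owned flags

combined = br1.merge(ecm, on="Title", how="left")
combined["Acquisition Type"] = combined.apply(acquisition_type, axis=1)
print(combined.groupby("Acquisition Type")["Retrievals"].sum())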

This manually assembled report is valuable for comparing use of eBooks by acquisition type.  As Figure 5 suggests, there has not been significant use of the eBooks that comprise the legacy USMAI shared collection.  Since only 549 titles were used in 2012, combining use and acquisition type data was possible at this early stage of eBook purchasing for Frostburg, but is not a sustainable model for collecting data we need, especially as the eBook collection gets larger.  In fact, Frostburg is no longer producing this manual report on a regular basis, but we plan to compile annual data as needed for budgeting purposes, especially in preparation for a time when special eBook funding may no longer be available.  This is a clear case of the need libraries have for vendors to use COUNTER-compliant data in providing more robust, user-defined reports.

Strengths and Liabilities of COUNTER Statistics

COUNTER statistics are invaluable because they provide consistent and reliable usage statistics across vendors, not only by providing standards but also by auditing vendors for compliance.  While COUNTER serves an invaluable purpose, some bothersome issues impact the consistency and reliability of COUNTER reports.  But the COUNTER standards are a work in progress, and the Code of Practice for Books and Reference Works is currently targeted for revision, so improvement can be expected.

Because COUNTER provides standards for usage statistics but does not govern how files are stored and served, variations in file storage and delivery across vendors make the data incomparable.  The COUNTER Book Report 2 is based on the sections of a title that users have requested, and there is no standard definition of a section beyond how a given vendor has split a given title into multiple files.  Multiple uses of a single title may in fact be a single user using the title one time but navigating to different parts of the book that happen to be stored in different files.  For this reason, section-level usage data cannot be meaningfully compared across vendors.

A lack of specificity in the COUNTER standard allows for variation in the reports across vendors.  COUNTER reports may include all titles that the vendor offers, all that the library owns, or only those with uses.  In the best-case scenario the vendor provides options for this, but ideally the coverage would always be the same and would include all titles owned, so that the reports also provide additional information about the library’s holdings.  Additionally, COUNTER has not specified whether to report at the set or the individual title level, again creating variation, inhibiting the ability to match numbers against other data sets, and inflating counts of titles owned.  If a five-volume set is only sold as a set and does not have a distinctive title for each volume, reporting should not be at the volume level, yet some vendors report on each volume even though it is inextricably bound to the remainder of the set.

But COUNTER’s greatest liabilities are in limiting its scope to whether a use occurred, with no information on the characteristics of that use, and in excluding all statistical information not related to use, regardless of how valuable it might be to libraries.  No COUNTER report provides information on the characteristics of a use beyond the fact that a given return code was recorded in the Web-server log, so accidental clicks, five-minute stays resulting in the determination that a resource was inappropriate, and cover-to-cover reading are all reported as the same thing.  No COUNTER report includes cost-per-use, the information considered the gold standard in assessing the value of an electronic resource, nor the price or cost data for calculating it, leaving libraries to find and manipulate data to get something that could be readily included in the reports.  Vendor and subject information are also not included in the COUNTER standards and would provide useful additional information to libraries.  Finally, by expanding the statistics collected to include information on the quality of a given eBook or eBook collection, measured via a survey instrument administered to users, we might get a sense of the true value of eBooks and eBook collections.  COUNTER might partner with the ARL DigiQUAL Project, or a similar initiative, to develop a survey instrument for doing this and for reporting actual statistics on the user-perceived value of electronic resources.

While there are clearly advantages to the limited scope of the COUNTER initiative, in terms of the ease of reaching agreement across parties and the simplicity of standards both for implementation and use, there is much that could be gained not only in refining that scope but also in expanding it.  Requiring subject and publisher information would allow for sorting and searching to gain valuable information on important and meaningful subsets of the library collection.  Most importantly, COUNTER standards would be far more meaningful if the reports included information on the actual quality of the resources, whether measured by the characteristics of uses or by a survey instrument.

Getting the Usage Data We Need

Improvements in COUNTER standards and reports that address the limitations described above are vital to providing reliable, accurate, organized, and useful eBook statistics.  This is especially true since this standard is typically recognized as the best method by which usage statistics can be standardized and compared across vendor collections.  However, in addition to providing COUNTER reports, vendors can do more with this data in order to provide more robust, user-defined reports.

Vendors have information pertaining to their eBook collections that, when combined with the usage statistics provided in COUNTER reports, would be extremely useful to libraries.  If vendors would make such reports available via their administration modules, this would greatly reduce the amount of data manipulation required by librarians for what is largely considered standard in eBook usage statistics collection.  In addition to providing a gold-standard cost-per-use report, use by publisher, subject, and acquisition type reports would also be invaluable.  Usage statistics by publisher and subject are extremely useful in helping to make collection development decisions, including creating and updating DDA eBook profiles.  Usage statistics reports by acquisition type (one-shot purchases, subscription, DDA purchases, DDA short-term loans, etc.) would allow libraries to analyze the effectiveness of each model and to make informed decisions to provide users with as many eBooks as possible while practicing sound fiscal management.  Ideally, such reports would be user-defined, with librarians having the capability to select attributes to combine with COUNTER usage data.

In order to make meaningful progress toward obtaining reliable, accurate, organized, and useful eBook statistics, vendors should not only engage librarians to determine what information they need but also actively seek their input in developing usage reports; this includes involving librarians in usability testing.  Similarly, librarians must be willing to engage in this process not only with vendors, but also in making improvements to COUNTER and other standards.
