v25 #5 Analyze This

Dec 5, 2013

Usage and Your Collection — Undergraduate Student Search Strategies: Findings From A Two-Year Study

by Beth Bloom, MA MLS  (Associate Professor / Instruction Coordinator)

and Marta Deyrup, Ph.D.  (Professor / Head of Cataloging)

Column Editor:  Kathleen McEvoy  (EBSCO Information Services)  <KMcEvoy@ebsco.com>

Between 2011 and 2012 we, the recipients of a Google Faculty Research Grant, conducted research on undergraduate students’ online information-seeking behavior at our home institution, Seton Hall University.  Our idea was quite simple: we gave 42 students, all sophomores, juniors, and seniors, access to a Web-tracking product called OpenHallway and asked them to record in real time what they were doing as they worked on their class assignments.  Once they had completed their research, we sent them a link to a SurveyMonkey questionnaire that asked about their academic background and their history of, attitudes toward, and confidence in doing research.

The results of our research are strikingly different from those of similar projects, mainly because the students were allowed to describe what they were doing without any librarian intervention.  We were also fortunate that Google gave us complete autonomy to conduct the research as we saw fit and placed no restrictions on the grant.

After many hours of viewing and analyzing the student videos, we have begun to write up our findings.  The initial results were presented in the form of video snippets at the Charleston Conference and have since been published in the 2012 Charleston Conference proceedings.  To quote from the proceedings:

“Most of our students did their [library] research using Internet search engines, primarily Google.  Their online research behavior was oriented by Google’s organization and information methodology, which simply put is keyword responsive and full-text document inclusive, employing a transparent Boolean AND.  This prioritizing of keywords ostensibly supports student research but tends to discourage hierarchical thinking in the research process.”

This behavior, while successful in searching Google, is disastrous for searching library information structures such as catalogs, A-Z listings, and citation databases.  Our conclusion was that students have become so oriented toward keyword information retrieval through years of exposure to and use of the Google search engine that they were unable to understand the traditional information structures that appeared directly in front of them on the screen as they tried to navigate the library’s Website.

Our research resulted in 40 hours of video that Bloom analyzed over a period of three months.  Bloom and a research assistant coded the data in ATLAS, a qualitative data analysis tool.  The data were divided into major topics: Destination, Citations, Feelings and Sounds, Methodology, Search Strategy, Search Syntax, Source Evaluation, and Website Evaluation.  ATLAS captured many facets of the recordings, but we focused on the number of times each behavior appeared.

Destination statistics reported on students’ initial search engine choice;  e.g., the library home page, Google, or Wikipedia.  Citations measured how students saved their initial information;  e.g., bookmarks, folders, tabs, saved URLs, or PDFs.  Feelings and Sounds measured expressed satisfaction, frustration, or excitement.  Methodology covered the criteria for and pattern of Website or destination choices;  e.g., going back and forth between the library Website and Google or using previous source links to find new ones.  Search Strategy covered students’ choice and application of key terms;  e.g., typing topics that were too broad or too narrow, or changing search terms before fully exploring results.  Search Syntax reported on phrase patterns in search boxes and the possible application of advanced search techniques;  e.g., natural language or keyword phrasing, punctuation, using date limitations, or limiting to full text or peer review.  Source Evaluation described student behaviors while evaluating each result;  e.g., briefly skimming the abstract, title, or location, and verbal commentary on source relevance.  Website Evaluation covered site behaviors resulting from the perceived quality or relevance of a site;  e.g., browsing a page for the next step, clicking everywhere on a page, going through more than two pages of database or Google results, linking from one source to another, expressing confusion about how to use a page, or having no plan of action at all.
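To give a concrete sense of the tallying step, here is a minimal sketch, in Python rather than ATLAS, of the kind of frequency count we focused on; the category names follow the coding scheme above, but the sample observations and the code itself are hypothetical illustrations, not the study data.

```python
from collections import Counter

# Hypothetical sample of coded observations, one tuple per recorded behavior.
# Each tuple is (category, behavior), mirroring the coding scheme above.
observations = [
    ("Destination", "started at Google"),
    ("Destination", "started at library home page"),
    ("Search Syntax", "natural-language phrase typed into search box"),
    ("Search Syntax", "no Boolean operators used"),
    ("Source Evaluation", "skimmed abstract only"),
    ("Feelings and Sounds", "expressed frustration"),
    ("Destination", "started at Google"),
]

# Count how many times each behavior appeared within each category.
counts = Counter(observations)
for (category, behavior), n in sorted(counts.items()):
    print(f"{category}: {behavior} -> {n}")
```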

In our research, in many instances, students were vocal about their preference for Google.  They used it to access book information, news, videos, and images.  They used it to find sources, statistics, film reviews, specific articles, databases, and definitions.  They also employed such capabilities as Google Scholar and Google Docs.  Often, when they became frustrated or confused while using the library home page, they would revert to Google.  Our research indicated that although participants were not skilled at searching Google, they appeared to become much more confident once the Google screen appeared.

Similarly, students were vocal about their use of and reliance on Wikipedia.  They seemed to see it as a necessary evil.  They often justified their use of the site, commenting that they liked to start with it to get an overview of a topic.  They liked its simplicity, its quick links, its references, its ideas.  Several admitted that they will “… use its ideas,” “… use the article, even though it’s Wikipedia,” and/or “… use citations to find other sources.”  However, the students also expressed fears about Wikipedia and the validity of its contents, with comments such as “[is]… not supposed to use, but…,” “Doesn’t want to use, but…,” “Doesn’t like to use, but…,” and “Knows that s/he can’t quote from, but….”

The project revealed similarities and differences between students’ use of the library’s proprietary resources and of Google.  Regarding resource exploration, within the library’s Website the students were willing to explore resources through hyperlinks.  They also had some success once they found and used the library’s subject-specific databases.  When using Google, however, the tendency was to approach each search as a new search.  When it came to search syntax, they replicated their Google search strategy in the library databases.  Google uses AND as its search default, and the students did not seem to understand that the database algorithms functioned differently and relied on a more sophisticated use of Boolean logic that could yield more specific results.  There was no evidence of field searching in either Google or the library databases.  It is interesting to note that the students expressed less confidence in Google resources than in those found through the library databases, although they often expressed doubt that the library would contain the information they needed.
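As a rough illustration of the gap described above, the sketch below (written in Python, with toy records and queries that are hypothetical and not drawn from our data) contrasts a Google-style implicit-AND keyword match with the kind of explicit Boolean expression, synonyms ORed together, that library databases reward.

```python
# Toy "abstracts" standing in for database records (hypothetical examples).
records = [
    "adolescent depression and social media use",
    "social media marketing strategies aimed at teens",
    "depression treatment outcomes in adolescents",
]

def implicit_and(query: str, text: str) -> bool:
    """Google-style matching: every keyword must appear somewhere (implicit AND)."""
    return all(term in text for term in query.split())

def explicit_boolean(text: str) -> bool:
    """Database-style matching: (adolescent OR teen) AND depression."""
    return ("adolescent" in text or "teen" in text) and "depression" in text

keyword_query = "teen depression"   # what a student might type into Google
for text in records:
    print(text)
    print("  implicit AND    :", implicit_and(keyword_query, text))
    print("  explicit Boolean:", explicit_boolean(text))
```

In this toy example the keyword query retrieves nothing, because the records say “adolescent” rather than “teen,” while the explicit Boolean query, which ORs the synonyms together, retrieves two of the three records.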

The opposite was true, however, of the students’ confidence in their own ability to search: they were much more confident when they searched Google than when they looked for results in the library databases.

Since our research was completed, our library has redesigned its home page and migrated to EBSCO Discovery Service (EDS) as our discovery layer.  Consequently, we have launched a second project for the fall of 2013.  Four libraries that use discovery systems (most of them EDS) will repeat our initial experiment to see whether there is a difference in the way students approach library resources through a “Google-like” interface.  We will present an updated report of our findings when the project is complete.
