
Ithaka S+R Faculty Survey Response: Learning Analytics

Dec 9, 2019


A Faculty Member’s Perspective on Learning Analytics

By Kyle Jones, Assistant Professor, Indiana University–Purdue University Indianapolis (IUPUI), School of Informatics and Computing, Department of Library and Information Science

I am not unfamiliar with learning analytics. In fact, nearly my entire research agenda focuses on the information ethics and policy issues associated with learning analytics. My interests have centered primarily on student privacy and autonomy, but I have also conducted empirical research with the institutional stakeholders who adopt, use, and sometimes advocate for learning analytics in higher education. For the sake of transparency, I put myself in the camp with the rest of the skeptics and educational technology critics who see very good reason to interrogate the aims learning analytics advocates hope to achieve, the resources extracted to support those aims, and the very real possibility that learning analytics is another instrument of administrative control that displaces faculty governance.

When I read through the Ithaka S+R Faculty Survey and its results regarding learning analytics, I find it not surprising that the data signals somewhat low adoption rates—though admittedly higher than I would have thought—and a healthy degree of skepticism, especially among humanists. Learning analytics has a branding problem. It is difficult to know what is and what isn’t a form of learning analytics, and the data bears that out, with a swath of respondents unsure of the tools. Perhaps because of this branding issue, and because learning analytics is still fairly immature, it is unclear how best to implement these tools in digital and physical classrooms. This is a gap that our respective teaching and learning centers could fill.

The difference in responses across faculty disciplines is hard to parse without further data; this is an area where qualitative research could add value to these findings. I can only make assumptions about disciplinary teaching practices and pedagogical values to explain, for instance, why humanists seem worried about how learning analytics may be used to limit their instructional autonomy while medical faculty express more than ten percentage points less concern. It is also plausible that faculty for whom teaching weighs more heavily in tenure and promotion hold these worries, whereas those who focus on grant writing and research are less bothered by possible constraints on their autonomy.

But out of all the questions, it is the final one that raises the most concern in my mind. At least 75% of faculty hold either neutral or negative views of their institution’s ability to protect against a breach of student activity data, both in terms of systems and protocols. That level of confidence is astonishingly low. It should raise a number of important questions for chief information officers, chief privacy officers, and data stewards. For if faculty do not believe that their students’ data will be protected, why would they increase their support for learning analytics systems?

Finally, there is an unasked question that needs to be raised: What do faculty think about the student privacy invasions that learning analytics systems create? If higher education truly cares about creating learning environments where students feel at ease discussing and debating concepts, opinions, values, and politics, then it follows that its institutions would not develop systems that surveil student behavior and create lasting data profiles that put students at risk. Are faculty aware of how, exactly, their institution attempts to protect students? What choices do their institutions give students to opt out of these data mining and analytics systems? These questions are institution-focused, but we also cannot forget that faculty can play a part in protecting student privacy. Do faculty consider their students’ privacy when adopting these tools? What options do faculty give their students to use alternative platforms and educational technologies? The Ithaka S+R Faculty Survey begins to fill this knowledge gap, but more work is needed to answer these and other questions.

Kyle M. L. Jones (MLIS, PhD)
Assistant Professor
Indiana University–Purdue University Indianapolis (IUPUI)
School of Informatics and Computing
Department of Library and Information Science

Further reading:

Jones, K. M. L. (2019). Just because you can doesn’t mean you should: Practitioner perceptions of learning analytics ethics. portal: Libraries & the Academy, 19(3), 407–428. https://doi.org/10.1353/pla.2019.0025

Jones, K. M. L., & McCoy, C. (2018). Reconsidering data in learning analytics: Opportunities for critical research. Learning, Media and Technology, 44(1), 52–63. https://doi.org/10.1080/17439884.2018.1556216

Jones, K. M. L., & VanScoy, A. (2019, forthcoming). The syllabus as a student privacy document in an age of learning analytics. Journal of Documentation. https://doi.org/10.1108/JD-12-2018-0202
