v32#2 Love and Other Data Assets

May 11, 2020

by Jesse Stommel  (Digital Learning Fellow and Senior Lecturer of Digital Studies, University of Mary Washington)    Twitter @Jessifer

“Love first, design later.” – Maha Bali

For the last several years, I’ve led pedagogical development seminars at the University of Mary Washington (UMW) focused on topics like “higher education pedagogy” and “digital knowledge.”  The groups have included staff, librarians, full-time and adjunct faculty from a wide array of disciplines, and students.  We’ve discussed practical issues arising from our own teaching, discussed readings, met with special guests, and done microteaching (offering feedback on short lessons taught by participants). 

I’ve guided these discussions and chosen many of the readings, but the topics have arisen organically, beginning with a group-brainstorm at our first session.  In any year, what we’ve done was more important than what we set out to do, and we’ve reflected regularly on how we did it and why.  Sometimes, there has been a clear path from one week’s topics to the next, but more often, the juxtapositions have been haphazard, as much about generating friction as about finding common theses.

There is very little time, support, or funding in higher education for this kind of reflective consideration of our work.  Teaching is quite often something we do alone with very little direct preparation.  Conversations between and among faculty, librarians, instructional designers, and technologists are much rarer than they should be.  The faculty/staff divide at many institutions limits our ability to talk across structural barriers.  And contingent, adjunct, or precarious educators are too often left out of these conversations altogether.  Each year I’ve facilitated the seminar at UMW, I’ve faced barriers to including (and compensating) adjuncts, librarians, and staff.  Spending faculty development funds on non-faculty, even non-faculty with teaching assignments, has been discouraged by administration.  In fact, every year, funding has been eliminated altogether for this project, and I’ve had to advocate (with the support of previous participants) for it to be reinstated.  Unsurprisingly, I’ve recently discovered that funding for the seminar has been eliminated again for the 2020/21 year.  At the same time, the annual bill for our LMS contract has gone up.  This kind of decision-making sends a message that the administration wants technical support, not deep inquiry into the complicated work of teaching with technology.

Educational technology is strangely situated at many institutions (usually somewhere vaguely between academics and IT), which further frustrates necessary conversations across the teaching/technology divide.  And, quite often, for-profit EdTech companies take advantage of this situation through predatory marketing tactics — pitching their tools to the most powerful, least knowledgeable folks at an institution.  The majority of EdTech is driven by the bureaucratic traditions of education more than the pedagogical ones. 

In “Teaching as Possibility: A Light in Dark Times,” Maxine Greene writes, “It is obvious enough that arguments for the values and possibilities of teaching acts (no matter how enlightened) within the presently existing system cannot be expressed through poetry.”  What Greene describes is a conundrum.  For her, the space of the imagination, the habitus of poetry, is necessary to the work of education.  But how do we reconcile the philosophies of John Dewey with the fact of the learning management system?  How does the work of Maria Montessori sit without combusting alongside the increasingly aggressive marketing of remote and algorithmic proctoring tools?  How do bell hooks’ words about self-actualization in the classroom not wither in a world of keystroke monitoring and plagiarism-detection software?  And how can we honestly approach Virginia Woolf’s A Room of One’s Own with students if we’re complicit in the monetization of their educational data by for-profit companies?

These crises aren’t existential, nor are my examples purely hypothetical.  The technological tools we’ve widely adopted for education are increasingly out of step with what we say education is for.  There’s a serious problem in education if we assume dishonesty on the part of students while failing to acknowledge that for-profit tech companies like Turnitin or ProctorU might care about their bottom lines more than they care about students. 

During a 2019 investor conference, the CEO of Instructure (maker of the Canvas learning management system) bragged about their “second growth initiative focused on analytics, data science, and artificial intelligence,” saying:  “We have the most comprehensive database on the educational experience in the globe.  […] No one else has those data assets at their fingertips to be able to develop those algorithms and predictive models.”  What concerns me are two specific words that fall off the CEO’s tongue so easily:  “data assets.”  Teachers and students have long been called “users” and “customers” by educational technology companies, and this has made me uncomfortable enough, but reducing us and our work to “data assets” takes this a step further, exposing the role that for-profit companies and technologies play in the increasing precarity of education and educational labor.  It’s not a coincidence that more than 70% of university teachers are working off the tenure-track and nearly 1 in 2 students in the U.S. is food insecure, even as Turnitin claims their product is used by more than 30 million students at 15,000 institutions in 150 countries and the global learning management system market is expected to reach $28.1 billion by 2025 (Wood, 2019).

As these bureaucratic technological systems become more ubiquitous, educators increasingly accept them as inevitable instead of furiously raising our collective eyebrows when institutions invest more and more in machines (and algorithms) and less and less in teachers (and the work of teaching).  To me, the biggest issues arise when we adopt tools across an entire educational institution, discipline, or course and give teachers and students no choice but to use them.  A student’s degree or grade shouldn’t rest on whether they are willing to sacrifice their privacy or give their data to a for-profit corporation.  There’s a common end game for tech companies, especially ones that traffic in human data: create a large base of users, collect their data, monetize that data in ways that help assess its value, and then leverage that valuation in an acquisition deal.  An educational institution should be skeptical of those companies, not suspicious of its own students.

In “Digital Sanctuary: Protection and Refuge on the Web?” Amy Collier writes, “We in higher education need to seriously consider how we think about and handle student data, and we need to respectfully and empathetically acknowledge where our practices may cause harm.”  This work starts by talking openly with one another and across institutional divides.  Collier’s call is for dialogue, not empty bureaucratic structures.  We all need to work together to ask hard questions of our technologies:  Does the tool educate students about IP and data privacy before collecting data?  Can individual students opt out, no matter the university policy?  Is there a single button students can click to remove all their data?  If the company does monetize the data it has collected, whether permission was given or not, will the owners of that data be compensated? 

Asking these questions is necessary when a tool is institutionally adopted, but smaller versions of these conversations should happen every time we consider pointing to a tool on an institutional Web page or require the use of a tool for a course assignment.  “If higher education is to ‘save the web,’” Chris Gilliard writes in “Pedagogy and the Logic of Platforms,” “we need to let students envision that something else is possible, and we need to enact those practices in classrooms.  To do that, we need to understand ‘consent’ to mean more than ‘click here if you agree to these terms.’”  Our work as educators is not just to question ubiquitous practices, compulsory data collection, and algorithmic decision-making, but also to model what it looks like to think critically about the whens, whys, and hows of technology.

The learning management system isn’t an accident.  It exists for very particular historical, bureaucratic, institutional, and pedagogical reasons.  The same is true for remote proctoring software, plagiarism detection services, and algorithmically driven mobile apps for student retention.  These tools are not neutral.  Chris Bourg, Director of MIT Libraries, offers (in an interview with Tara Robertson at Mozilla), “Many people don’t proclaim their agendas, but definitely have agendas, even if they are agendas about maintaining the status quo, and never get asked about how they handle people in their organization who don’t agree with their agendas.”  Bourg argues for a feminist administrative practice of radical openness and transparency about our own agendas.  The onus has to be on the tech companies themselves to educate “users” about data security and data monetization. 

EdTech companies need to state clearly:  “Here’s what we’re collecting, here’s why we’re collecting it, here’s what we hope to do with it, and here’s why it should matter to you.”  Far too many companies attempt to shift these responsibilities entirely onto educational institutions, when they should be shared by students, educators, institutions, and the EdTech companies themselves.  We all need to talk honestly about what tools are for, how they might shift culture, and who they could disadvantage.  We need to actively resist marketing jargon that would have us believe our “culture of academic integrity begins with Turnitin,” or that ProctorU “validates knowledge.”  And by “resist,” I mean we should not adopt tools that conceal their monetization strategy or lie to us about their function.

My work in critical digital pedagogy begins with the presumption that there aren’t easy solutions in education — that students and educators bring complex backgrounds, experiences, and contexts to the work — and that this work demands we “gather together a cacophony of voices” (Stommel, 2018).  This means acknowledging fractious (and sometimes abusive) faculty/staff divides, drawing students into the work of building their own educational spaces, and creating much more inclusive conversations between technology companies and “stakeholders,” the human beings that populate (and construct) a university, college, or other school. 

One of the first things I encourage participants to read in my “digital knowledge” seminar is Teaching to Transgress by bell hooks.  It’s productive to hold her work up against investigations into the past, present, and future of educational technologies.  Henceforth, every time I hear the phrase “data assets” to describe students, educators, and their work, I’m going to recite in my brain (or aloud) these words from another book by hooks, Teaching Critical Thinking:  “It is the most militant, most radical intervention anyone can make to not only speak of love, but to engage in the practice of love.  For love as the foundation of all social movements for self-determination is the only way we create a world that domination and dominator thinking cannot destroy.”  And thinking on those words, I’ll be reminded of this from Paulo Freire’s Pedagogy of the Oppressed:  “If I do not love the world—if I do not love life—if I do not love people—I cannot enter into dialogue.”

We are not data assets.  And the work of dialogue depends on that.  It begins when we:

• build a community of care

• ask genuine, open-ended questions

• wait for answers

• let conversation wander

• model what it looks like to be wrong and acknowledge when we’re wrong

• recognize that the right to speak isn’t distributed equally

• make listening visible

From there, we can find tools and technologies that support the project of education, ones that allow us to own our data, delete it at will, and export it easily — tools that allow “users” (human librarians, staff, faculty, adjuncts, administrators, technologists, and students) to create our own spaces (and conversations) on and about the Web — tools with an architecture of radical openness, love, imagination, hard questions, possibility, and poetry.

Bibliography

Bali, Maha.  (2016, February 14).  Love First, Design Later; Love as Praxis.  Retrieved February 17, 2019, from Reflecting Allowed:  https://blog.mahabali.me/pedagogy/love-first-design-later-love-as-praxis/.

Collier, Amy.  (2017, August 28).  Digital Sanctuary: Protection and Refuge on the Web?  Retrieved February 17, 2019, from Educause Review:  https://er.educause.edu/articles/2017/8/digital-sanctuary-protection-and-refuge-on-the-web.

Freire, Paulo.  (2000).  Pedagogy of the Oppressed.  New York: Continuum.

Gilliard, Chris.  (2017, July 3).  Pedagogy and the Logic of Platforms.  Retrieved February 17, 2019, from Educause Review:  https://er.educause.edu/articles/2017/7/pedagogy-and-the-logic-of-platforms.

hooks, bell.  (2009).  Teaching Critical Thinking: Practical Wisdom.  New York: Routledge.

Robertson, Tara.  (2017).  A Feminist Among Us: An Interview with Chris Bourg.  In: Feminists Among Us: Resistance and Advocacy in Library Leadership.  Library Juice Press, pp. 172-188.

Stommel, Jesse.  (2018).  Critical Digital Pedagogy: A Definition.  In: An Urgency of Teachers: The Work of Critical Digital Pedagogy.  Hybrid Pedagogy.

Wood, Laura.  (2019).  Learning Management System (LMS) Market – Global Forecast to 2025.  Retrieved February 17, 2019, from Business Wire:  https://www.businesswire.com/news/home/20191216005307/en/Learning-Management-System-LMS-Market---Global.
