As the COVID-19 crisis forced more and more schools, colleges, and universities to shift their courses online last spring, news outlets published a spate of articles sounding the alarm about educational institutions surveilling students through online proctoring. Take these headlines from April to early May 2020:

However, for years now we’ve been warned about algorithms and artificial intelligence, the technologies upon which online proctoring is built. Simone Browne published Dark Matters: On the Surveillance of Blackness in 2015; Cathy O’Neil published Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy in 2016; Safiya Umoja Noble published Algorithms of Oppression: How Search Engines Reinforce Racism in 2018; Virginia Eubanks published Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor in 2018; Meredith Broussard published Artificial Unintelligence: How Computers Misunderstand the World in 2018; and Ruha Benjamin published Race After Technology: Abolitionist Tools for the New Jim Code in 2019. Chris Gilliard’s Twitter feed is one long, sobering doomscroll of surveillance abuses in education and other fields. And then there’s Audrey Watters, the self-proclaimed Cassandra of edtech, who has been detailing edtech’s monsters and Trojan horses for more than a decade.

Despite the many voices and the voluminous scholarship describing the harm caused by some educational technology, analysts have projected that online proctoring will grow into a $23 billion industry by 2023.

Against the backdrop of the media’s renewed interest in how technology can be used to surveil students and the boon it has meant for edtech companies, Maha Bali, in preparation for a keynote speech at the Online Learning Consortium’s Innovate 2020 conference, tweeted on June 5, “If you could UN-invent one technology that’s been used in education…what would it be?” I responded, “Any technology that frames students as adversaries. Turnitin, Proctorio, all that harmful surveillance garbage.” Proctorio, an online proctoring company, disputed my description of its vision of students and education. So did the company’s CEO. What followed was many things — including the CEO telling Bali in a tweet to “include both sides” in her presentation, a tweet that is no longer visible because he deleted it and all the tweets in the conversation before making his account private — but it often came down to the Proctorio CEO assuring his critics of his surveillance tool’s technical safeguards. (The same CEO later posted a portion of a student’s private chat log on Reddit, then offered a meager apology for his actions.) Yet the problem is larger than how any one tool protects students’ privacy and data. To be sure, students’ privacy and data must be protected — and students should have more control over their data. But students would never be exposed to these tools if institutions and their decision makers trusted students and understood how educational technology can erode students’ well-being in the name of some marketplace abstraction like degree integrity.

The Problems with Keep Teaching

Until recently, I was an educational technologist at a large public university. When the coronavirus pandemic forced many into emergency remote teaching in March, my group’s monthly teaching-with-technology e-newsletter became a weekly publication. We put together Canvas and Zoom resources, hosted several open office hours each day, and reminded the faculty we communicated with over phone and email to extend patience to their students and to themselves as we navigated the unknown together.

The rapid move to emergency remote teaching caused by the coronavirus also forced many instructional designers, educational technologists, and other staff to work long, hard hours building Keep Teaching websites to help faculty shift their courses online. Many of the Keep Teaching sites reminded faculty to consider the disruption in students’ lives and students’ potentially limited access to technology, and to plan accordingly. On my former institution’s Keep Teaching page about assessment, the distance education team did their best to dissuade faculty from using Proctorio for final exams. They accounted for nine different assessment types and shared ways to translate those assessments into online formats that didn’t necessitate online proctoring. They hosted a webinar titled “Alternatives to Exams and Finals Workshop.” They wrote that only after “you have exhausted all of these options and feel a proctored exam is necessary” should you pursue an online proctored exam.

I am grateful for my former colleagues’ efforts and for the efforts of teams across the world to center students and the ways in which the coronavirus impacted, and continues to impact, their lives. I also think we missed an opportunity. The problem with Proctorio was framed as a technical one. Resources like the Keep Teaching sites reminded faculty that students might not have the same access to their devices or high-speed internet as they did prior to the coronavirus pandemic, so instructors should find low-tech or no-tech solutions. That’s a start toward caring for students.

The resources fail students by excluding the ways in which online proctoring technologies define them as adversaries. If I’m a teacher asking my students to scan their room — or car parked outside the library — before the exam begins, then my message to students is clear: show me what you’re hiding because you must be hiding something. Exposed and vulnerable, students then have to hope such scans don’t flag anything the online proctoring software’s algorithms have been trained to identify as suspicious while they proceed from exam question to exam question. Further, students must endure the racist, ableist technology peddled by companies like Proctorio, ProctorU, and ExamSoft, which frames students’ bodies as abnormal. Have dark skin? The racist technology cannot see you. Wear glasses? The ableist technology sees you, but it doesn’t believe you are you because it can’t detect your eyes. Shea Swauger summarizes the problems with online proctoring:

Algorithmic test proctoring is a collection of machine learning algorithms that reinforce oppressive social relationships and inflict a form of data violence upon students. It encodes a ‘normal’ body as cisgender, white, able-bodied, neurotypical, and male. It surveils students and disciplines anyone who doesn’t conform to ‘normal’ through a series of protocols and policies that participate in a pedagogy of punishment, ultimately risking students’ academic career and psychological, emotional, and physical safety.

What if how-to resources created by staff at universities and colleges included language about the ways in which Proctorio and its ilk harm students?

I understand why resources built in two weeks (at best) under extreme circumstances might lack a critical examination of educational technology. However, many institutions have long-standing how-to pages on problematic tools like Proctorio and the plagiarism-detection software Turnitin. For instance, my former institution’s resource page on how instructors can use Proctorio states that “The tool allows the instructor to: Record exam audio, webcam recording, and screen recording for review afterward.” What’s missing from this statement? Students. The exam doesn’t sigh. The exam doesn’t have a face glancing at the camera deciding whether or not to urinate in a bucket. I wonder if faculty and staff might balk at using online proctoring if students weren’t erased. I wonder if faculty and staff might abandon online proctoring if they understood how it impedes students’ cognitive abilities. And I wonder if faculty and staff might find an alternative if online proctoring were called what it is: racist, ableist surveillance. Or, if you like, cop shit.

Instructional designers, educational technologists, teaching center staff, academic developers, librarians, anyone responsible for putting together these resources: we need to accurately describe harmful technology if we are to care for students. We can start by learning from other fields. We should refuse to make and preserve objective resources on how to use technology because objectivity is impossible. Attempts at objectivity often maintain the status quo, and the status quo maintains a technological determinism funded by technology companies and their leadership who, despite their mantra of disruption and liberation, have mostly further entrenched patriarchy and white supremacy. As Maha Bali explained in “Centering a Critical Curriculum of Care During Crises,” her keynote at the Online Learning Consortium’s Innovate 2020 conference, “Just like education is never neutral...Educational/faculty development is never neutral. Nor is any educational technology (or its creators) neutral.” Bali’s comments prompt a question raised by Paulo Freire in A Pedagogy for Liberation. Once a teacher recognizes education is politics, said Freire, they must contend with their personal politics and how those politics manifest in the classroom. Freire said:

The teacher works in favor of something and against something. Because of that [they] will have another great question, How to be consistent in my teaching practice with my political choice? I cannot proclaim my liberating dream and in the next day be authoritarian in my relationships with the students.

Instructional designers, educational technologists, teaching center staff, academic developers, and librarians are all teachers, and they must therefore avoid the hypocrisy Freire warns against. What’s true for a teacher is also true for an institution. A college or university that proudly proclaims its mission is to support students and help them thrive should understand that Proctorio and other harmful technologies, for which it pays thousands upon thousands of dollars, often stand in direct opposition to its stated commitment to students.

The question becomes what to do about established and new forms of authoritarianism on campus. Stating how a technology protects (or doesn’t protect) student data is insufficient. Yes, it’s essential to include, but it’s not enough. If I need to start a sentence about students’ privacy when using Proctorio, as the author of the Proctorio resource at my former institution does, with the phrase “Due to the nature of Proctorio,” then I must refuse the appearance of neutrality and instead be explicit about the nature of Proctorio, of Turnitin, of any tool and how it does or might harm students.

Refusal

The succession of news articles about surveillance technology in schools coincided with individual educators posting pieces about refusal. Lee Skallerup Bessette published “Refusal,” and a day later Kate Bowles published “Just Refusal.” Bowles cites Skallerup Bessette as well as Donna Lanclos’ keynote speech, “Listening to Refusal,” which she delivered at the Academic Practice and Technology conference in the summer of 2019. (Lanclos’ speech was instrumental in my decision to refuse to support Proctorio.)

Skallerup Bessette and Bowles focus their refusal on institutions demanding employees continue to serve up their affective and manual labor in order to maintain the status quo. Bowles ends her piece by recognizing many are now “thinking quietly about how to refuse to go along with—and go back to—the things that were already broken, and to the time when even more broken solutions were being touted as the next shiny thing.” Lanclos is more focused on the shiny thing: educational technology and how to refuse it.

If we are to refuse educational technology, I think it’s important to examine the ways in which technology and care are intertwined. We must refuse the weaponization of care, a term introduced by Autumm Caines and Sundi Richard at OER 20, which they define as “care…that is used to justify data surveillance.” A faculty or staff member may care about an exam’s integrity, to borrow the language of Proctorio’s marketing team, at the expense of caring for students subjected to the technology. Even more insidious, according to Caines and Richard, is when care is weaponized under the pretense of caring for students. Proctorio, Turnitin, and other harmful educational technologies claim to care for students with arguments about ensuring their degrees hold value in the marketplace or that students develop necessary workforce skills, statements that lay bare what these companies really care about: upholding the dictates of surveillance capitalism and its hunger for passive labor.

Refusal, Power, and Privilege

How do we refuse educational technology and the weaponization of care? Start by interrogating that “we.” Lanclos notes that “strategic refusal…come[s] from a position of power.” Responding to a tweet calling for the refusal of harmful educational technology, Mar Hicks tweeted:

If you (or the class of people to which you belong) don’t have the right to refuse a system, you probably don’t have any power to change it (like making it more ethical) either. The right to refuse is a key basis of power and agency. It can’t be deployed easily or often…the right (or privilege) to refuse takes many forms & can be partial rather than absolute depending on the system & situation.

I possessed power and privilege in my former system and situation. I was a gatekeeper. As an educational technologist, I was responsible for recommending digital tools to faculty, staff, and students. My authority was bolstered by my privilege; I am a white, cisgender, heterosexual, able-bodied man. I can walk into a room and share my opinions without the thought ever occurring to me that I might be labeled hostile for my questions or wonder what a colleague meant when he praised me for being articulate.

My power was circumscribed because I was a staff member in the academy, a system invested in defending its hierarchies. My power was further reduced because my particular office often framed our work as customer service, and few jobs come with such stark power disparities as customer service. I did not want to view my work with teachers and students as a transaction, a ticket for me to close in five days or less if I were to avoid breaching the service level agreement.

“It Doesn’t Have to Be Like This”

Transitioning from a service-oriented organization to one founded on partnerships can help deconstruct the power differences between faculty and staff, thereby elevating staff members, reifying their expertise, and strengthening their ability to push back against faculty who might be using, or considering using, harmful educational technology in their courses. Amy Collier and Sarah Lohnes Watulak evolved the Office of Digital Learning and Inquiry (DLINQ) at Middlebury College into one such partner organization. For Collier and Lohnes Watulak, a service organization is “transactional, templated, [and] rigid” with “clear roles” that provide “little agency.” A partner organization is “collaborative, individualized, [and] co-constructive” with “evolving roles” that support “a lot of agency.” In a service organization, course development might involve faculty submitting their syllabus, a staff member or student employee checking boxes on a Quality Matters rubric, and then returning the rubric and syllabus. Transactional, templated, rigid.

A partner organization is built on relationships, starting with the individuals within the unit. Collier and Lohnes Watulak suggest academic technology and similar units create a shared vision for their work. In this way, a critical approach to educational technology and an explicit care for students can be established as a core mission, a set of founding principles to guide course design conversations between staff and faculty. Again, the language we use to describe these relationships is essential. Collier and Lohnes Watulak suggest writing a project charter to share with partners at the outset of a new endeavor. The charter should include sentences like this one: “We believe that successful collaborations are built on shared goals, a willingness to listen and learn from each other, and trust.” If our partnership is to succeed, then you must understand I am not here to click buttons in the learning management system. I am here to engage critically with teaching and technology, and I invite you to join me. I am a professional, like you, so let’s get to work.

Removed from the indignities of customer service, equipped with more agency and the collective courage to make change, staff members and their units can be resituated to reaffirm their power. Tressie McMillan Cottom suggests educational technology be recontextualized in colleges and universities. She argues that “our learning technologies cannot continue to live solely in our administrative units; our academic units are where we are doing some of the more transformative work of learning.” Once recontextualized in academic units, educational technologists, instructional designers, and others often labeled “support staff” must ask questions about should. McMillan Cottom writes:

Edtech 1.0 was about looking at the shiny stuff. Edtech 2.0 is about discovering what we can do with that shiny stuff. We can even look forward to edtech 3.0, which will be about determining what we should do with that shiny stuff. This leads to the question, in considerations of what we should be doing, who is the “we”? What resources, power, and legitimacy do ‘we’ have to do what should be done? When we think about what we should do as opposed to what we can do, we get into questions about ethics, fairness, and justice and their intersection with learning technologies.

McMillan Cottom’s argument emphasizes again that refusal cannot be separated from privilege and power. So I echo what Julie Fellmayer wrote to those with privilege and power: use it. Use it to:

And heed Audrey Watters when she writes in her keynote at Digital Pedagogy Lab 2020:

We’re not stuck. We don’t have to surrender. We can refuse, and we should. And we should support students when they refuse. These can be little refusals — small acts of resistance, obfuscations, that come from positions of little to no power. These can be loud refusals in support of those with little to no power. We can push back, and we can demand better...it doesn’t have to be like this.

Keep at It

Keep Teaching. Keep Learning. Keep Working. Reading these rallying cries, I’m reminded of the slogan produced by the British government as the nation prepared to plunge into World War II: Keep Calm and Carry On. We are not in a world war, but our world is violent, ever more publicly so as protestors document what is a long history of anti-Black violence committed by the police and others in the United States and around the globe. It is no time to keep calm.

According to The Oxford English Dictionary, the verb to keep once meant, “To watch. To lie in wait for, watch for stealthily with hostile purpose; to intercept on the way.” Should the likes of Proctorio and Turnitin continue to influence our schools and define how we keep teaching, we are doomed to view students as adversaries. If we are to keep teaching as a form of care, then we must be vigilant against the weaponization of care for our students and our colleagues. We must reimagine the roles of instructional designers, educational technologists, academic developers, librarians, and other staff to strengthen their power and build partnerships that support agency and create opportunities to enact change. We must Keep Caring and Refuse On and On and On.