Public version: Making ChatGPT detectors part of our education system prioritizes surveillance over trust

The Globe and Mail published an op-ed I wrote. As a condition of being featured in the publication, the paper has first publication rights for the first 48 hours. Since it’s been more than 48 hours, and for posterity, I’m making a copy available below.

Making ChatGPT detectors part of our education system prioritizes surveillance over trust

George Veletsianos is a professor of education and Canada Research Chair in Innovative Learning and Technology at Royal Roads University.

Imagine a world where surveillance technologies monitor and scrutinize your behaviour. Imagine a report that you write at work being compared with myriad others and flagged for additional inspection when an algorithm deems it to be “very similar” to others.

Students don’t have to imagine this world. They are already living in it, in the form of plagiarism detection software, remote proctoring technologies, and now, tools aimed at detecting whether the students used ChatGPT – including new software that promises to catch students who use ChatGPT to cheat.

While taking online exams, students’ webcams scan their surroundings; their microphones monitor sounds and background noise, and their body and eye movements are tracked. Unexpected movements may indicate something as innocuous as stretching a tight neck or as problematic as catching a glimpse of Post-it notes on the wall, while unexpected sounds may indicate a child playing in the background or a roommate whispering answers. The essay assignments students submit are compared to a vast amount of writing by others. And a battery of scores might indicate plagiarizing from Wikipedia, passing off text created by ChatGPT as one’s own, or simply using common expressions. Any of this will get students flagged as potential cheaters.

That there are technologies to identify text written by artificial intelligence shouldn’t come as a surprise. What is surprising is that educators, administrators, students, and parents put up with surveillance technologies like these.

These technologies are harmful to education for two main reasons. First, they formalize mistrust. As a professor and researcher who has been studying the use of emerging technologies in education for nearly two decades, I am well aware that educational technology produces unintended consequences. In this case, these technologies take on a policing role and cultivate a culture of suspicion. The ever-present microscope of surveillance technology casts a suspicious eye on all learners, subjecting them all to an unwarranted level of scrutiny.

Second, these technologies introduce a host of other problems. Researchers note that these tools often flag innocent students and exacerbate student anxiety. This is something I’ve personally experienced as well when I took my Canadian citizenship exam online. Even though I knew the material and was confident in my abilities, my webcam’s bright green light was a constant reminder that I was being watched and that I should be wary of my every move.

To be sure, such tools may deter some students from intentionally plagiarizing. They may also improve efficiency, since they algorithmically check student work on behalf of educators.

But these reasons don’t justify surveillance.

A different world is possible when schools and universities dare to imagine richer and more hospitable learning environments that aren’t grounded in suspicion and policing. Schools and universities can begin to achieve this by developing more trusting relationships with their students and emphasizing the importance of honesty, original work, and creativity. They need to think of education in terms of relationships, and not in terms of control, monitoring, and policing. Students should be viewed as colleagues and partners.

Educators also need to come to terms with the fact that our assessments generally suffer from a poverty of imagination. Essays, tests, and quizzes have an important role to play in the learning process, but there are other ways to check for student achievement. We can design assessments that ask students to collect original data and draw inferences, or write and publish op-eds like this one; we can invite them to develop business and marketing plans for real-world businesses in their cities; we can ask them to reflect on their own unique experiences; we can require them to provide constructive peer-review and feedback to fellow students, or have them engage in live debates. In this light, ChatGPT is not a threat, but an opportunity for the education system to renew itself, to imagine a better world for its students.

Educators and administrators should stop using surveillance technologies like ChatGPT detectors, and parents and students should demand that schools and universities abolish them – not because cheating should be tolerated, but because rejecting the culture of suspicion that surveillance technologies foster and capitalize upon is a necessary step toward an education system that cares for its learners.


1 Comment

  1. Sanjay Mishra

    Totally agree. It is important that both students and teachers are trained to use the new tools, and the discourse should not be focused on a particular tool that is still under testing and improving every day from user data collection. Educational institutions must focus on creative but ethical use of AI tools.
