Slowly but surely, surveillance technology is becoming commonplace in many schools and universities.
Its deployment ranges from facial and fingerprint recognition to the creation of personalised teaching and detailed monitoring of pupils’ — and teachers’ — behaviour.
Educational institutions adopting these systems, and the companies marketing them, insist they deliver a range of benefits beyond the primary aim of improving pupil attainment and lifting attendance.
Surveillance technology, increasingly aided by developments in artificial intelligence, can help protect students from the physical threat of malevolent intruders on campuses, proponents say. It can also stop students accessing inappropriate online content, warn teachers about bullying, and even flag individuals experiencing suicidal thoughts.
US tech companies Gaggle and Securly are among those providing high schools with services aimed at monitoring student behaviour on their computer systems, in part to prevent the distribution of sexual content, flag violent threats such as gun attacks, and avert self-harm.
Gaggle claims to have prevented more than 700 suicides among the 5m students it monitored in the 2018-19 school year. Its rival Securly claims to have prevented more than 800 suicides across 15,000 schools, based on the screening of 5bn online activities.
But some experts are challenging this approach, which they warn is normalising intrusion into students’ lives and overriding concerns about privacy, freedom and the reliability of such technology.
Teachers, students and their parents are rarely kept informed about what data are collected and for what purposes, warns Emmeline Taylor, an academic at City University of London in the UK.
Cameras were initially introduced to safeguard against external intruders. Since then, they have been used to spot pupil misbehaviour, and later turned on teachers to monitor their performance, she says. More recently, body-worn cameras on teachers have been trialled to document student behaviour in case of disciplinary disputes.
The repurposing of monitoring technology beyond its original use case is so common that there is a term for it: surveillance creep.
“The window of opportunity to define our cultural values about acceptable levels of scrutiny in schools is closing quite rapidly,” she says. “The next generation of students will already be normalised to surveillance.”
Universities can track students’ behaviour by centralising administration through computer-based learning platforms that enable students to access teaching resources, submit coursework, take assessments and contact teachers and classmates.
Data captured in these “virtual learning environments” can help staff understand student engagement and shape teaching based on a student’s activity. They can also gather information about teachers’ communication with students and the delivery of assignment deadlines, feedback and grades.
Whether benign or intrusive, the growth of student surveillance has given birth to an array of start-ups, in the US in particular.
“Classroom management is a newer market that’s rapidly growing, and seeing a lot of interest,” says Vinay Mahadik, chief executive of Securly. “There’s a trend towards consolidation now where all of these are being bundled into a unified solution.”
In mainland Europe, social attitudes towards data protection and strict regulation have created a more conservative culture around surveillance technology and data tracking on campus. But in the US, education technology providers explicitly market their products with surveillance features, such as Securly’s Auditor, which provides real-time email and document monitoring. Securly also markets a platform called Classroom, which gives teachers oversight and control of students’ computers with the ability to open web pages, close tabs, lock screens and view browsing history.
The market for such tools has been encouraged by the Children’s Internet Protection Act (CIPA). In order to receive public funding, schools must monitor students’ internet usage on school-owned devices and install technology tools preventing access to content that is “harmful to minors”.
But many technology providers supplement the basic CIPA compliance features with products designed to “keep students on task”.
Other edtech companies offer narrower surveillance-based tools for efficient administration, without any claims to improve student performance. These include people-counting technology to identify underused facilities and facial recognition systems to replace traditional attendance registration.
But, again, attitudes to such systems vary across jurisdictions.
A landmark case in Sweden in 2019 decided that efficiency gains from using facial recognition attendance monitoring at a school did not justify the “intrusive” method of collecting sensitive personal data. The local authority that runs the school was fined SKr200,000 ($20,500) after the Swedish Data Protection Authority investigated its facial recognition trial.
The trialling of new surveillance technology methods on young people raises particularly pressing ethical, as well as legal, concerns, says Ms Taylor of City University.
“There’s a power dynamic in schools which means that young people don’t have the ability to question the surveillance they are subjected to,” she says. “And yet schools are often a test-bed.
“In a framework where students are already expected to follow instructions, they make the best guinea pigs to test new technologies.”