A week after students begin their distance learning courses at the UK’s Open University this October, a computer program will have predicted their final grade. An algorithm monitoring how much the new recruits have read of their online textbooks, and how keenly they have engaged with web learning forums, will cross-reference this information against data on each person’s socio-economic background. It will identify those likely to founder and pinpoint when they will start struggling. Throughout the course, the university will know how hard students are working by continuing to scrutinise their online reading habits and test scores.
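The university has not published the workings of its model, but a risk predictor of this general kind can be sketched in a few lines of Python. The features, figures and training labels below are invented purely for illustration; they are not the OU's.

```python
# Illustrative sketch only: the Open University's actual model is not public.
# A minimal "at-risk" classifier over hypothetical engagement and background features.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [share of e-textbook read, forum posts per week, quiz average,
#            socio-economic index (0 = most deprived, 1 = least deprived)]
X_train = np.array([
    [0.10, 0.0, 0.40, 0.2],
    [0.80, 3.0, 0.75, 0.6],
    [0.25, 1.0, 0.55, 0.3],
    [0.90, 5.0, 0.85, 0.8],
    [0.05, 0.0, 0.30, 0.1],
    [0.70, 2.0, 0.65, 0.5],
])
# 1 = withdrew or failed in a previous cohort, 0 = completed
y_train = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)

# A new student one week into the course
new_student = np.array([[0.15, 0.0, 0.50, 0.25]])
risk = model.predict_proba(new_student)[0, 1]
print(f"Estimated probability of struggling: {risk:.2f}")
```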

Behind the innovation is Peter Scott, a cognitive scientist whose “knowledge media institute” on the OU’s Milton Keynes campus is reminiscent of Q’s gadget laboratory in the James Bond films. His workspace is surrounded by robotic figurines and prototypes for new learning aids. But his real enthusiasm is for the use of data to improve a student’s experience. Scott, 53, who wears a vivid purple shirt with his suit, says retailers already analyse customer information in order to tempt buyers with future deals, and argues this is no different. “At a university, we can do some of those same things — not so much to sell our students something but to help them head in the right direction.”

Made possible by the increasing digitisation of education on to tablets and smartphones, such intensive surveillance is on the rise. In the US, the concept has progressed even further: two years ago, an Ivy League institution, Dartmouth College, trialled an app installed on students’ mobile phones which tracks how long they spend working, socialising, exercising and sleeping. GPS technology follows their locations around campus to work out their activities, while a listening function tunes into the noise level around the phone to detect whether its owner is conversing or sleeping. Once analysed by the lab, the information is used to understand how behaviour affects grades, and to tailor feedback on how students can improve their results.
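The Dartmouth pipeline itself is far more sophisticated than anything shown here; the snippet below is a deliberately crude, rule-based sketch of how sensor snapshots of the kind described might be turned into activity labels. The thresholds, field names and location labels are assumptions, not the study's.

```python
# Illustrative sketch: a crude rule-based activity inference of the kind the
# Dartmouth trial automated. All thresholds and labels here are invented.
from dataclasses import dataclass

@dataclass
class SensorSample:
    hour: int              # local hour of day, 0-23
    location: str          # label derived from GPS, e.g. "dorm", "library", "gym"
    audio_level_db: float  # ambient noise level picked up near the phone
    screen_on: bool

def infer_activity(s: SensorSample) -> str:
    """Guess what the phone's owner is doing from one sensor snapshot."""
    if (s.location == "dorm" and s.audio_level_db < 30
            and not s.screen_on and (s.hour >= 23 or s.hour < 7)):
        return "sleeping"
    if s.audio_level_db > 60:
        return "conversing/socialising"
    if s.location == "gym":
        return "exercising"
    if s.location == "library" or s.screen_on:
        return "studying"
    return "idle"

print(infer_activity(SensorSample(hour=2, location="dorm",
                                  audio_level_db=25, screen_on=False)))
# -> "sleeping"
```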

The justification for gathering such large volumes of personal data is that it will help students get the most from the investment they are making in their education. As the UK’s £9,000-a-year tuition fees begin to catch up with average annual course charges of £17,500 in the US, university attendees on both sides of the Atlantic are under financial and academic pressure to do well and complete their degrees.

Some argue that this increasing tension puts students at risk. Andrew Keen, author of The Internet Is Not the Answer (2015) — a critique of the digital revolution and the way in which big data are being used to “monetise” human activities such as education — says the safeguarding of students’ privacy is becoming “an enormous concern”.

Keen, who predicts an imminent “reality check” on the impact of technology on education, says users are particularly vulnerable when consuming free university content from “Massive Open Online Courses”, known as Moocs, offered by institutions such as Stanford and Harvard. “Any time these services are free, eventually it’s the user who ends up paying,” Keen says. “I think the two areas . . . where we’re all especially sensitive are education and healthcare. That’s where we reveal ourselves: our concerns, our interests, our feelings, our future careers.”

 . . . 

Student monitoring service Skyfactor, which is sold in the US and used by 130 universities there, advertises itself as a risk management service, promising to help academics “quickly see which students need attention and resources now — before it’s too late”. Course tutors are given access to a dashboard that documents each student’s class attendances, assessment grades, participation in sports practices, and visits to the campus financial aid officer. A door icon placed next to each name, either closed or open, signals the program’s prediction of how likely the student is to leave the institution early. If their high grades drop, or their passion for basketball begins to wane, Skyfactor will flag these individuals in red.
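Skyfactor’s rules are proprietary, so the routine below is only a toy illustration of the behaviour described: a student whose previously high grades slip, or whose extracurricular engagement falls away, is flagged in red. All thresholds and field names are invented.

```python
# Illustrative only: Skyfactor's actual scoring is proprietary. A toy flagging
# routine mimicking the behaviour described in the article.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    name: str
    prev_grade_avg: float          # 0-100, last term
    curr_grade_avg: float          # 0-100, running average this term
    prev_activity_per_week: float  # e.g. sports practices attended
    curr_activity_per_week: float

def risk_flag(r: StudentRecord) -> str:
    grade_drop = r.prev_grade_avg - r.curr_grade_avg
    activity_drop = r.prev_activity_per_week - r.curr_activity_per_week
    # Flag red when previously high grades drop sharply or engagement falls away
    if (r.prev_grade_avg >= 70 and grade_drop >= 10) or activity_drop >= 2:
        return "red (door open: at risk of leaving)"
    return "green (door closed)"

print(risk_flag(StudentRecord("A. Student", 82, 68, 3, 0)))
# -> "red (door open: at risk of leaving)"
```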

David McNally, chief technology officer at Macmillan Science and Education, which owns Skyfactor, says the early warning mechanism is beneficial for all involved. “In the US more than the UK . . . losing a student is a very expensive loss to an institution because they pay high annual fees,” he says. “If you can get to a student before they drop out, you can keep them in the institution.”

When asked about privacy implications, McNally says his company — a competitor to Pearson, current owner of the Financial Times — is “extremely serious” about abiding by both US and UK data security laws. He adds that the information is “being used for the greater good, which is better education for everybody”. He insists it is not only students being tracked: the same programs that measure their performance are being used to compare how effective their tutors are and how well one school is teaching its pupils compared with another. In the future, it will be possible to compare entire local education authorities.

However, McNally does acknowledge that the success of innovations such as Skyfactor is an indication of the increasing stress that students are under, compared with previous generations. “It’s high-stakes, education now,” he says. “The amount of money that’s put in for an individual student, and the amount of debt they walk out with, is material. I doubt Skyfactor would have had a market 30 or 40 years ago.”

According to research published this week by the UK’s Institute for Fiscal Studies, the poorest 40 per cent of students are now expected to graduate from English universities with debts of up to £53,000 after three years of study — a significant rise from the previous maximum of just over £40,000. Fears about similar pressures on US undergraduates prompted Dartmouth College’s phone monitoring trial. Andrew Campbell, the computer science professor who ran the study, was concerned about the rise in anxiety and depression across the country’s higher education system.

“Here at Dartmouth, a third of the undergraduate student body saw mental health counsellors last year,” he says. “That’s actually a shocking figure . . . we need to try to get more information about what is actually going on.”

Campbell admits that the study involved a “very invasive form of monitoring”, which required oversight by Dartmouth’s Committee for the Protection of Human Subjects. Before beginning the trial, he had to assure the committee that the students’ privacy would be protected, and their data anonymised. But he insists that those who signed up to be tested did so because they had a real interest in how their behaviour was affecting their grades.

Many of Campbell’s findings were unsurprising: more conscientious students did better, as did those who had high levels of class attendance. Less predictable was that students who knuckled down towards the end of term did better than others. The highest achievers tended to party harder early in the semester, then moderate their behaviour from midterm onwards.

The initial aim of the study was to develop an app which measures behaviour and gives undergraduates feedback on how they can change their habits to maximise their grade potential. But Campbell believes students could reap bigger long-term benefits if they allow their data to be shared with their student dean, professor or clinician, who could make external interventions.

He gives the example of earlier prototype research involving two of his own computer science students who had stopped going to class and submitting assignments. Campbell could see that both had high PHQ-9 scores, which are indicators of depression. Instead of failing them at the end of the semester, he decided to give them “incompletes”, which allowed them to come back the next term and complete the class.
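The PHQ-9 is a nine-item questionnaire scored from 0 to 27, and the severity bands below are the standard published cut-offs; how Campbell’s prototype acted on the scores is not detailed in the article, so the follow-up rule here is purely illustrative.

```python
# PHQ-9 severity bands (standard published cut-offs). The follow-up threshold
# is an illustrative assumption, not the Dartmouth prototype's logic.
def phq9_severity(score: int) -> str:
    if score <= 4:
        return "minimal"
    if score <= 9:
        return "mild"
    if score <= 14:
        return "moderate"
    if score <= 19:
        return "moderately severe"
    return "severe"

def needs_follow_up(score: int) -> bool:
    return score >= 10  # illustrative threshold for flagging a check-in

print(phq9_severity(17), needs_follow_up(17))  # -> moderately severe True
```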

While this level of access to personal data represents a degree of intrusion, Campbell argues it is justified. “Do you just let them fall through the cracks,” he says, “or can you embrace technology that might help them deal with the stresses of college and progress?”

 . . . 

While benefits for the institutions are clear, students’ own attitudes to being under increased surveillance are harder to pin down. Ruth Tudor, president of the Open University’s Students’ Association, is 44 and has spent the past decade completing an undergraduate degree, a master’s and a counselling course remotely from her home in Dumfries and Galloway, Scotland. She says that when the data analytics programme was first mooted, participants were “naturally” anxious about the university selling the information it collected to a third party.

Tudor, who now works part-time as a teacher, says the university has “absolutely assured” the association that data are used only for educational purposes. Two students have been appointed as consultants to the project and advise on future developments. “I think students are more aware of digital privacy now,” she says. “But they do understand that this is about supporting them.”

However, Dr Bart Rienties, a learning analytics expert who helped design the OU’s monitoring programmes, admits some individuals remain “very concerned” about the idea of their online behaviour being tracked. On the other hand, he says: “Those who fell by the wayside were surprised that we weren’t using their information to help them.”

Rienties says it is only by using data that universities can tackle the “one size fits all” approach that has historically benefited students from higher-income families. “Students from particular socio-economic backgrounds or students from particular ethnic backgrounds are potentially more at risk than others. Being open to all students means [looking at] how we provide learning environments that meet not just the needs of middle-class England.”

Despite these advantages, research has still revealed discomfort among students about the consequences of technological advances. Two years ago, the UK’s National Union of Students, together with online learning company Desire2Learn, undertook a major study of what youngsters thought about their universities’ and colleges’ use of online education materials. It found that students were extremely suspicious of any suggestion that technology was replacing human interaction.

The survey — conducted through focus groups and interviews — showed students were reluctant to make their data available due to fears about how they would be judged. Those polled said they did not mind their data being anonymised as part of a study of their whole cohort, but said they didn’t want their lecturers to know that they personally had logged on to the network two minutes before an assignment was due to be handed in, that they read their ebooks late at night or that they were not engaging with their discussion forums.

By coincidence, a tracking product launched the same year as the study marked a significant shift towards closer monitoring. CourseSmart, a Silicon Valley start-up, gives university customers a window into exactly how e-textbooks are being read. For students, the texts offer so many highlighting, searching and copying functions that the company claims they have “more tools than a Swiss army knife”; for tutors, an analytics function will show which individuals have read which pages on any given day.
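CourseSmart’s internal analytics are not public; the snippet below simply aggregates some hypothetical page-view events into the kind of per-student, per-day reading report the article describes tutors seeing.

```python
# Illustrative sketch: aggregate hypothetical e-textbook page-view events into
# a per-student, per-day reading report of the kind tutors are shown.
from collections import defaultdict

# (student_id, date, page_number) events logged by an e-textbook reader
events = [
    ("s001", "2015-10-01", 12), ("s001", "2015-10-01", 13),
    ("s001", "2015-10-03", 14), ("s002", "2015-10-01", 12),
]

report = defaultdict(set)
for student, date, page in events:
    report[(student, date)].add(page)

for (student, date), pages in sorted(report.items()):
    print(f"{student} on {date} read pages {sorted(pages)}")
```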

CourseSmart was founded in 2007 by Pearson and other education providers, but was acquired last year by VitalSource Technologies in California, a division of Ingram Content Group. Cindy Clarke, vice-president of marketing at VitalSource, says students can opt out of being monitored, but rarely choose to do so. “It’s a Facebook generation now, and our data suggest they’re not overly concerned,” she says. But Jess Poettcker, who carried out the NUS/Desire2Learn research, says that, in the UK at least, younger students were just as concerned as people of their lecturers’ age about how their data were being mined and used.

Now working as education co-ordinator at the University of Calgary in Canada, Poettcker is also sceptical about the idea that monitoring students could be a way to minimise stress or cut dropout rates. “What this has led to in the past . . . is increasing assessment anxiety,” she says. “So it’s not only, ‘I need to achieve this mark in this course’, but also, ‘I need to read this certain amount in a week or I’m going to come off badly to my lecturer’. Students were saying, ‘because that information was available, that made me feel self-conscious’.”

A growing area of concern for students, says Poettcker, is the role that technology plays in the perceived “privatisation” of higher education, which has been debated in Britain since the tripling of tuition fees three years ago.

“When you make all this data available, what’s going to happen?” Poettcker asks. One option, she suggests, is for universities to begin advertising continuing education or additional tutoring services to certain people on the back of information they collect. “I think you’ll have students being resistant to the way this is going to change the sector.”

 . . . 

From their experimental lab at the Open University, Peter Scott and his colleagues seem determined to drag education into the 21st century. Scott’s vision is that courses will increasingly be tailored to fit each student’s preferences about how to learn. Advances in information analytics will mean that studying the same subject in the same way as 1,000 other people will soon be consigned to history.

He is particularly evangelical about the evolution of a world in which data are “ubiquitous”. In his version of the future, the “internet of things” will already link people up to a far wider virtual network.

In this context, will the monitoring of students’ study habits really seem so bad? “We don’t need to use any of the data about you . . . to try and manipulate you,” he says. “We want to give you the data so you manipulate yourself”.

Helen Warrell is the FT’s public policy correspondent

Slideshow photographs: Daniel Stier

Letter in response to this article:

University should stimulate individualism / From Tristram C Llewellyn Jones, Ramsey, Isle of Man
