Every Monday this fall, we’re covering how schools are using technology during the pandemic, because Monday mornings are just harder now. It’s not only teaching that’s happening remotely for millions of students — it’s testing, too. And many colleges are using software to watch students take those tests.
Big providers include Respondus, ProctorU and Proctorio. Some of them use webcams to track how often students move their heads or eyes or touch the keyboard. Anything out of the ordinary is flagged for teachers to review. I spoke with Todd Feathers, who recently co-wrote a piece about a “rebellion” against this kind of surveillance for Motherboard. One concern: false positives. The following is an edited transcript of our conversation.
Todd Feathers: Once you start the exam, there are all kinds of environmental factors that can lead to false positives. If you’re a parent who has a child in the room, you’re much more likely to be looking away from your screen or moving around than somebody who doesn’t have a child in the room. If you have ADHD, or some kind of anxiety disorder that’s tied to taking tests, you’re likely to exhibit behaviors that fall outside of the norm of other people in your class. And all of this can lead to being flagged as suspicious activity. We have heard from students who have cried during tests because they’re anxious about them and get flagged. Alternatively, I should say, I’ve also heard stories from professors about students who go to all kinds of creative lengths to cheat from home and have been caught as a result of using this kind of software.
Amy Scott: So you talked to students who are really worried about privacy, surveillance, the potential for people to sort of fall outside the spectrum of what the algorithm thinks is normal behavior but not, in fact, be cheating. You also spoke with a student of color who couldn’t even take the test because the software didn’t recognize his face.
Feathers: This is a problem that is not just related to this kind of software. A lot of these digital proctoring vendors don’t create their own facial recognition technology. They are licensing it from other companies that specialize in this, and facial recognition has been shown over and over again, in different settings, to not be as good at recognizing people of color, not be as good at recognizing women, and to struggle to the point of absolute failure when it comes to people who don’t identify as one gender or another.
Scott: What do the companies say about these criticisms? I mean, some students have protested the use of the software on campuses. How have the companies reacted?
Feathers: They reacted with a response that is pretty common across a lot of different technologies and their applications, which is that this software is a tool, that companies give it to universities and to professors, and all they’re doing is providing a quicker, more efficient way for professors to identify possibly suspicious moments during an exam. There has been a lack of response directly to the criticisms about the invasion of privacy and about the way that it can negatively affect people who are underprivileged or from certain ethnic backgrounds.
Scott: Right now we’re, of course, in this moment where a lot more college students and students in pre-K-12 are at home taking online classes. If and when we all get back together again, do you see these companies continuing to thrive and there being enough demand?
Feathers: That’s the million-dollar question. I think that it’s a pretty fair bet to say that once students are back in classrooms of all types, this will not be quite as widely used as it is now. But that being said, some of these companies are looking for other ways to expand their customer base. For example, Proctorio has recently announced [a] partnership with McGraw Hill, the textbook maker, to integrate its services into some of those online textbooks. And there are certain kinds of assessments for certain professional certifications, such as nursing, that state laws require to be proctored. And so, if those aren’t taking place in a physical room, or even if they are but there’s the option to do it online, there is a space where tools like this are not only a possibility, they’re arguably required by law.
Related links: More insight from Amy Scott
Back in April, the nonprofit education technology group Educause surveyed colleges about remote proctoring during the pandemic. More than three-quarters of the institutions surveyed were either using or considering using online proctoring services. But many reported some conflict about the technology. Cost and student privacy were the most widespread concerns. Many colleges also have honor codes and see spying on students as maybe not the best way to build a culture of trust.
I mentioned last week that Facebook is speeding up its creation of a new oversight board, though still not in time to really deal with issues that come up during the election. Some critics are tired of waiting for the company to do more to root out misinformation and outright lies that could undermine the election and suppress voters. On Friday, a few dozen researchers, civil rights activists and journalists announced they’ve formed their own group, calling itself the “Real Facebook Oversight Board,” to monitor and critique how Facebook moderates content on the platform. The group holds its first meeting live on Oct. 1 via, naturally, Facebook.