Brianna Hill, a recent law school graduate, knew it would be hard to contest a false report of cheating from proctoring software, the technology increasingly used to monitor people taking tests remotely during the Covid-19 era. So when she went into labor during her bar exam this October, she kept taking the test. She sat still through contractions, knowing she might be disqualified if the artificial intelligence couldn’t watch her every move.
When a test administrator uses proctoring software, the technology records test-takers through their computer’s camera, microphone and web browser. The software often requires a room scan to ensure the student is alone, and it verifies the student’s identity using biometric measures such as facial recognition or keystroke analysis, which identifies someone by the way they type. Test-takers can’t leave the view of the camera for any reason. Even looking away from the screen too many times or for too long can get them kicked out of the test or cause them to fail.
While automated proctoring might seem like a panacea in this age of remote schooling and examinations, it’s a terrible solution for the millions of students in high schools and colleges around the country. The AI technology used by proctoring software has a history of discrimination, and there’s little students can do when penalized for things they can’t control — like having children interrupt them or exhibiting unconscious behaviors such as reading questions out loud. Meanwhile, trying not to run afoul of proctoring software increases students’ anxiety and invades their privacy.
Many students have said being watched like this, even by an algorithm, is an unfair way to assess learning, especially when it’s so hard to contest the software’s accusations of cheating. The College Board, the company behind the SAT, abandoned plans to use proctoring software for this Saturday’s exam because of the technology barriers it creates for students, though the company hopes to offer it in the future. But other institutions are moving forward with proctoring software, which makes it urgent to address these problems before the technology spreads more widely.
Dana Jo, a student, described on the video-sharing app TikTok her experience of being falsely accused of cheating by proctoring software. She was eventually able to have her grade corrected, but it cost her time and emotional energy that not all students have. She also had to navigate multiple layers of bureaucracy, something others — particularly those who are marginalized — may not know how to do.
Students have reported having to urinate in bottles, buckets or diapers during a test so as not to be disqualified. Many common behaviors, such as fidgeting or looking upward when thinking, can raise a student’s suspicion rating. Shaima Dallali, a Muslim woman who wears a hijab, said she was forced out of her test for not exposing her hair to make sure she wasn’t hiding anything in it.
Students at dozens of universities are petitioning to ban proctoring software. Many faculty members are concerned, as well. The Academic Senate at San Francisco State University recently passed a resolution calling for third-party proctoring to be banned, describing the software as inequitable because it fails to meet accessibility standards and creates unequal hardship for students.
And it gets worse.
One transgender student at Georgia Institute of Technology was recently outed by proctoring software in a “humiliating” fashion, Teen Vogue reported. Honorlock, a remote proctoring company, requires students to use a government-issued ID before their test starts, which, depending on the state, can put trans and undocumented students in vulnerable positions.
Proctoring companies also commonly use facial recognition or facial detection software to verify a student’s identity. Both of these technologies enable racist episodes such as misidentifying or failing to detect Black students. This is because those technologies were calibrated to detect white skin and struggle to see people with darker skin, a form of racism found in products ranging from soap dispensers to film. Part of the Civil Rights Act was created to protect students against this type of racism, but it’s happening anyway.
The keystroke-analysis feature in several proctoring products, meanwhile, can’t always accurately identify students with certain physical disabilities or medical conditions. The technique measures typing patterns, rhythm and speed to produce a biometric signature, and it can flag inconsistent typing as a possible sign of cheating. The result is technological ableism, which is prohibited at schools that receive federal funding from the Department of Education.
To add insult to injury, students in some cases must pay out of pocket, around $15 per test, to use proctoring software. Given that a 2019 survey found that 39 percent of college students had been food insecure in the previous 30 days and 46 percent had been housing insecure in the previous year, these charges place an unfair financial burden on students.
Given these gross deficiencies, I’m working to get test proctoring software and related technologies like facial recognition banned on my campus. Skeptics ask me, “How will we stop students from cheating?” My answer: “Do you care more about catching the students who might cheat, or about the students who will experience discrimination?”
Moreover, there are other, even better, ways to measure learning than tests. Statistics classes, for instance, don’t need to rely on formula memorization and proctored exams. Instead, they can build weekslong projects around real-world problems and real-world data, giving students opportunities to think critically about the data, show their work and demonstrate how their ideas develop over time.
This type of assessment has been around for years and has proven effective at measuring learning. It’s also nearly impossible to cheat on. But implementing these strategies usually takes training and time to rework courses, investments that universities haven’t robustly made or rewarded.
We need to stop believing that technology can solve complex social problems like cheating, and we need to start reframing our educational goals to center equity, compassion and trust. The future of online education depends on it.