Honorlock’s flagship monitoring tool reportedly saw a boom in sales during the pandemic as well. The company’s tools are reportedly capable of verifying students’ identities through face scans and can even detect specific phrases spoken near a student’s computer microphone. Like other remote monitoring services, Honorlock has spurred vigorous debates over privacy and ethics at some schools. Last October, for example, students at the University of Wisconsin pleaded with administrators not to renew the company’s contract over concerns the software failed to properly identify students with darker skin tones.
“It’s a major invasion of privacy, and it doesn’t help our learning at all,” one UW student said in an interview with student paper The Badger Herald. The university renewed the contract despite the student opposition.
Universities could seemingly sidestep all this gargantuan investment in monitoring tools, and the predictable public backlash, if they simply committed to creating exams with questions that can’t be answered by a quick Google search. That, however, would require a sober, introspective look at the pedagogical assumptions underpinning education writ large. Monitoring tools offer a simpler, if largely ineffective, escape hatch.