K-12 school administrators have signed contracts with technology vendors to implement campus surveillance and monitoring tools that vendors say will help keep students safe. But investigative reporting and independent research show that these tools can threaten student privacy and produce false positives that lead to discipline. Additionally, sensitive information collected by these vendors may end up online.
And, as surveillance and study tools integrate artificial intelligence, even more data privacy challenges are on the horizon for schools.
That’s why it’s important for school districts to be transparent about contracts with technology vendors, including how their tools collect information, who has access to the information, where it is stored and how long it is stored.
So far, voluntary pledges to protect student data privacy haven’t prevented unauthorized disclosures of student data. Help may be needed from federal regulators, according to a panel of journalists and researchers.
The panel discussed the state of student data privacy during an Education Writers Association workshop. The session took place at the 2024 SXSW EDU Conference & Festival in Austin, Texas, last March.
The Reporters
- Mark Keierleber, The 74
- Mackenzie Wilkes, POLITICO
Data Breaches Threaten Student Privacy. – Mark Keierleber
Mark Keierleber writes about school safety and civil rights for The 74.
Keierleber found that despite safety pledges, schools continue to experience privacy breaches.
In a recent investigation, Keierleber discovered that a security lapse at Raptor Technologies, a leading school safety company, exposed 4 million school records, including districts’ active-shooter response plans, students’ medical records and court documents about child abuse. And, according to Keierleber, anyone using Google search can track down this data.
“Data breaches are inevitable – everyone is going to be breached sooner or later,” Keierleber said.
Data retention laws vary across the U.S., and many laws have not kept pace with technology.
“We are building up this storehouse of information, and students don’t know about it,” he said. “How do we minimize the collection of sensitive information?”
Students and parents also should be aware of whether their school district shares data with local law enforcement, Keierleber said.
AI Tools Bring More Privacy Challenges. – Mackenzie Wilkes
Mackenzie Wilkes produces Morning Education, POLITICO Pro’s daily education policy newsletter.
As vendors pitch schools on AI software to help students learn, these tools are creating privacy challenges for K-12 classrooms, Wilkes told reporters.
“Ed-tech companies are starting to have products with AI,” Wilkes said. She added that these tools are not only used for surveillance; students can also use them to generate images.
But whether schools are using AI to screen campus visitors or integrating AI tools into curricula, they haven’t received much guidance from the federal government on how to vet vendors. Some vendors may offer products that don’t take into account federal laws aimed at protecting data.
In addition to data privacy issues, such products could also introduce discrimination, Wilkes found in her reporting.
The Researchers
- Jason Kelley, Electronic Frontier Foundation
- Lisa LeVasseur, Internet Safety Labs
GoGuardian Is a Red Flag Machine. – Jason Kelley
Jason Kelley directs activism for the Electronic Frontier Foundation, a nonprofit digital rights organization formed in 1990 in response to threats to privacy. Prior to his work at EFF, he managed marketing strategy for a software company and for a student loan start-up.
One of the more popular online safety vendors is GoGuardian, which monitors the online activities of more than 25 million students across 10,000 schools nationwide.
The California-based company created activity monitoring software that examines student browsing habits. Based on a list of keywords, the tool sends alerts to school officials, letting them know every time a student views online content that falls outside of a district’s guidelines.
In fact, a large school district might receive up to 50,000 warnings a day that students are viewing inappropriate content, Kelley said.
But are students really spending that much time exploring the dark and dirty corners of the web?
Kelley says no.
EFF filed dozens of public records requests and analyzed tens of thousands of results from the software. “Our research found that it flags broad keywords,” Kelley told journalists. He provided a demo of what he refers to as GoGuardian’s “Red Flag Machine,” explaining issues with how the software filters and monitors students’ searches and the sites they visit.
“What we uncovered was that, by design, GoGuardian is a red flag machine—its false positives heavily outweigh its ability to accurately determine whether the content of a site is harmful,” Kelley said. “This results in tens of thousands of students being flagged for viewing content that is not only benign, but often, educational or informative.”
For example, in February and March 2023, more than 900 website visits were flagged for the word “colon.” GoGuardian may have treated the word as explicit because of the colon’s proximity to other parts of the body often involved in sexual activity, according to Kelley. But these alerts resulted in students being flagged for visiting pages about how to use a colon or semicolon in sentences and mathematical formulas, biographical pages about Christopher Columbus (aka Cristóbal Colón), and educational pages about human anatomy, EFF found.
According to EFF’s fall 2023 report, GoGuardian acknowledges that its software creates “unnecessary and often times innocuous noise” and flags keywords that don’t even appear in a page’s visible content but are buried deep in its source code and metadata, meaning they are “not necessarily being searched for intentionally by a student.”
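To make the mechanism concrete, here is a minimal sketch, in Python, of naive substring keyword matching run against a page’s full HTML source. This is not GoGuardian’s actual code; the keyword list, page contents and function name are hypothetical, drawn from the examples above. It shows how a benign grammar lesson and a keyword buried in metadata can both trigger an alert.

```python
# Illustrative sketch only: naive keyword flagging that scans a page's full
# HTML source, including metadata, for substrings from a blocklist.
# Substring matching means "colon" fires on "semicolon", and keywords hidden
# in metadata fire even though the student never sees them on the page.

FLAGGED_KEYWORDS = ["colon"]  # hypothetical blocklist entry from the example above

def flag_page(html_source: str) -> list[str]:
    """Return the keywords that would trigger an alert for this page."""
    text = html_source.lower()
    return [kw for kw in FLAGGED_KEYWORDS if kw in text]

# A grammar lesson: the visible content is about punctuation,
# but substring matching still flags it.
grammar_page = "<html><body><h1>How to use a semicolon</h1></body></html>"

# A page where the keyword appears only in metadata the student never reads.
metadata_page = (
    '<html><head><meta name="tags" content="colon health screening"></head>'
    "<body>Health class resources</body></html>"
)

for name, src in [("grammar-lesson", grammar_page), ("health-resources", metadata_page)]:
    hits = flag_page(src)
    if hits:
        print(f"ALERT: {name} flagged for {hits}")  # both benign pages are flagged
```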
Based on its research, Kelley said EFF has asked the Federal Trade Commission to intervene.
Schools Must Publish Data Privacy Agreements. – Lisa LeVasseur
Lisa LeVasseur is executive director of Internet Safety Labs, an independent product safety testing lab for software that advocates for consumer safety.
In a March 2024 letter to the Federal Trade Commission, LeVasseur asked the agency to make it easier for parents to find out about the data privacy agreements schools have with their vendors.
Based on ISL’s research, administrators at school districts across the country are recommending or requiring that their schools incorporate as many as 20 technologies.
“Imagine as a parent having to go to 20 or more different operator sites to search for your school’s agreement. It would be much easier and practical for both parents and students to have a simple, single list from their school,” LeVasseur said.
“Even just a posted spreadsheet with links to the agreements on the school or district’s website would be adequate,” LeVasseur said. Schools already should be maintaining this list as a matter of vendor management practice, she added.
Determining who serves as the data controller is another policy issue in educational technology that needs more scrutiny.
The data controller sets the purposes and means of processing personal data. But in many school districts, it isn’t clear whether that role falls to the vendor or to the school officials who license the software.
The data controller is in charge of making sure the campus complies with data protection regulations, which can include obtaining consent and providing privacy notices. Additionally, the data controller responds to rights requests from the data subject.
Some campuses may not have the staff to take on this role, according to LeVasseur.
“Schools are not resourced and funded to have business-grade [information technology], which is what they need now because they’re in it deep with it,” LeVasseur said.
There is confusion over whether this job belongs to the technology platform manufacturer or to the school that licenses the technology. But vendors should always bear responsibility for controlling how student data is processed, either solely or jointly with the school, depending on their business relationship, LeVasseur posited in a 2023 paper that explores the confusion.
“We are seeing a dearth of technology notice, consent and vetting practices in place in schools in the U.S.,” LeVasseur said. “Every school needs to have a software specialist to establish processes and make the call for what is acceptable.”
On its website, ISL offers whitepapers that can help journalists get up to speed on the status of student data privacy, including recommendations for school districts to implement technologies safely with a focus on equity.
Story Ideas
- How frequently do “red flags” result in discipline?
- Search for lawsuits involving data breaches at schools.
- Examine whether school security cameras are actually effective.
- Research a school district’s process for vetting technology vendors.
- Examine failures and false positives in facial recognition technology and how that affects students.