In January, for instance, The Texas Tribune reported that an 11-year-old Frisco ISD student received mental health help after the district flagged one of his text entries on a school-issued device. He’d typed “give me 10 GOOD reasons why I shuldnt kill myself here” on a Google Doc, with no reasons listed.
U.S. Sen. John Cornyn has also argued that schools need to reopen for in-person learning. “The pandemic has simply taken a devastating toll on our children, academically, socially and emotionally,” he said.
Many parents credit monitoring systems like Frisco’s with saving their kids’ lives, but some privacy and civil liberties experts are sounding the alarm, saying well-meaning districts could be doing more harm than good.
In February, Frisco ISD’s director of counseling, Stephanie Cook, told NBC-DFW that student mental health hospitalizations were up 200% at the beginning of the year. And last fall, there was a point when there weren’t any vacant adolescent mental health beds in Collin County, she said.
Another set of Frisco ISD parents got help for their daughter thanks to the district's monitoring system, with the mother singing its praises.
“Yes, it may be an invasion of privacy, but so are our devices — our Echoes and our Alexas,” she told NBC-DFW. “Everything spies on us, so if this is going to help save lives, absolutely. Go for it.”
Frisco ISD began using a program called Lightspeed Alert at the start of the school year, assistant communications director Meghan Cone said in an email. The program purports to prevent suicide, cyberbullying and self-harm by monitoring across education apps and social media.
Another technology company, Bark, offered districts its surveillance services for free following the Parkland, Florida, school shooting in 2018, according to The Guardian. Many companies now advertise that their services will help to prevent similar violence.
Yet such claims are unethical because there’s no way to substantiate them with data, said Chad Marlow, senior advocacy and policy counsel with the American Civil Liberties Union. Software makers may have varied their pitch over time, but they always sell one thing: fear.
“They are either telling schools that using the software will stop a school shooting or using the software will stop a suicide, with the implication being that if you don’t do it, you’re leaving students at risk,” Marlow said. “What I always note that they don’t do is discuss at length how many suicides they’ve caused.”
When a student is confronted over what they've written, they may be driven to self-harm because they're embarrassed and humiliated, Marlow said. The kid might not actually be suicidal, but having a concerned adult swoop in could be the thing that “tips [them] over the edge,” he said.
"You have to think about student safety holistically, and make sure that you’re not creating more harm, unintentionally, by deploying tools like these.” – Elizabeth Laird, director of Equity in Civic Technology
On top of that, schools can flag any terms they want to, such as “gay” or “lesbian,” “Black Lives Matter,” “Biden,” or “Trump,” Marlow said. Students who know they’re being monitored could avoid engaging in basic academic research, what he calls “the sort of curiosity that education is made of.”
Those kids may then just turn to other devices where they know they aren’t being monitored, he said. Or, they could skip searching for help entirely.
“Do you want a student who is contemplating suicide, who wants to research services and guidance … not do that search because they don’t want it to be revealed that this is what they’re contemplating?” Marlow said. “So, you may actually deprive a student of the resources the student needs to be safe.”
For some low-income families, a school-issued device is their only computer, Marlow added. Then, the entire family is effectively being monitored by the district in violation of their privacy.
Using school-issued devices to monitor key terms could create disparities in how student data is collected and used, said Elizabeth Laird, director of Equity in Civic Technology at the Center for Democracy and Technology. Students from low-income families and students of color are already disciplined at higher rates than their peers.
If search data is used against them, it could exacerbate those disproportionate discipline rates, Laird said.
Many search algorithms miss nuanced speech, such as slang and emojis, she added. Such systems may over-flag certain students, or fail to flag them at all. In addition, some services may not work in multiple languages, a particular concern in states with large Spanish-speaking populations, like Texas.
Most districts that roll out monitoring systems do so because they love and care for students, Laird said. Schools just need to be careful to weigh the potential benefits against all of the costs.
“The main thing is that when it comes to monitoring student activity, that we don’t make some students unsafe in the name of keeping others safe,” she said. “You have to think about student safety holistically, and make sure that you’re not creating more harm, unintentionally, by deploying tools like these.”