The rapid growth of Artificial Intelligence (AI) and technology has provided schools with accessible tools and effective alternatives for daily operations. Now, district leaders may be considering whether they can, or should, use new technology as part of their safety and security efforts.
Gone are the days when student monitoring consisted of intercepting passed notes, signing out hall passes, and checking lanyard IDs at the door. Methods of mitigating bullying, student victimization, and on-campus weapons use have evolved to include expertly programmed metal detectors, digital hall passes, online activity monitoring programs, and AI video tracking systems. The efficiency with which AI and updated software can detect risks and respond accordingly is tangible, but so are their inaccuracies and imperfections compared to manual methods.
Districts need to understand the full implications of AI safety technology before making a long-term decision with financial and social consequences. Implementation costs and student response to increased monitoring are the most significant factors to weigh when considering security upgrades.
What AI Can Do
In addition to firearm attacks, campus risks today include cyberbullying, nonfatal crimes against students, safety threats against teachers, and student conduct violations. According to a recent report, occurrence rates within these domains have generally decreased over the past decade, except for cyberbullying, which nearly doubled from 2010 to 2020.1
As a result, campus security measures have evolved in step with these threats, as the following programs illustrate:
- Bark: A free, AI-enabled online monitoring program used on school-issued devices to identify safety risks students pose to themselves or others. In 2021, it was reported that the software had identified 5,000 self-harm risks in the span of one week.2
- Lightspeed: Another internet and student activity monitoring software with AI capability. A California district that implemented this program was notified of five risks within the first week of the semester.2
- e-hallpass: A digital location and time tracking program used to track and limit student activity in hallways, bathrooms, and other campus areas.
- Evolv: A weapons screening system that uses AI to improve accuracy and speed. Using Evolv sensors, a North Carolina school district decreased its gun identification occurrences from 30 to three within one school year.3
- ZeroEyes: Supplementary software that works with a school's existing camera system to identify and track guns using AI and trained staff.
These tools can oversee student activity with automated alerts and action steps. Because they work faster than manual tracking, they have the potential to prevent, reduce, or stop harm on school grounds more efficiently than traditional security methods. However, they are not infallible, and they come with notable shortcomings as well as privacy concerns.
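To make the alert-and-action-step pattern concrete, below is a minimal sketch of how a triage pipeline might route automated detections to people. None of the vendors above publish their internals, so every name, severity level, and action here is an illustrative assumption rather than any product's actual behavior.

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    LOW = 1       # e.g., a blocked website category
    MEDIUM = 2    # e.g., flagged self-harm language
    HIGH = 3      # e.g., a possible weapon on camera

@dataclass
class Alert:
    source: str         # hypothetical: "online_monitor", "weapons_screen"
    detail: str
    severity: Severity

def route_alert(alert: Alert) -> str:
    """Map an automated detection to a human action step (assumed policy)."""
    if alert.severity is Severity.HIGH:
        return "Page the school resource officer and administration now."
    if alert.severity is Severity.MEDIUM:
        return "Queue for same-day review by a counselor."
    return "Log for weekly review."

# Example: a medium-severity flag from the online monitor.
print(route_alert(Alert("online_monitor", "self-harm keywords", Severity.MEDIUM)))
```

The design point is that the software only routes; a person still decides, which is exactly the limitation the next section takes up.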
What AI Cannot Do
At their current stage, security programs and AI tools are not capable of fully replacing human monitoring and safety oversight in schools.
In addition to errors, like Evolv’s failure to identify a knife later associated with an incident, some programs still rely on personnel input before action can be taken.3 ZeroEyes, for example, uses AI to identify guns from live security footage feeds, but each identification alert is manually reviewed by experienced internal professionals to determine authenticity and appropriate next steps.4
Like traditional security approaches, AI programs may be under-sensitive or overreactive, causing false alarms or, worse, no alarms when it matters most. Schools must carefully calibrate and monitor new interfaces, sensors, and programs to ensure accuracy and efficacy. This may require more personnel than previous safety protocols, a cost that must be weighed against the anticipated gains in speed and precision from the new technology.
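A quick base-rate calculation illustrates the stakes. The figures below (entry volume, threat prevalence, detector sensitivity, false-positive rate) are assumptions chosen for illustration, not published vendor statistics:

```python
# Base-rate arithmetic with assumed numbers (not vendor statistics):
# a screening system checks 2,000 entries per day, a true threat appears
# once per 10,000 entries, and the detector catches 95% of real threats
# while falsely flagging 1% of harmless entries.
entries_per_day = 2_000
prevalence = 1 / 10_000
sensitivity = 0.95
false_positive_rate = 0.01

true_alarms = entries_per_day * prevalence * sensitivity                 # ~0.19/day
false_alarms = entries_per_day * (1 - prevalence) * false_positive_rate  # ~20/day
precision = true_alarms / (true_alarms + false_alarms)

print(f"Expected alarms/day: {true_alarms + false_alarms:.1f}, "
      f"share genuine: {precision:.1%}")
# -> Expected alarms/day: 20.2, share genuine: 0.9%
```

Roughly twenty alarms a day, fewer than one in a hundred of them genuine: every alarm still needs a human reviewer, which is the personnel cost described above.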
Student Response and Implications
Research has established that an increased security presence negatively impacts students socially and academically, because students interpret greater security as a sign of greater threat.5 The enhanced discretion and reduced visibility of newer security devices and programs can potentially mitigate this effect, but concerns about privacy and autonomy may grow in turn.
Software like Bark for online monitoring and e-hallpass for physical activity tracking may feel intrusive to students and erode morale if they feel a loss of autonomy more befitting an institution than a learning space. Moreover, using online monitoring programs to protect students may set a precedent that they cannot be trusted to act independently, and it raises ethical data privacy challenges for minors.
Another disconcerting aspect of AI-powered security is the potential for bias. Some tools could reduce bias by focusing exclusively on objective data, such as weapon identifiers, but others, like online activity trackers, may rely more on subjective interpretation, especially when school personnel make the final evaluation before responsive action.
Accordingly, thorough research and surveying of all stakeholders are essential prior to full integration to better anticipate and address concerns.
Getting Support: Weighing Costs and Community Buy-in
Emerging AI security technology has a wide price range contingent on desired features, district size, and length of use.
In the lower range, a Texas school district paid $6,000 for a 90-day trial of an online monitoring software,6 while a midrange addition of AI to camera feeds cost a New Jersey district $76,000.7 For larger service areas, the required funding may stretch into the millions, such as the $3.7 million investment by Utica City Schools in New York in an AI-supported weapons detector.3
Furthermore, because these programs and technologies are constantly developing, funding may inadvertently go toward a soon-to-be-obsolete product, or long-term costs may rise once personnel oversight is added to technology that was originally intended to automate those very activities.
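A back-of-the-envelope total-cost-of-ownership sketch makes this trade-off visible. Only the $76,000 license figure comes from the examples above, and treating it as an annual fee is itself an assumption; the staffing, rate, and lifespan numbers are illustrative guesses:

```python
# Back-of-the-envelope cost sketch. Assumed figures throughout:
# the $76,000 license (the New Jersey example) is treated as annual,
# and staffing, rates, and lifespan are illustrative guesses.
annual_license = 76_000        # camera-AI fee, assumed to recur yearly
reviewer_hours_per_day = 2     # staff time spent triaging alerts
hourly_rate = 35               # loaded hourly cost of that staff member
school_days = 180
useful_life_years = 4          # before the product is outdated or replaced

annual_oversight = reviewer_hours_per_day * hourly_rate * school_days
total = (annual_license + annual_oversight) * useful_life_years
print(f"Oversight adds ${annual_oversight:,}/yr; "
      f"{useful_life_years}-year total: ${total:,}")
# -> Oversight adds $12,600/yr; 4-year total: $354,400
```

Even modest review staffing adds a recurring five-figure line item, which is why the oversight cost belongs in the same analysis as the license fee.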
Security upgrades need to be analyzed against each school's needs, as well as the community's likelihood of buying in when it votes on the school bond referendums that would provide funding. School leaders can weigh the full risks, rewards, and public opinion around new safety implementations by examining relevant data to determine necessity, reading ethics and legal reports on the intended software, communicating with students and teachers to gauge interest, and developing a full understanding of the technology's impact so they are prepared to navigate the politics of the new investment.
AI for Security Support, Not Substitution
Some school leaders may oppose new technology, while others are eager to integrate new supports and replacements. The goal for schools now is to find a happy medium: a solution that provides vital security for everyone in the building without going further than needed.
Undoubtedly, AI tools have the potential to make schools safer than was previously feasible, but that potential shouldn't come at the cost of student unease or a large financial outlay that may not be warranted.
All upgrades must be considered carefully, and in collaboration with existing personnel, so that school security is supported, not replaced, by automated technology.
1. Irwin, V., Wang, K., Cui, J., & Thompson, A. (2023, September 13). Report on indicators of school crime and safety: 2022 and indicator 2: Incidence of victimization at school and away from school. Bureau of Justice Statistics. https://bjs.ojp.gov/library/publications/report-indicators-school-crime-and-safety-2022-and-indicator-2-incidence
2. Schlitz, H. (2021, September 21). Schools often use AI to find students in crisis — one software-monitoring company reported 5,000 self-harm situations in a single week. Business Insider. https://www.businessinsider.com/school-students-communication-monitoring-ai-software-self-harm-suicide-risks-2021-9
3. Clark, J. (2023, May 4). JCPS is poised to spend $17 million on AI weapons detection. Is it worth it? Louisville Public Media. https://www.lpm.org/news/2023-05-04/jcps-is-poised-to-spend-17-million-on-ai-weapons-detection-is-it-worth-it
4. Kukulka, M. (2023, August 26). Adrian schools deploy AI gun-detection tech to security system. MLive. https://www.mlive.com/news/jackson/2023/08/adrian-schools-ai-gun-detection-tech-to-security-system.html
5. Mowen, T. J., & Freng, A. (2019). Is more necessarily better? School security and perceptions of safety among students and parents in the United States. American Journal of Criminal Justice, 44(3), 376–394. https://doi.org/10.1007/s12103-018-9461-7
6. Windes, I. (2023, March 1). San Antonio schools pilot AI surveillance tool. San Antonio Report. https://sanantonioreport.org/saisd-artificial-intelligence-surveillance-tool/
7. DeGerolamo, D. (2023, August 8). Phillipsburg school board gives green light to AI technology upgrade for enhanced security surveillance. TAPinto. https://www.tapinto.net/towns/phillipsburg/sections/education/articles/phillipsburg-school-board-gives-green-light-to-ai-technology-upgrade-for-enhanced-security-surveillance