On a steamy evening in May, 9,000 people filled Stingaree Stadium at Texas City High School for graduation night. A rainstorm delayed ceremonies by a half hour, but the school district’s facial recognition system didn’t miss a beat. Cameras positioned along the fence line allowed algorithms to check every face that walked in the gate.

As the stadium filled with families, security staff in the press box received a notification that the system had spotted someone on their watch list. It was a boy who had been expelled from the district and sent to a county disciplinary school, whose pupils are barred by district rules from visiting other campuses.

Less than 30 seconds after the boy sat down, a sheriff’s deputy asked for his name. When he replied, he was escorted from the stadium and missed his sister’s graduation. “Mama was upset, but that’s the rules,” says Mike Matranga, executive director of security at Texas City Independent School District, on the shore of Galveston Bay south of Houston.

Matranga proudly relates the incident to show how facial recognition can make schools safer. It also shows how the nation’s schoolchildren have been thrust into a debate over the value—and the risks—of AI-enhanced surveillance.

WIRED identified eight public school systems, from rural areas to giant urban districts, that have moved to install facial recognition systems in the past year. There likely are many more. The technology watched over thousands of students returning to school in recent weeks, continually checking faces against watch lists compiled by school officials and law enforcement.

Administrators say facial recognition systems are important tools to respond to or even prevent major incidents such as shootings. But the systems are also being used to enforce school rules or simply as a convenient way to monitor students.

This spring, staff at Putnam City Schools in Oklahoma needed to check whether a student reported as having run away from home was at school. Rather than ask teachers, Cory Boggs, who directs IT for the district, tapped facial recognition cameras to quickly spot the student. “It’s a very, very efficient way of monitoring a group of people,” he says. Putnam City and Texas City both bought surveillance software called Better Tomorrow from AnyVision, an Israeli startup that media reports in its home country say supplies Israeli army checkpoints in the West Bank.

Not everyone likes the idea of facial recognition in schools. Last year, parents in Lockport, New York, protested plans by school officials to install a $1.4 million facial recognition system, saying it was inappropriate to use such potentially intrusive technology on children. “The moment they turn those cameras on, every student, including my daughter, is being surveilled by a system that can track their whereabouts and their associations,” says Jim Shultz, the parent of a Lockport junior. The district says it doesn’t intend to watch students; rather, officials say they want to keep out unwelcome visitors, including suspended students and local sex offenders.

ILLUSTRATION: ELENA LACEY; GETTY IMAGES

The parent protests, reported first by the Lockport Journal, caught the attention of the New York Civil Liberties Union, which raised concerns about the accuracy of facial recognition algorithms on darker skin tones. The NYCLU noted that the district planned to include suspended students, who are disproportionately black, on its watch list. Similar worries have helped motivate cities including San Francisco and Oakland to ban their public agencies from using facial recognition. In June, the New York State Education Department ordered Lockport to halt testing of the system.

Companies selling facial recognition systems see schools as a growing market. Shootings like the murder of 14 students and three staff members at Marjory Stoneman Douglas High School in Parkland, Florida, last year drive interest and sales. Max Constant, AnyVision’s chief commercial officer, won’t disclose how many US schools the company has worked with but says its work “typically centers around areas in which previous tragedies have occurred.” In a statement, AnyVision said its technology is installed at hundreds of sites worldwide. “Our technology never catalogs or retains records of individuals screened, and AnyVision remains committed to operating under the highest level of privacy and ethical standards,” the company said.

The Parkland shooting prompted another tech firm, RealNetworks, to offer its facial recognition software to schools for free. “Parkland happened, [and] we said as a matter of public impact we will make it available,” says CEO Rob Glaser. “Districts representing over 1,000 schools have expressed interest.” Mike Vance, RealNetworks’ senior director of product management, says dozens of schools are using the technology to automatically open gates for parents or staff, or watch for persons of interest, such as parents subject to court orders in custody disputes. RealNetworks directs schools it works with to a short best-practice guide on facial recognition in schools, which discusses privacy and transparency, but the company does not monitor how schools are using its technology.

This spring, three Panasonic engineers journeyed from Houston and Japan to West Platte, Missouri, 30 miles from Kansas City. There, they helped install a $200,000 camera system the district ordered to watch over its 600 students, including licenses to equip 13 cameras with Panasonic’s FacePRO facial recognition. The cameras primarily guard school entrances and feed footage to the school’s IT office and local law enforcement, which both receive alerts when the system identifies someone on the district’s watch list. The footage is stored by default for a month, says Chad Bradley, CTO of TriCorps Security, the Oklahoma City company that oversaw the installation. Panasonic did not respond to requests for comment.

In rural east Texas, the 1,900-student Spring Hill Independent School District this summer installed cameras and facial recognition software. The $400,000 system was called into service the night before school resumed in August, after a high school student posted a threat on social media. Staff added his photo to the software’s watch list as a precaution, although the incident was resolved before school started, says superintendent Wayne Guidry. “I think our campuses are a lot safer,” he says.

Texas City, an oil town of 46,000, adopted facial recognition after two local tragedies. In 2017, after Hurricane Harvey damaged some of the district’s buildings, voters approved a $136 million bond measure to pay for four new schools, buses, and security upgrades. Days after that vote, a student walked into the art block at the high school in nearby Santa Fe, Texas, with a shotgun and revolver, killing eight students and two teachers.

Rodney Cavness, Texas City’s school superintendent, reacted quickly. Three days after the Santa Fe tragedy, he hired Matranga, a Texas City native who had spent years in the Secret Service assigned to candidate and then President Obama. “I knew we needed to do something different,” Cavness says. “I hired an expert and let him do the job.”


Matranga built a small team of military veterans and got to work. The district installed hundreds more security cameras, applied bullet-resistant film to windows, and hardened classroom doors with bolts and a remote locking system. It invested in software that trawls the web and social media for mentions of the school.

Matranga says the facial recognition system could help him move against a potential shooter more quickly—perhaps before they start shooting. The system silently compares every face to a watch list of persons of interest, such as the students from the disciplinary school. The alleged shooter in Parkland, Matranga notes, had a history of behavioral problems and had been forced to withdraw from school roughly a year before he returned with a semiautomatic rifle. Texas City doesn’t have enough software licenses to run facial recognition on all of its 1,600 cameras, so it prioritizes building entrances and switches the software to watch different feeds for special events, such as the stadium cameras for graduation. The district spent $38,000 on a server to support the system and pays an annual $26,000 subscription. “You have surveillance cameras at Disney World, why should schools be different?” Matranga asks.

When a WIRED reporter stepped out of the doughy September heat of southeast Texas into the lobby area of Matranga’s offices, the system compiled multiple images of the new face from different angles. A few mouse clicks added the face to the system’s watch list. When a camera detected the reporter walking back into the lobby, a siren sounded. Matranga, the three other members of his security team, and the district’s 19 sheriff’s deputies all received notifications.

The reporter was deleted from the watch list soon afterward; the students from the district’s disciplinary school remain, along with local registered sex offenders. So does a man who was escorted from school grounds by law enforcement and given a criminal trespass warning after arguing with his ex-partner, a parent at the high school. She provided Matranga’s team with his photo. Other images on the watch list came from a Ring doorbell camera. Matranga’s staff added them to help a local resident who complained that a child had been hanging around their house and licked their surveillance camera. So far, the system hasn’t registered a hit.

Another person on the Texas City watch list is Mandalyn Salazar, although she didn’t know it until contacted by WIRED. She does volunteer work with families in the Texas City school system. Last month she got into an argument with Matranga on the sidelines of a school board meeting. He and Salazar both say the encounter culminated in her calling Matranga an asshole, and Salazar being told she would be arrested if she returned to school property.


Salazar does not recall being informed that her image had been added to the district’s facial recognition system. Matranga says his team did so using her Facebook profile photo. “She’s irrational, and she’s unstable,” he says. “Those are the type of people that we need to be looking out for.” Salazar will remain on the watch list for a year, Matranga says.

Superintendent Cavness says the community and his student advisory council are “fine” with the district’s security upgrades and use of facial recognition. When the bell at Texas City High School rang at 2:50 pm on a Friday, teens swarmed between classes, exchanging friendly headlocks and complex handshakes without visible concern for the cameras overhead, strategically positioned over stairwells and at hallway intersections.

Isabela Johnston, a senior at Texas City High School and president of the political activism club, says not all students support the enhanced security. She wrote an editorial in the school newspaper, the Sting City Press, early this year flagging ACLU concerns about the effectiveness and racial bias of facial recognition systems. In April, Johnston polled more than 300 students about the new school safety measures; many said facial recognition and AR-15s on campus made them feel unsafe. More than 40 percent said the atmosphere at school had worsened compared with previous years.

Learning in the shadow of hardened doors, gun safes, and cameras backed by facial recognition algorithms can be stressful, Johnston says. “I don’t feel necessarily any safer or more in danger, but it is a constant reminder that something could happen,” she says. “I’ve heard a lot of my peers vocalize the same thing: We’re constantly reminded this is a possibility.”

In Texas City, that reminder is vivid because of the attack that killed 10 students and staff last year at the high school in Santa Fe, a smaller city 20 minutes away. After that tragedy, James Grassmuck, who has two children in the Santa Fe Independent School District, including one at the high school, volunteered for a newly created safety and security committee. Last winter he ran successfully for a seat on the school board; his platform included a pledge to install facial recognition.

That system is now up and running, part of more than $2 million of security upgrades since the shooting. Grassmuck says facial recognition was attractive because it is less visible than other security measures, such as metal detectors and new fencing, and that the local community has been supportive. “I’ve not heard a single complaint,” he says, before adding, his voice faltering, “but we’re in a little bit of a different situation.”

Across the country, administrators and lawmakers feel pressure to do something—anything—about the possibility of a mass shooting. Prominent attacks often trigger the release of new local, state, or federal funds for school security. One month after the Parkland shooting last year, Congress passed the Stop School Violence Act, which allocated funds for school security training and infrastructure. “Every time we’ve seen a high-profile event like this, such as Columbine or Newtown, immediately after that you’ll see legislation that’s being introduced providing more funding for surveillance systems and police officers,” says Jason Nance, a law professor at the University of Florida.

Those types of funding measures don’t typically mention specific technologies, giving schools latitude to purchase facial recognition. In West Platte, voters approved a bond initiative that allowed the tiny rural district to pay for its $200,000 upgrade, says Bradley, the consultant who installed the system. In late 2014, New York state voters approved $2 billion for technology improvements, including “high-tech security features.” According to emails obtained by the NYCLU, officials in Lockport chose to use their allocation to purchase a facial recognition system from SN Technologies after receiving a free threat assessment offered by a consultant with financial ties to the company. SN Technologies declined to answer specific questions about the consultant’s relationship.

Another place where facial-recognition-enabled cameras will soon stand watch is Fulton County, Georgia, a suburban Atlanta school district with 95,000 students. In 2017, the district upgraded its camera system with software from Motorola’s Avigilon division that offers “appearance search,” allowing searches for individuals based on the color of their shirt or hairstyle. Paul Hildreth, the district’s technology director, compares the process to Googling and says it has helped administrators investigate fights and vandalism.

In Texas City last month, students packed the gymnasium for a pep rally before a football game against Houston’s Clear Lake. The team swaggered out wearing jeans and jerseys as the school band played brassily. Afterward, as staff members swept up stray confetti, Matranga got word that someone had thrown chocolate milk over some cheerleaders. It was a petty incident but not one that went undocumented. Cameras gazed down from each corner of the gymnasium. “We’ll pull that video on Monday,” he said.