This post was updated Feb. 8 at 2:07 p.m.
UCLA is considering adding facial recognition software to the campus’ security camera system, prompting concern from students.
Approximately 200 students gathered at a town hall meeting Wednesday to raise concerns about the addition of the software, which was proposed in a revised draft of Policy #133, a policy outlining UCLA’s security camera system and procedures for its revision. Administrative Vice Chancellor Michael Beck explained how the technology could be implemented at UCLA and opened the floor for students to voice their opinions.
The software was originally intended to be added to Policy #133 last year, but its implementation was postponed after public feedback, Beck said. An interim Policy #133 was implemented last year; before the interim policy was put in place, there were no guidelines for security camera systems, or SCS, Beck said.
If implemented, the facial recognition software would be used as multifactor authentication for restricted areas on campus and for identifying individuals who have a “campus exclusion order,” according to the updated draft of Policy #133.
A campus exclusion order means an individual has a stay-away order, a court-issued restraining order or has been mentioned in a law-enforcement bulletin, according to the most recent draft of Policy #133. However, only four or five stay-away orders are issued per year, Beck said.
The technology could also be used to identify individuals who have temporary restraining orders, such as students’ domestic abusers, Beck said.
Several students argued that investing in student resources would be more helpful than administering the software.
“It seems like a very temporary fix,” said Angela Li, a third-year political science and Asian American studies student. “I think maybe devoting resources to working with (domestic violence) victims … like helping them to reintegrate back to the school community is much more of a beneficial aspect.”
Students from the undocumented community raised concerns that implementing the technology could put their families in danger.
“I don’t want (my family) to have the fear of having their face scanned because a lot of my family are already scared walking on the streets,” said Madeleine Flores, a first-year psychobiology student. “Having (their information) put into a system, it (makes) that fear (rise) a thousand percent.”
The software is not intended to store information but to protect restricted areas and people on campus, Beck said. The system would check whether the person on camera matches information in a limited database, which would either grant them access to a restricted area or signal that they are a threat to campus, Beck said.
The footage of people who are not a part of the limited database will be stored for a minimum of 30 days and a maximum of 90 days, Beck said.
There was also concern over the software interfering with students’ privacy on campus.
Student leaders from UCLA’s Community Programs Office hosted the event. Two of the organizers, Salvador Martinez and Oscar Macias, said that implementation of the software would target marginalized communities.
“History shows that surveillance tools are used to target and criminalize marginalized communities with a centralized (camera) system,” said Martinez, a fourth-year applied mathematics student. “UCLA will be taking a step backwards by reinforcing a police state … which creates an environment of hostility for students, faculty and staff.”
Students added that the implementation of the software would restrict their freedom.
“This is supposed to be a public university … and (I’m) supposed to feel free,” said Iverson Mitchell, a first-year aerospace engineering student.
The added technology would also prevent students from comfortably accessing resources on campus such as the UCLA LGBTQ Resource Center, said Macias, a fifth-year sociology student.
Overall, many students expressed their discontent with UCLA possibly adding the technology to the SCS.
“I don’t want other students or my community to be (surveilled) or policed,” said Nicole Nukpese, one of the organizers of the event. “I just hope that in the future, and even soon, that it’s not implemented.”
Beck said the addition of the software is still under consideration, and the input communicated at the event will be taken into account as well.