UCLA instructors express privacy concerns amid rise of AI notetaking platforms

Otter AI, which converts spoken audio into text, is pictured. Artificial intelligence-based transcription software has emerged as a popular notetaking tool among college students, but it has left some UCLA instructors with privacy concerns. (Andrew Ramiro Diaz/Photo editor)
By Izzy Becker
Dec. 4, 2025 9:24 p.m.
This post was updated Dec. 9 at 2:21 p.m.
Artificial intelligence-based transcription software has emerged as a popular notetaking tool among college students, but it has left some UCLA instructors with privacy concerns.
Programs like Krisp AI, Notion and Otter AI use AI algorithms and language models to transcribe spoken words into text, summarize large bodies of writing and draw key points from data. As a two-party consent state, California prohibits people from recording private conversations without the consent of all parties, according to the Digital Media Law Project.
Alex Alben, a lecturer at the UCLA School of Law, said many of the law school’s classes are recorded on Zoom, which uses its own transcription software. He added that he believes both students and instructors should be aware that a recording is happening and receive confirmation that it will be used for educational purposes.
“If you had a perfect recording of a lecture that was recorded without the professor’s consent, and then somehow the person who made the recording benefitted from reselling it or sharing it, that would definitely be an instance of an unauthorized use of the professor’s intellectual property,” he said. “That is copyright infringement.”
Alben said the fair use doctrine – a U.S. principle in copyright law that allows for the use of copyrighted material if it is for purposes of criticism, comment, news reporting, teaching, scholarship or research – complicates infringement claims.
“Under the copyright law, we want to encourage criticism and commentary in education,” he said. “Therefore, if a lecture was used in an educational setting, even not authorized, there’s a pretty good argument that that is a fair use, especially if the other classroom is not charging for the use of the lecture.”
Chris Mattmann, UCLA’s chief data and artificial intelligence officer, said these AI tools have incidental collection capabilities – they can capture information beyond what the user might think the application has access to – which may pose data privacy risks. He added that UCLA has a third-party risk management process that people can use to evaluate whether they should put data into AI technologies based on protection levels.
Enya Grooms, a second-year biology student, said that while one of her professors discouraged the use of AI note-taking tools, she believes they can be beneficial.
“AI is a way to synthesize the words and create a succinct study guide or outline of a lecture that will ultimately make the student’s learning experience a little bit more streamlined,” Grooms said.
She added that many professors disapprove of AI, which can make students feel uneasy or hesitant to use it – even though it can be used as a study or organizational tool.
A TA within a humanities department who was granted anonymity due to fear of retaliation from the university said she feels skeptical about real-time AI transcription and summary programs – specifically Otter AI, which converts spoken audio into text. She added that Otter AI’s claim of ownership over the materials it records can pose problems with intellectual property rights.
She said she is concerned about the use of AI notetakers in discussion sections, where students are asked to share their personal thoughts and opinions, sometimes through cold calling.
“You shouldn’t have to worry that if you make a mistake in the classroom, it’s recorded and permanently available to every other student in the room,” she said.
However, the TA added that she believes it can still be a helpful tool for students – especially those with disabilities.
“I absolutely understand its potential to be a very helpful technology for students,” she said. “I try to be especially understanding or empathize with my students who have asked for accommodations for accessibility purposes.”
California Governor Gavin Newsom signed Senate Bill 53 – which mandates that AI companies making more than $500 million annually must publicly communicate their safety protocols and how they align with national standards – into law Sept. 29. The law also requires the creation of an independent consortium called CalCompute that uses a public computing cluster to share research on AI.
“The big tech companies own the massive training and power needed to build frontier models,” Mattmann said.
He said he supports CalCompute and its use of state funding to broaden researchers’ access to critical data centers and training.
Mattmann also said continued work is being done at UCLA to better assess and manage AI notetaking tools.
“I’m working on a campus-wide AI inventory to categorize and classify the demand flow of people’s requests for these types of tools and understand what types of data should be going in and out of them,” he said.
Alben added that AI usage is complicated by the absence of federal direction and the fast-paced development of these technologies. He said that to avoid further legal complications, teachers should give their students clarity about what they are allowed to do with their course materials.
“We are in an ambiguous period of time, but I do think that our existing copyright laws do provide a framework,” he said.