Q&A: Professor addresses the role and repercussions of being a content moderator
UCLA assistant professor Sarah Roberts analyzed the mental and physical effects of being a social media content moderator in her book, “Behind the Screen: Content Moderation in the Shadows of Social Media,” which was released in June. (Courtesy of Stella Kalinina)
Aug. 25, 2019 9:07 p.m.
This post was updated August 25 at 9:14 p.m.
Followers of social media accounts aren’t the only ones reviewing online posts – human content moderators are too.
Sarah Roberts, UCLA assistant professor of information studies, released her first book, “Behind the Screen: Content Moderation in the Shadows of Social Media,” in June. In her book, Roberts explores how content moderators are treated in the social media industry. Through interviews with content moderators, Roberts said she uncovered the typical conditions they face, as well as the emotional and physical effects of their work environment.
Roberts spoke with Daily Bruin’s Alexsandra Coltun Schneider about her research, sensitivity while interviewing subjects and how content moderators are often undervalued.
Daily Bruin: What are the daily tasks of content moderators on social media platforms?
Sarah Roberts: Let’s say you come across something that you believe is objectionable in such a way that it violates the rules of the platform you are on at the particular moment. Typically, as a user, you have a way to make a report about that. That report disappears from your perspective, but in fact, it goes off and gets routed on the back end to these people who – depending on where they work and what the system is that’s been developed – might get a whole bunch of stuff in a glut. Or, it might be very specifically triaged depending on what you alleged was wrong.
In any way you slice it, what ends up happening is that there’s a human being who receives reports of violations of whatever type and … has to decide, “Is this actually in violation?” Then what action, if any, should be taken on that. They do that hundreds, thousands of times per day, all day, every day for those who are full time.
DB: In what ways are social media companies exploiting the content moderators?
SR: I think certainly when we’re talking about a group of workers who do a mission-critical type of activity – which is what anyone in the industry would describe this work as being – you look at it as being very precarious in terms of tenure or job stability. (Then) you look at it as low wage – that’s where we start using terms like exploitation, when we realize that many of the workers are sourced from contracting companies, for example, to avoid doing things like providing them with full benefits that other employees get. … (The companies) also denote the work as being low skill, low level, whereas we could make an argument that it’s actually quite sophisticated, complex and requires a great deal of specific skills.
DB: How did you go about your research and writing your book given the sensitive content of it?
SR: Early on, I was led to the subject area by a report in The New York Times from the summer of 2010. It was a short story, but it had a big impact on me. It talked about some workers in Iowa who were doing this work. What the reporter did note was that it seemed like the workers he was talking to were experiencing emotional difficulty. … When I thought about wanting to learn more and learn more from the workers, I was very conscious that what I didn’t want to do was go in there … for the “tell me the worst thing you’ve ever seen” and “give me all the dirt” because these were the workers that I perceived might already be fragile from the work that they did. … The other piece that might be important was establishing anonymity for them.
DB: In what ways were the content moderators affected by their work?
SR: It’s the kind of thing that really varies from person to person … based on life experience, tolerance and all kinds of factors. Sometimes people just have a harder day. Many of them told me that … they had mixed feelings a lot of times. They had this sense of pride or this sense of doing something for the collective good that had to go unheralded, so there was frustration. On the other hand, sometimes (they) expressed feeling like it’s all futile or feeling even a total disappointment in humanity. Those were sentiments that you don’t leave at the office, you take that home with you. A lot of times they would talk about having difficulty putting those barriers around their emotional feelings off the clock.
I don’t think anyone in the industry really knows what the long-term effects are. At any given time there might be 100,000 people in the world doing some aspect of this work, and since we know it’s a high-turnover kind of job, we’re creating this whole region of people whose future psychological well-being we have no information on. People reported things to me like increases (in) drinking, intimacy problems, feeling withdrawn (and) a lack of desire to socialize. How will that look down the road?
DB: What are the experiences you typically heard from the content moderators?
SR: All of the people that I spoke to, wherever they were in the world and whatever part of this work they touched, tried to make sense of it for themselves and for the bigger picture. They tried to give it value – and that was a very interesting phenomenon. They didn’t feel necessarily like it was just throwaway work, even when that was sometimes the message they were getting externally, or even when they had a part of themselves that felt that way. I think readers can decide whether that’s a sort of selling oneself a bill of goods or whether there’s truth to it – or something in between.