
Code Red: Episode 1

Photo credit: Izzy Greig

By Ciara Terry and Izzy Greig

April 26, 2024 11:30 p.m.

Conspiracy theories and extremist beliefs seem more widespread on the internet today. You may even know someone who is increasingly consumed with such content. This miniseries seeks to explain why people are attracted to extremist content and how online platforms foster extremism.

In this episode, Podcasts contributors Izzy Greig and Ciara Terry explore the internal factors that drive people toward extremist activity online.

Ciara Terry: Hello, and welcome to the first episode of Code Red, a three-part miniseries by Daily Bruin Podcasts that explores the causes and dangers of extremism in the online world. I'm one of your hosts, Ciara Terry, and I'm a Podcasts contributor and a second-year student here at UCLA. And I'm here with my co-host, Izzy!

Izzy Greig: Hi! I’m Izzy Greig, a Podcasts contributor and a first-year student here at UCLA.

CT: In this first episode, Izzy and I will be defining online extremism, discussing some examples, and also talking about the personal or internal factors that can prompt extremist activity online.

To start us off, we thought it would be useful to define the term and discuss different perspectives on the phenomenon. A subject this sensitive can look different depending on the circumstances or the person in question. Since you are listening to a digital form of news at this very moment, you may well have encountered online extremist content yourself, and what you have encountered is likely different from what someone else has seen, so your definition of online extremism may differ from theirs. For the sake of this podcast series, however, we are going to outline some defining traits or characteristics that identify online content as 'extremist.'

IG: When we began researching this topic, we found that many researchers and academics had similar definitions of online extremism. For instance, researchers Jens Binder and Jonathon Kenyon conducted a study in 2022 that investigated "terrorism and the internet." They defined online extremism as "a process during which individuals get exposed to, imitate and internalize extremist beliefs and attitudes, by means of the Internet, in particular social media, and other forms of online communication." This exposure happens in part because existing extremist groups on the internet have "a high interest in recruitment and a readiness to invest resources in communication and outreach activities."

CT: This definition is actually very similar to something my World Politics professor Eric Min recently mentioned; he defined extremists as people with beliefs that are uncommon due to their radical nature. This is not to say that researchers and academics have identical definitions of the term, but it is an interesting parallel that I noticed. Also, the U.S. Department of Justice defines online extremism as content that aims to “provoke negative sentiment toward enemies, incite people to violence…[and] create virtual communities with like-minded individuals.” More specifically, this source states that “extremists post incendiary materials such as education videos about how to construct explosives and operate weapons.” So, clearly these two sources identify multiple defining characteristics of online extremism: a goal to convince internet users of certain beliefs, the consistent production of online content pertaining to a specific belief system, and the encouragement or discussion of radical or violent activity to occur beyond the internet.

IG: Considering these definitions provided by researchers and academics, we were also interested in a college student's perspective on this subject, simply to gain insight into what non-researchers or bystanders may consider online extremism to be. Oliver Petherbridge is a second-year student studying English at a community college in Pasadena and a former executive member of Loyola Marymount's chapter of Turning Point USA, a nonprofit conservative political organization. He defines online extremism as "the act of operating on a daily basis with a severely warped perception of reality that's induced by the overuse of social media or other internet outlets." Additionally, he believes that "real extremism is never really news that's being reported. It's usually people that are inventing delusions and conspiracies and really rallying around those ideas." So again we see a common element of the definition: "rallying" around uncommon beliefs or ideas. It may be the uncommon belief system itself that brings communities of extremists together online and, in turn, encourages members to seek new recruits through online interactions.

CT: We were also interested in quantifying online activity, mainly because the articles and studies we found identified online extremism as a pressing issue, and we were curious just how widespread and fast-growing the phenomenon is. Since we are discussing extremist activity that occurs digitally, we wanted to know how many people actually use the internet and social media.

So, according to a January 2024 report from Statista, there are 5.35 billion internet users and 5.04 billion social media users across the world. Considering that the world's population is roughly 8.1 billion, these figures indicate that about two-thirds of the world's population actively uses the internet or social media.

IG: Statista also projects that the "social media penetration rate" will continue to rise in the coming years. We think these statistics on general internet and social media usage are important when discussing online extremism because these platforms play a huge role in the kind of content any person or group is permitted or motivated to make, which we will discuss further later in the episode.

CT: Now that we have defined online extremism, we also want to talk about the causes or roots of the phenomenon. More specifically, we want to discuss the personal and internal factors that may motivate an individual to begin posting or interacting with extremist content online. First, we looked at a 2021 study by Ryan Andrew Brown and others that interviewed former extremists, as well as their family and friends, to better understand why they turned to extremism in the first place. Across the 32 cases of radicalization investigated, the three motivating factors mentioned most consistently in the interviews were financial instability, present in 22 of the 32 cases, mental health issues, noted in 17 cases, and marginalization, noted in 16 cases.

IG: In the same study, one former white supremacist said it was his inability to find a job as a contractor after leaving the military that led him to "blame his problems on somebody else" and adopt extremist opinions about race. In 17 of the 32 cases, the turn to extremist beliefs or activity online followed a recent "dramatic or traumatic" event, including "a gun possession charge, rejection by the military, a friend's suicide, and an extended period of unemployment."

CT: It's interesting to hear from former extremists themselves on why they believe they were prompted to partake in extremist activity. We also wanted to hear from someone outside the research field, so we went back to Oliver Petherbridge, the second-year college student, who believes that "living in a damaging spiritual and psychological landscape" can lead people to "become aimless," which then leads them to "lash out" online. This echoes Brown's findings, suggesting that a person's mental state can play a large role.

IG: Along with these personal motivating factors, studies show that the anonymity the internet offers, in which someone can hide behind a fake name or account, plays a role as well. A 2022 study by Joe Whitaker argued that "the Internet's anonymity facilitates an environment where an individual can experiment with an array of ideas with little consequence while having limitless access to propaganda." The internet gives people free rein to "become more than their offline personas," leading them to post or comment on things they might never say off the internet.

CT: When we asked Oliver Petherbridge about the role of anonymity, he provided an answer similar to the one posed by Whitaker’s study, suggesting that using the internet to share beliefs and ideas leads people to feel “lost in the crowd,” and their actions would therefore be “less scrutinized.” This then serves as further motivation to participate.

IG: Now that we have defined online extremism and discussed what it can look like, we want to provide some real-life examples of online extremist activity and content. Perhaps you have encountered examples of online extremism yourself, or you have heard about it from others, but it does not take much research or aimless scrolling to come across extremist activity online.

CT: There are two particular examples that we wanted to highlight today, both of which we found in a U.S. Department of Justice Awareness Brief titled "Online Radicalization to Violent Extremism." The first story outlines the turn to violence by Anders Breivik. According to outside sources, Breivik was formerly a "normal" and "sociable" man. However, after a career failure in 2006, he moved in with his mother, claiming that he was going to gift himself a year off from working to "play video games." Instead, much of his newly free time was spent online, where he began to write violent content aimed at rallying right-wing activists. He continued to withdraw from his social life and dedicate more time to his online activities, and in 2011 he carried out a bombing and shootings in Norway.

IG: The second story mentioned in this brief was that of Zachary Chesser, who was initially a gifted high school student in Virginia, working at a video store and participating in his school's breakdancing club. However, at the age of 18, he converted to Islam and, according to the source, "quickly became radicalized, solely on the internet." He later quit his job so he could spend more time making graphics and contributing online to "promote extremist messages," including making threatening videos on YouTube. Chesser also tried to join Al-Shabaab, and one of his attempts led to him being questioned by police and ultimately sentenced to 25 years in prison.

CT: These two examples demonstrate how major life shifts and mental health struggles can drive people to participate in extremism online.

We hope that this episode gave you a better understanding of what online extremism is and what internal factors drive people toward extremist beliefs.

IG: In the next episode of this miniseries, we will discuss how the features of online platforms foster extremism.

CT: Again, my name is Ciara Terry.

IG: And my name is Izzy Greig, and thank you for listening!

CT: Tune in next week for the second episode of Code Red.
