
UCLA researchers explore factors behind online spread of conspiracy theories

An individual searches for COVID-19 on Twitter. A team of researchers from UCLA and other universities across the country found that social media enabled and reinforced the spread of misinformation during the COVID-19 pandemic. (David Rimer/Assistant Photo editor)

By Megan McCallister

Feb. 5, 2022 3:19 p.m.

Uncertainty and isolation caused a surge in social media use during the COVID-19 pandemic and made people more vulnerable to engaging with pandemic-related conspiracy theories, researchers from UCLA and other universities found.

Researchers from UCLA, Washington University in St. Louis, Northwestern University, University of Maryland and The Ohio State University reviewed hundreds of studies to create a model that explains how and why COVID-19 conspiracy theories propagated on social media.

In their article, published in 2021, the researchers cite algorithms, bots and influencers as reasons for the contagiousness of conspiracy theories.

According to the review, global social media use increased by 13% and 330 million new users joined across platforms in 2020. The team attributed increased social media use to disruptions in how people maintain relationships and understand the world.

Social media served as a tool to help maintain social bonds and to compensate for loneliness during social distancing and isolation. Online structures also provided a space for individuals to consume, share and discuss information in an effort to regain a sense of understanding and control over their environment, according to the article.

The COVID-19 pandemic was especially prone to misinformation because it caused uncertainty, said Jennifer Whitson, co-author of the research and an associate professor of management and organizations.

“At the societal level, something (is) more subject to misinformation (if it) is vague, underexplained to the public, or not everything is known about it,” Whitson said.

Conspiracy theories can be comforting in that they provide explanations that make the world seem more orderly and structured, said Benjamin Dow, a researcher on the project from Washington University in St. Louis. Those who feel a lack of control are more open to conspiracies, he added.

Once on social platforms, users are subjected to algorithms intended to increase user engagement and bots designed to promote the visibility of specific storylines and hashtags. Bots are effective at driving the virality of conspiracy theories, making these theories appear more popular and widely supported than they are.
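As a rough illustration of that amplification effect, consider the toy calculation below, a minimal sketch with invented numbers rather than figures from the study: a small group of automated accounts posting at machine speed can dwarf organic sharing, inflating a hashtag’s apparent popularity.

```python
# Toy illustration (hypothetical numbers): a handful of bots posting
# around the clock can outproduce a much larger group of human sharers,
# inflating a hashtag's apparent popularity.

human_sharers = 500      # people who each share a couple of times a day
human_posts_per_day = 2
bots = 50                # automated accounts posting continuously
bot_posts_per_day = 720  # e.g., one post every two minutes

organic = human_sharers * human_posts_per_day    # 1,000 posts/day
amplified = organic + bots * bot_posts_per_day   # 37,000 posts/day

print(f"Organic volume: {organic:,} posts/day")
print(f"With bots:      {amplified:,} posts/day")
print(f"Apparent support inflated {amplified / organic:.0f}x")
```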

Whitson said she was surprised to find that misinformation spreads as quickly as, if not more quickly than, accurate information.

Algorithms determine patterns that predict which content would appeal to a certain type of person, said Ramesh Srinivasan, a professor of information studies. Content evokes emotional responses, and data is gathered about what types of content attract these responses and attention, he added.
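For illustration only, here is a minimal sketch of the kind of engagement-driven ranking Srinivasan describes; the scoring rule, user labels and posts are hypothetical, not drawn from any real platform’s system.

```python
# Hypothetical sketch of engagement-driven ranking: posts that provoked
# stronger reactions from similar users are predicted to earn more
# engagement and are shown first. All data here is invented.

def predicted_engagement(post, user_profile):
    # Weight each past reaction by how similar the reacting user was
    # to the current user; emotionally charged posts accumulate higher
    # scores because they drew more and stronger reactions.
    score = 0.0
    for reaction in post["reactions"]:
        similarity = user_profile.get(reaction["user_type"], 0.0)
        score += similarity * reaction["strength"]
    return score

posts = [
    {"text": "Measured take on lab-leak evidence",
     "reactions": [{"user_type": "skeptic", "strength": 0.3}]},
    {"text": "Outrage-bait bioweapon claim",
     "reactions": [{"user_type": "skeptic", "strength": 0.9},
                   {"user_type": "skeptic", "strength": 0.8}]},
]

# A user profiled as skeptical gets the more extreme post ranked first.
feed = sorted(posts,
              key=lambda p: predicted_engagement(p, {"skeptic": 1.0}),
              reverse=True)
print([p["text"] for p in feed])
```

Because the more emotionally charged post drew stronger reactions from similar users, the sketch ranks it first, mirroring Srinivasan’s point that content that gets users “all revved up” wins attention.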

“It’s possible that (COVID-19) came out of a lab, but that’s different than saying that China planned and developed a bioweapon to destroy the world,” Srinivasan said. “What happens for many people who have some skepticism is they get fed content that feeds their skepticism, but is more hardcore and more radical because that gets your attention and gets you all revved up.”

Together, algorithms and bots facilitate the inception of information echo chambers – virtual spaces in which like-minded individuals affirm and radicalize one another’s existing beliefs. Those with the most extreme views tend to dominate conversations in these communities, according to the research.

In addition to algorithms and bots, influencers who engage with conspiracies contribute to the spread of theories, according to the article. Social media uniquely allows these influential individuals to have unchecked access to a large audience with whom they can share information without the restriction of formal journalistic laws and standards. Followers often view influencers as trusted friends, and this trust makes followers less likely to question any information the influencer may promote.

It’s easier to find like-minded individuals online, said Cynthia Wang, a contributing researcher from Northwestern University. In offline interactions, one is more likely to encounter diverse perspectives and experience disagreements, she added.

However, for members of these echo chambers, offline disagreements can ultimately become reinforcement for a preexisting belief.

“The conspiracy believer getting into that disagreement might not actually be treating the disagreement as any real human interaction or change of information,” Whitson said. “They’re just doing it to go take it back to the online community, and say, ‘Look what happened to me, somebody yelled at me because I believed in this thing,’ and then that’s sort of reinforcing and reinvesting in their beliefs and in their social connections.”

Positive reinforcement feedback loops increase engagement with like-minded people, ultimately reinforcing social identities associated with shared beliefs as well. In this way, echo chambers can become what the researchers referred to as identity bubbles, where those who subscribe to the identity feel a sense of belonging.
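A minimal simulation, with invented parameters, can make that loop concrete: affirmation from like-minded peers nudges belief upward, stronger belief drives more in-group engagement, and more engagement invites more affirmation.

```python
# Hypothetical feedback-loop sketch: affirmation strengthens belief,
# stronger belief drives more in-group engagement, and more engagement
# attracts more affirmation. All parameters are invented.

belief = 0.3      # initial commitment to the shared belief (0 to 1)
engagement = 0.2  # share of time spent in the like-minded community

for week in range(10):
    affirmation = engagement * 0.5           # more time in-group, more praise
    belief = min(1.0, belief + affirmation * 0.3)
    engagement = min(1.0, engagement + belief * 0.1)
    print(f"week {week + 1}: belief={belief:.2f}, "
          f"engagement={engagement:.2f}")
# Both values ratchet upward together: the loop feeds itself.
```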

Public displays of identity, such as not wearing a mask when required or using insider language like “plandemic” – the idea that the COVID-19 pandemic was planned by the Chinese government – are used to signal membership in COVID-19 conspiracy communities to both insiders and outsiders.

“The thing about being part of a community like that and becoming really heavily identified with them is you actually end up fearing being ostracized from those communities more than sometimes even death itself,” Dow said.

Continuous positive reinforcement from conspiracy communities, along with the entanglement of conspiracy beliefs with social identities, makes beliefs more “sticky,” or resistant to change.

The stickiness of conspiracy theories has made debunking largely ineffective, Whitson said. Debunking can even result in a backfire effect that strengthens an individual’s commitment to an ideology.

Content restriction, another method for inhibiting the spread of conspiracy theories, has proven to reduce exposure but is highly controversial, according to the research.

Existing interventions primarily focus on rationalizing and could be more effective if combined with more socially based interventions, Whitson said. Social interventions work to dissociate identity from ideology and to reestablish social connections lost because of disruptions in social structures during the pandemic.

The team recently received a grant to conduct further research on potential social interventions, Wang added.

“Factually debunking conspiracy theories is very important,” Whitson said. “The challenge is that there’s a lot of research that shows that if people are deeply invested in conspiracy theories, even when presented with this information, they’re not convinced. And so we’re saying that both are necessary.”
