Algorithmic Extremism: Examining YouTube’s Rabbit Hole of Radicalization

arXiv:1912.11211v1 [cs.SI] 24 Dec 2019

Mark Ledwich
Brisbane, Australia
mark@ledwich.com.au

Anna Zaitsev
School of Information, University of California, Berkeley
Berkeley, United States
anna.zaitsev@ischool.berkeley.edu

Abstract—The role that YouTube and its behind-the-scenes recommendation algorithm plays in encouraging online radicalization has been suggested by journalists and academics alike. This study directly quantifies these claims by examining the role that YouTube's algorithm plays in suggesting radicalized content. After categorizing nearly 800 political channels, we were able to differentiate between political schemas in order to analyze the algorithm's traffic flows out of and between each group. After conducting a detailed analysis of the recommendations received by each channel type, we refute the popular radicalization claims. To the contrary, these data suggest that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels, with a slant towards left-leaning or politically neutral channels. Our study thus suggests that YouTube's recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets.

Index Terms—YouTube, recommendation algorithm, radicalization

I. INTRODUCTION

The internet can be both a powerful force for good, prosocial behaviors, by providing means for civic participation and community organization [1], and an attractor for antisocial behaviors that create polarizing extremism [2]. This dual nature of the internet has been evident since the early days of online communication, where "flame-wars" and "trolling" have been present in online communities for over two decades [3] [4] [5]. While such behaviors were previously confined to Usenet message boards and limited IRC channels, with the expansion of social media, blogs, and microblogging following the rapid growth of internet participation rates, these inflammatory behaviors are no longer contained and have spilled out of their early back-channels into public consciousness [6]. The explosion of platforms, as well as ebbs and flows in the political climate, has exacerbated the prevalence of antisocial messaging [7]. Research focusing on uninhibited or antisocial communication, as well as extremist messaging online, has previously been conducted on platforms including Facebook [8], Twitter [9], Reddit [10], 4chan and 8chan [11] [12], Tumblr [13], and even knitting forums such as Ravelry [14].

In addition to these prior studies on other platforms, attention has recently been paid to the role that YouTube may play as a platform for radicalization [15] [16] [17]. As a content host, YouTube provides a great opportunity for broadcasting a large and widely diverse set of ideas to millions of people worldwide. Included among general content creators are those who specifically target users with polarizing and radicalizing political content. While YouTube and other social media platforms have generally taken a strict stance against most inflammatory material on their platforms, extremist groups ranging from jihadi terrorist organizations [18] [19] to various political positions [20] and conspiracy theorists have nonetheless been able to permeate the content barrier [21].

Extreme content exists on a spectrum. YouTube and other social media platforms have generally taken a strict stance against the most inflammatory materials or materials that are outright illegal. No social media platform tolerates ISIS beheading videos, child pornography, or videos depicting cruelty towards animals. There seems to be a consensus amongst all social media platforms that human moderators or moderation algorithms will remove this type of content [22]. YouTube's automatic removal of the most extreme content, such as explicitly violent acts, child pornography, and animal cruelty, has created a new era of algorithmic data mining [23] [24] [13]. These methods range from metadata scans [25] to sentiment analysis [26].

Nevertheless, content that falls within an ideological grey area, or that can be perceived as "radicalizing", exists on YouTube [27]. Definitions of free speech differ from country to country. However, YouTube operates on a global scale within the cultural background of the United States, with its robust legislation protecting speech [7]. Even if there are limitations to what YouTube will broadcast, the platform does allow a fair bit of content that could be deemed radicalizing, whether by accident or through a lack of monitoring resources.

Measures such as demonetization, flagging, and comment limiting are several of the tools available to content moderators on YouTube [28]. Nevertheless, removing or demonetizing videos or channels that present inflammatory content has not curtailed scrutiny of YouTube by popular media [29]. Recently, the New York Times published a series of articles notably critiquing YouTube's recommendation algorithm, which suggests related videos for users based on their prior preferences and on users with similar preferences [30] [15]. The argument put forward by the NYT is that users would not otherwise have stumbled upon extremist content if they were not actively searching for it, since the role of recommendation algorithms for content on
