Algorithmic Extremism: Examining YouTube’s Rabbit Hole of Radicalization

other websites is less prevalent. As such, YouTube's algorithm may have a role in guiding content, and to some extent preferences, towards more extremist predispositions. Central to this critique is that while previous commentary on the role that social media websites play in spreading radicalization has focused on user contributions, the implications of the recommendation algorithm strictly implicate YouTube's own programming as the offender.

The critique of the recommendation algorithm is another difference that sets YouTube apart from other platforms. In most cases, researchers examine how users apply social media tools to spread jihadism [18] or alt-right messages of white supremacy [12]. Studies also focus on the methods content creators might use to recruit participants into various movements, for example, radical left-wing Antifa protests [31]. Nevertheless, the premise is that users of Facebook, Tumblr, or Twitter would not stumble upon extremist content unless they were actively searching for it, since the role of recommendation algorithms on those platforms is less prevalent. There are edge cases in which innocuous Twitter hashtags are co-opted for malicious purposes by extremists or trolls [19], but in general, users get what they specifically seek.

The case for YouTube is different: the recommendation algorithm is seen as a major factor in how users engage with YouTube content. Thus, the claims about YouTube's role in radicalization are twofold. First, there are content creators who publish content with the potential to radicalize [15]. Second, YouTube is scrutinized for how and where the recommendation algorithm directs user traffic [17], [15]. Nevertheless, empirical evidence of YouTube's role in radicalization is insufficient [32]. There are anecdotes of a radicalization pipeline and a hate-group rabbit hole, but academic literature on the topic is scant, as we discuss in the next section.

II. PRIOR ACADEMIC STUDIES ON YOUTUBE RADICALIZATION

Data-driven papers analyzing radicalization trends online are an emerging field of inquiry. To date, few notable studies have examined YouTube's content in relation to radicalization. As discussed, previous studies have concentrated on the content itself and have widely proposed novel means to analyze these data [13], [33], [25]. However, these studies focus on introducing means for content analysis rather than on the content analysis itself.

A few studies do go beyond content analysis methods. One such study, Ottoni et al. (2018), analyzed the language used in right-wing channels compared to baseline channels. The study concludes that there was little bias against immigrants or members of the LGBT community, but limited evidence of prejudice towards Muslims. The study did, however, find evidence of negative language used by channels labeled as right-wing. Nevertheless, this study has a few weaknesses. The authors frame their analysis as an investigation into right-wing channels but then proceed to analyze fringe conspiracy channels rather than more mainstream right-wing content. They chose the conspiracy theorist Alex Jones' InfoWars (since removed from YouTube) as their seed channel, and their list of right-wing channels reflects this particular niche. InfoWars and other conspiracy channels represent only a small segment of right-wing channels. In addition, the study applies a topic analysis method derived from the Implicit Association Test (IAT) [34].
However, the validity of the IAT has been contested [35]. In conclusion, we consider the seed channel selection problematic and the range of comparison channels too vaguely explained [36].

In addition to content analysis of YouTube videos, Ribeiro et al. (2019) took a novel approach by analyzing video comment sections, examining which types of videos individual users were likely to comment on over time. Categorizing videos into four groups, namely alt-right, alt-light, the intellectual dark web (IDW), and a control group, the authors found inconclusive evidence of migration between groups of videos.¹ The analysis shows that a portion of commenters does migrate from IDW videos to alt-light videos. There is also a tiny portion of commenter migration from the centrist IDW to the potentially radicalizing alt-right videos (a simplified sketch of this kind of commenter-migration tabulation is given below). However, we believe one cannot conclude that YouTube is a radicalizing force based on commenter traffic alone, and there are several flaws in the study's setting. Even though the study is commendable, it omits migration from the center to the left-of-center altogether, presenting a somewhat skewed view of commenter traffic. In addition, only a tiny fraction of YouTube viewers engage in commenting. For example, the most popular video by Jordan Peterson, a central figure of the IDW, has 4.7 million views but only ten thousand comments, roughly 0.2 percent of views. Moreover, commenting on a video does not necessarily indicate agreement with its content: a person leaving a comment on a controversial topic may be seeking a reaction (trolling or flaming) from the content creator or other viewers [37], [5]. We are hesitant to draw any conclusions from commenter migration without analyzing the content of the comments.

The most recent study, by Munger and Phillips (2019), directly analyzed YouTube's recommendation algorithm and suggested that the algorithm operates on a simple supply-and-demand principle. That is, rather than algorithms driving viewer preferences and further radicalization, radicalization external to YouTube inspired content creators to produce more radicalized content. The study furthermore failed to find support for radicalization pathways, instead finding that

¹The study borrows a definition of the alt-right from the Anti-Defamation League: "loose segment of the white supremacist movement consisting of individuals who reject mainstream conservatism in favor of politics that embrace racist, anti-Semitic and white supremacist ideology" (p. 2, [32]). The alt-light is defined as a civic nationalist group rather than a racial nationalist group. The third category, the "intellectual dark web" (IDW), is defined as a collection of academics and podcasters who engage in controversial topics. The fourth category, the control group, includes a selection of channels ranging from fashion magazine channels (such as Cosmopolitan and GQ Magazine) to a set of left-wing and right-wing mainstream media outlets.
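To make the commenter-migration critique concrete, the following is a minimal sketch of how such a tabulation could be computed from a per-user comment log. The log, the category labels, the year-based time windows, and the use of a modal category per window are our own simplifying assumptions for illustration; this is written in the spirit of the Ribeiro et al. (2019) approach and is not their data or pipeline.

# Hypothetical sketch: tabulate commenter migration between video categories.
# All data below is invented for illustration.
from collections import defaultdict

# Hypothetical comment log: (user_id, video_category, year) per comment.
COMMENT_LOG = [
    ("u1", "IDW", 2017), ("u1", "alt-light", 2018),
    ("u2", "IDW", 2017), ("u2", "IDW", 2018),
    ("u3", "alt-light", 2017), ("u3", "alt-right", 2018),
    ("u4", "control", 2017), ("u4", "control", 2018),
]

def migration_matrix(log, from_year, to_year):
    """Count commenters whose modal category moved between two time windows."""
    per_user_year = defaultdict(lambda: defaultdict(list))
    for user, category, year in log:
        per_user_year[user][year].append(category)

    transitions = defaultdict(int)
    for user, by_year in per_user_year.items():
        if from_year in by_year and to_year in by_year:
            # Modal (most frequent) category in each window.
            src = max(set(by_year[from_year]), key=by_year[from_year].count)
            dst = max(set(by_year[to_year]), key=by_year[to_year].count)
            transitions[(src, dst)] += 1
    return dict(transitions)

if __name__ == "__main__":
    for (src, dst), n in sorted(migration_matrix(COMMENT_LOG, 2017, 2018).items()):
        print(f"{src:>9} -> {dst:<9} {n} commenter(s)")

On real data, such raw counts would be normalized by the number of active commenters in each source category, which is exactly where the concerns raised above become material: categories omitted from the scheme (such as left-of-center channels) never appear in the matrix, and the tiny commenting rate means the matrix describes a small, possibly unrepresentative subset of viewers.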
