Algorithmic Extremism: Examining YouTube’s Rabbit Hole of Radicalization


Several smaller categories follow the top two. Notably, the next two largest categories are also centrist or left-leaning in their political outlook. For example, the two largest channels in the Anti-SJW category (JRE Clips and PowerfulJRE) both belong to the American podcast host Joe Rogan, who hosts guests from a wide range of political beliefs. The Unclassified group consists of centrist, mostly apolitical, educational channels such as TED or government-owned mainstream media channels such as Russia Today. Based on our flow diagram, we can see that the recommendation algorithm directs traffic from all channel groups into the two largest ones, away from more niche categories.

Fig. 4. Flow diagram presenting the flow of recommendations between different groups.

Based on these data, we can now evaluate the claims that the YouTube recommendation algorithm will recommend content that contributes to the radicalization of YouTube's user base. By analyzing each radicalization claim and whether the data support it, we can also conclude whether the YouTube algorithm has a role in political radicalization.

The first claim tested is that YouTube creates C1 - Radical Bubbles, i.e., that recommendations influence viewers of radical content to watch more similar content than they otherwise would, making it less likely that alternative views are presented. Based on our data analysis, this claim is partially supported. The flow diagram presented in Figure 4 shows a high-level view of the intra-category recommendations. The recommendations provided by the algorithm remain within the same category or within categories that bear similarity to the original content viewed by the audience. However, from the flow diagram one can observe that many channels receive fewer impressions than their view counts would suggest, i.e., the recommendation algorithm directs traffic towards other channel categories. A detailed breakdown of intra-category and cross-category recommendations, given by recommendation percentage in Figure 12 and by number of impressions in Figure 13 in Appendix B, shows the strength of intra-category recommendations by channel.

We can see that the recommendation algorithm does have an intra-category preference, but this preference depends on the channel category. For example, 51 percent of traffic from Center/Left MSM channels is directed to other channels belonging to the same category (see Figure 12). The remaining recommendations are directed mainly to two categories: Partisan Left (18.2 percent) and Partisan Right (11 percent), both primarily consisting of mainstream media channels.

Figure 5 presents a simplified version of the recommendation flows, highlighting the channel categories that benefit from the recommendation traffic. From this figure, we can observe a significant net flow of recommendations towards channels that belong to the Partisan Left category. The Social Justice category, for example, suffers from cross-category recommendations: for viewers of channels categorized as Social Justice, the algorithm presents 5.9 million more recommendations towards Partisan Left channels than vice versa, and another 5.2 million views per day towards Center/Left MSM channels. Figure 5 also shows a "pipeline" that directs traffic towards the Partisan Left category from other groups via the intermediary Center/Left MSM category. This is true even for the other beneficiary category, the Partisan Right, which loses 2.9 million recommendations to Partisan Left but benefits from a net flow of recommendations from other right-leaning categories (16.9M).
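To make the net-flow reading of Figure 5 concrete, the sketch below shows how such flows can be derived from a cross-category impression matrix. This is a minimal illustration in Python: the category names follow the paper, but the impression counts are hypothetical placeholders constructed only to reproduce the 5.9M and 5.2M net flows quoted above, not the study's raw data.

```python
# Deriving net recommendation flows between channel categories from a
# cross-category impression matrix. Counts are illustrative placeholders,
# not the study's raw numbers.

# impressions[src][dst] = daily recommendation impressions shown on videos
# of category `src` that point to videos of category `dst`.
impressions = {
    "Social Justice":  {"Social Justice": 1.0e6, "Partisan Left": 6.5e6, "Center/Left MSM": 5.8e6},
    "Partisan Left":   {"Social Justice": 0.6e6, "Partisan Left": 9.0e6, "Center/Left MSM": 4.0e6},
    "Center/Left MSM": {"Social Justice": 0.6e6, "Partisan Left": 7.0e6, "Center/Left MSM": 12.0e6},
}

def net_flow(a: str, b: str) -> float:
    """Impressions flowing a -> b minus impressions flowing b -> a."""
    return impressions[a][b] - impressions[b][a]

# A positive value means category `b` gains traffic at `a`'s expense.
print(net_flow("Social Justice", "Partisan Left"))    # 5900000.0, the 5.9M net flow
print(net_flow("Social Justice", "Center/Left MSM"))  # 5200000.0, the 5.2M net flow
```

Figure 5 is, in effect, this pairwise net-flow view of the full matrix, which is why a category such as Partisan Right can lose traffic to one neighbor while gaining from several others.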
However, when it comes to categories that could be potentially radicalizing, this claim is only partially supported. Channels that we grouped into Conspiracy Theory or White Identitarian have very low percentages of recommendations within the group itself (as shown in Figure 12). In contrast, channels that we categorized into Center/Left MSM, Partisan Left, or Partisan Right have higher shares of recommendations that remain within the group (a sketch of this within-group computation follows below). These data show that a dramatic shift towards more extreme content, as suggested by media [15] [30], is untenable.

Second, we posited that there is a C2 - Right-Wing Advantage, i.e., that YouTube's recommendation algorithm prefers right-wing content over other perspectives. This claim is not supported by the data. On the contrary, the recommendation algorithm favors content that falls within mainstream media groupings. YouTube has stated that its recommendations are
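The within-group shares referenced above can be read off the same kind of matrix by row-normalizing each category's outgoing recommendations. Again a minimal Python sketch: the destination counts are hypothetical placeholders chosen only to reproduce the Center/Left MSM percentages quoted from Figure 12.

```python
# Share of a category's outgoing recommendations that remain within the
# category itself. Counts are illustrative placeholders, not raw data.

def intra_category_share(outgoing: dict[str, float], cat: str) -> float:
    """Fraction of `cat`'s outgoing recommendations pointing back into `cat`."""
    return outgoing[cat] / sum(outgoing.values())

# Daily impressions from Center/Left MSM videos to each destination category.
from_center_left_msm = {
    "Center/Left MSM": 10.20e6,  # stays in-group -> 51 percent
    "Partisan Left":    3.64e6,  # -> 18.2 percent
    "Partisan Right":   2.20e6,  # -> 11 percent
    "Other":            3.96e6,  # all remaining categories combined
}

print(f"{intra_category_share(from_center_left_msm, 'Center/Left MSM'):.0%}")  # 51%
```

A low value of this share, as reported for the Conspiracy Theory and White Identitarian groups, means the algorithm routes most of those viewers' recommendations out of the group rather than deeper into it.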
