
Algorithmic Extremism: Examining YouTube’s Rabbit Hole of Radicalization

Second, the video categorization in our study is partially subjective. Although we have taken several measures to bring objectivity into the classification and analyzed the similarity between labelers by calculating intraclass correlation coefficients (see the illustrative sketch at the end of this section), there is no way to eliminate bias entirely. There is always a possibility of disagreement and ambiguity in the categorization of political content. We therefore welcome future suggestions to help us improve our classification.

In conclusion, our study shows that one cannot proclaim that YouTube's algorithm, in its current state, is leading users towards more radical content. There is clearly plenty of content on YouTube that one might view as radicalizing or inflammatory. However, the responsibility for that content lies with the content creators and the consumers themselves. Shifting the responsibility for radicalization from users and content creators to YouTube is not supported by our data. The data show that YouTube does the exact opposite of what the radicalization claims suggest. YouTube engineers have said that 70 percent of all views are based on recommendations [38]. When this remark is combined with the fact that the algorithm clearly favors mainstream media channels, we believe it is fair to state that the majority of views are directed towards left-leaning mainstream content. We agree with Munger and Phillips (2019) that scrutiny over radicalization should be directed at the content creators and at the demand for and supply of radical content, not at the YouTube algorithm. On the contrary, the current iteration of the recommendation algorithm works against the extremists.

Nevertheless, YouTube has conducted several deletion sweeps targeting extremist content [29]. These actions might be ill-advised. Deleting extremist channels from YouTube does not reduce the supply of that content [50]; the banned content creators simply migrate to other, more permissive video hosting sites. For example, a few channels that were initially included in the Alt-right category of the Ribeiro et al. (2019) paper are now gone from YouTube but still exist on alternative platforms such as BitChute. The danger we see here is that no algorithms direct viewers from extremist content towards more centrist material on these alternative platforms or on the Dark Web, making deradicalization efforts more difficult [51]. We believe that YouTube has the potential to act as a deradicalizing force. However, it seems that the company will first have to decide whether the platform is meant for independent YouTubers or is just another outlet for mainstream media.

A. The Visualization and Other Resources

Our data, channel categorization, and the data analysis used in this study are all available on GitHub for anyone to inspect. Please visit the GitHub page for links to the data and the data visualization. We welcome comments, feedback, and critique on the channel categorization as well as on the other methods applied in this study.

B. Publication Plan

This paper has been submitted for consideration at First Monday.

C. Acknowledgments

First, we would like to thank our volunteer labeler for all the hours spent on YouTube. We would also like to thank Cody Moser, Brenton Milne, Justin Murphy, and everyone else who gave feedback on early drafts of this paper and helped with the editing.
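To make the agreement statistic mentioned above concrete, the following is a minimal Python sketch of one common form of the intraclass correlation coefficient, ICC(2,1) (two-way random effects, absolute agreement, single rater), applied to a labeler-by-channel rating matrix. The `icc2_1` function, the numeric left-right encoding, and the example ratings are illustrative assumptions for exposition only; they are not the exact procedure or data used in the study.

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_items, k_raters) matrix of numeric labels, e.g. channels
    scored by each labeler on a hypothetical -1..1 left-right scale.
    """
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-item (channel) means
    col_means = ratings.mean(axis=0)   # per-rater (labeler) means

    # Sums of squares for a two-way ANOVA without replication.
    ss_rows = k * np.sum((row_means - grand_mean) ** 2)
    ss_cols = n * np.sum((col_means - grand_mean) ** 2)
    ss_total = np.sum((ratings - grand_mean) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)                # between-channel variance
    ms_cols = ss_cols / (k - 1)                # between-labeler variance
    ms_error = ss_error / ((n - 1) * (k - 1))  # residual variance

    # Shrout & Fleiss (1979) formula for ICC(2,1).
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical example: three labelers scoring five channels.
labels = np.array([
    [-1.0, -1.0, -0.5],
    [ 0.0,  0.5,  0.0],
    [ 1.0,  1.0,  1.0],
    [-0.5,  0.0, -0.5],
    [ 0.5,  0.5,  1.0],
])
print(f"ICC(2,1) = {icc2_1(labels):.3f}")
```

An ICC near 1 indicates that labelers assign nearly identical scores to each channel, while a value near 0 indicates that disagreement between labelers dominates the differences between channels.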
