YouTube Algorithm and the Alt-Right Filter Bubble


…that the algorithm is showing an unexplained bias toward suggesting alt-right content to users, even promoting it to an inflated presence on the website without any prior indication of this preference. Some employees within YouTube “raised concerns about the mass of false, incendiary and toxic content that the world’s largest video site surfaced and spread” (Bergen, 2019). A user-run video database without moderation is dangerous on its own, but that is not exactly what was happening here; the algorithm was interfering with people’s preferences and seemed to be pushing racist and alt-right propaganda to the surface. An employee at YouTube proposed a new “vertical” in 2018, “a category that the company uses to group its mountain of video footage,” suggesting that this section of videos should be dedicated to the alt-right (Bergen, 2019). “Based on engagement, the hypothetical alt-right category sat with music, sports and gaming as the most popular channels at YouTube, an attempt to show how critical these videos were to YouTube’s business” (Bergen, 2019). While this one employee’s efforts may not reflect the values of YouTube as a whole, they were not the first to notice the massive amount of white supremacist videos on the video-sharing site. The Southern Poverty Law Center published articles as early as 2007 observing that, compared to dropping pamphlets on lawns, “posting video footage [on video sharing sites] is vastly less difficult, expensive, risky and time-consuming—and it can be done anonymously with virtually no effort” (Mock, 2007).

YouTube has always had hate speech policies, but it updated them in June 2019 to specifically target white nationalists by condemning behavior that might “[c]all for the subjugation or domination over individuals or groups” or “[d]eny that a well-documented, violent event took place,” such as the Holocaust, the denial of which is a conspiracy theory popular with many members of the alt-right (“Hate speech policy,” 2019). There was evidence that YouTube knew about the problematic content in 2017, because Susan Wojcicki mentioned in a 2018 Wired interview that “we started working on making sure we were removing violent extremist content. And what we started doing for the first time last year was using machines to find this content. So we built a classifier to identify the content and lo and behold, the machines found a lot more content than we had found” (Thompson, 2018). That Wojcicki both admits the extremist content is problematic and concedes that the machines found far more of it than human reviewers had suggests that YouTube does not have some secret alt-right agenda; rather, it is preoccupied and does not treat the problem as a top concern. During that same Wired interview, Wojcicki was asked about morality and responsibility, but she said the company is not sure where it stands on moral concerns, answering in the form of an analogy: “we don’t want it to be necessarily us saying, we don’t think people should be eating donuts, right. It’s not our place to be doing that” (Thompson, 2018). Online hate groups have unified on the internet, and even though they span many spaces such as Reddit, 8chan, 4chan, Twitter, and Discord channels, a large majority of their members claim that the content on YouTube, with its constant stream of recommendations, contributed to their recruitment.
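Wojcicki’s mention of building “a classifier to identify the content” describes a standard supervised text-classification setup: label known examples, extract features, train a model, and let it score the rest of the corpus at a scale no human team could match. As a rough sketch only (the training examples, labels, and review threshold below are invented for illustration; YouTube’s actual features, model, and data are proprietary), such a pipeline might look like this in Python with scikit-learn:

```python
# Minimal sketch of a supervised text classifier for flagging
# policy-violating content. Illustrative only: the examples, labels,
# and threshold are invented; YouTube's real system is proprietary.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled training examples (1 = violates policy, 0 = benign).
texts = [
    "calls for domination over a protected group",
    "denies that a well-documented violent event took place",
    "tutorial on baking sourdough bread",
    "highlights from last night's basketball game",
]
labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score unseen video metadata; items above the threshold would be
# routed to human moderators for review rather than removed outright.
candidates = ["a video denying a documented violent event"]
for text, score in zip(candidates, model.predict_proba(candidates)[:, 1]):
    verdict = "flag for human review" if score > 0.5 else "no action"
    print(f"{verdict} ({score:.2f}): {text}")
```

The point of the sketch is the scale effect Wojcicki describes: once trained, a model can score every upload, which is why “the machines found a lot more content than we had found.”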
An investigative journalist at Bellingcat, an organization of independent online investigators, tracked down and interviewed 75 white supremacist fascists to learn about each person’s “red-pilling,” a term for “converting someone to fascist, racist and anti-Semitic beliefs” (Evans, 2018). When spaces are created on the internet for hate groups, the concern has been raised that the groups will echo hateful messages back to each other, preventing outside influences from giving these individuals a more realistic perspective. The term “filter bubble” was coined by the internet activist Eli Pariser to describe a phenomenon in which search algorithms surround a user with their own viewpoints, sending their own search terms back at them in the form of results and ensuring that they rarely come into contact with an opposing source. In his book, The Filter Bubble: How the New Personalized Web is Changing What We Read and How We Think, Pariser points out one of the main problems with the filter bubble: “But the filter bubble isn’t tuned for a diversity of ideas or of people. It’s not designed to introduce us to new cultures. As a result, living inside it, we may miss some of the mental flexibility and openness that contact with difference creates” (Pariser, 2011). Unfortunately, one of the problems with internet subcultures is that they create an artificial space that guarantees the absence of diversity and conflict, even if the belief system is illogical or harmful. A dramatic contrast to the filter bubble may be the ideal library space, where each individual can encounter intelligent, well-worded perspectives opposed to their own. The Bellingcat journalist published findings similar to Pariser’s “filter bubble” theory, mentioning that “Infowars reached the height of its influence as a result of sites like Facebook and YouTube.
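To make Pariser’s feedback loop concrete, the toy model below shows how a ranker that allocates feed slots in proportion to past engagement narrows a feed after only a few clicks. Everything in it is hypothetical: the topic names, catalog, and weighting scheme are invented for demonstration and bear no resemblance to YouTube’s actual recommender.

```python
# Toy model of a filter-bubble feedback loop: feed slots are
# allocated in proportion to past engagement, so every click narrows
# future recommendations toward the same viewpoint. Purely
# illustrative; real recommender systems are far more complex.
from collections import Counter

# Hypothetical catalog: three topics, three videos each.
catalog = {
    "politics-a": ["a1", "a2", "a3"],
    "politics-b": ["b1", "b2", "b3"],
    "cooking":    ["c1", "c2", "c3"],
}

def recommend(engagement: Counter, k: int = 6) -> list[str]:
    """Fill k feed slots in proportion to past clicks per topic
    (+1 smoothing so unclicked topics still surface at first)."""
    weights = {topic: engagement[topic] + 1 for topic in catalog}
    total = sum(weights.values())
    feed = []
    for topic, w in sorted(weights.items(), key=lambda kv: -kv[1]):
        feed.extend(catalog[topic][: round(k * w / total)])
    return feed[:k]

engagement = Counter()
for step in range(5):
    feed = recommend(engagement)
    # Simulated user who only clicks one viewpoint; the ranker
    # responds by devoting ever more of the feed to that topic.
    clicks = [v for v in feed if v in catalog["politics-a"]]
    engagement["politics-a"] += len(clicks)
    print(f"step {step}: feed={feed}, "
          f"politics-a share={len(clicks) / len(feed):.0%}")
```

In this simulation the favored topic’s share of the feed grows from a third to a majority within two iterations, which is the narrowing dynamic Pariser describes: the user’s own clicks come back as an ever more homogeneous set of results.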
