YouTube Algorithm and the Alt-Right Filter Bubble

The intent of the engineers was to maximize exposure to advertisements on YouTube, which is problematic but not as malicious as the outcome the algorithm actually produces. Google's engineers measure their success by "watch-time" or "click-through rate," which tells us that their goal is to keep the user watching more videos, creating more opportunities for users to click on videos or advertisements (Covington, Adams, & Sargin, 2016). Since YouTube's main source of revenue is its advertisers, the most obvious goal is to encourage users to click on the advertisements.

There are two sides to the YouTube algorithm. One facet represents the design and intent of its engineers. The other facet is an unknown factor: because the algorithm is a learning neural network, it has created some connections on its own through machine learning. Machine learning is distinct from a programmed response in that the computer improves a behavior on its own, often using pattern recognition to find a better or faster way to attain the original directive set by a human engineer. There are educated guesses, supported by data, that this second facet of the YouTube algorithm may be operating in some ways its creators never intended.

In one independent test, journalists ran "each query in a fresh search session with no personal account or watch history data informing the algorithm, except for geographical location and time of day, effectively demonstrating how YouTube's recommendation operates in the absence of personalization" (O'Donovan, Warzel, McDonald, Clifton, & Woolf, 2019). After more than 140 such tests, the observers concluded, "YouTube's recommendation engine algorithm isn't a partisan monster — it's an engagement monster. [...] Its only governing ideology is to wrangle content — no matter how tenuously linked to your watch history — that it thinks might keep you glued to your screen for another few seconds" (O'Donovan et al., 2019). Indeed, it is a popular theory that YouTube's algorithm is not trying to teach or convince the user of a certain truth, but simply wants to convince them to continue watching videos on the site.

Zeynep Tufekci, a Turkish techno-sociologist, has studied YouTube since having an unexpected experience with it while researching the 2016 presidential election campaign. While doing research for an article, she watched several "Donald Trump rallies on YouTube," which led the recommendation engine to autoplay "white supremacist rants, Holocaust denials and other disturbing content" (Tufekci, 2018). Even though racist content was the trigger that caused Tufekci to dig deeper, she concluded that the algorithm's overall intent was not racist in nature: "For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes" (Tufekci, 2018). YouTube is a for-profit company driven by online advertising, and the goal of the algorithm is to keep users on the service as long as possible, maximizing the chance that a user will click on an ad and thereby generate revenue for the website. YouTube's recommendation system makes these complex, goal-based decisions using a set of interconnected computer programs loosely modeled on the human brain, often called a neural network.
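To make the engagement objective described above concrete, the sketch below ranks candidate videos purely by the additional watch time a model expects each one to produce (estimated click probability times estimated watch time if clicked). This is a minimal illustration of that scoring idea, not YouTube's actual system; the candidate pool, field names, and numbers are all hypothetical.

```python
# A minimal, hypothetical sketch of an engagement-driven ranker: candidates are
# ordered not by accuracy or truthfulness but by how much additional viewing
# time the model expects each video to produce. All names and numbers here are
# illustrative; this is not YouTube's actual code.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    p_click: float               # estimated probability the user clicks
    expected_watch_secs: float   # estimated watch time if clicked

def rank_by_engagement(candidates: list[Candidate], top_n: int = 10) -> list[Candidate]:
    """Order candidates by expected watch time contributed:
    P(click) * E[watch time | click]."""
    return sorted(candidates,
                  key=lambda c: c.p_click * c.expected_watch_secs,
                  reverse=True)[:top_n]

pool = [
    Candidate("calm-explainer", p_click=0.10, expected_watch_secs=300),
    Candidate("outrage-clip",   p_click=0.25, expected_watch_secs=420),
    Candidate("news-summary",   p_click=0.15, expected_watch_secs=180),
]
for c in rank_by_engagement(pool):
    print(c.video_id, round(c.p_click * c.expected_watch_secs, 1))
```

In this toy pool, "outrage-clip" ranks first not because of anything about its subject matter but only because it maximizes expected watch seconds; that indifference to content is exactly the "engagement monster" dynamic O'Donovan and colleagues describe.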
YouTube uses a neural-network learning algorithm to decide which content to keep serving users, and this algorithm may have found an unexpected relationship between racism and just the right amount of curiosity to prompt a person to continue watching YouTube videos. Two academic researchers built a visualized data map of 13,529 YouTube channels, starting with the most popular channels from opposite political perspectives and recreating the YouTube algorithm in an attempt to figure out what was happening when it recommended increasingly extreme political content (Kaiser & Rauchfleisch, 2018). They found tightly bound relationships among right-leaning content on YouTube, specifically "Fox News," "Alex Jones," and "white nationalists," along with conspiracy-theory, anti-feminist, and anti-political-correctness channels and a cluster of channels known as the "manosphere" (Kaiser & Rauchfleisch, 2018). These connections were more tightly knit and closer together on the right-leaning side than on the left-leaning one.

The authors found this "highly problematic" because a user who pursues even mildly conservative content is "only one or two clicks away from extreme far-right channels, conspiracy theories, and radicalizing content" (Kaiser & Rauchfleisch, 2018). The visualization also included a multitude of non-political channels, such as video games, guns, music, and tech, which are individually popular on their own but not interconnected the way the right-wing communities within YouTube are (Kaiser & Rauchfleisch, 2018). Because the algorithm has made this unlikely connection, it is biased toward recommending racist or white supremacist videos more often to users. The surprising outcome of this machine learning is
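The "one or two clicks away" finding is, at bottom, a statement about shortest paths in a recommendation graph: channels are nodes, and "recommended alongside" relationships are edges. The sketch below shows how such a click distance can be measured with breadth-first search; the graph, channel names, and edges are invented for illustration and are not Kaiser and Rauchfleisch's data.

```python
# A toy stand-in for a channel recommendation graph. Each key is a channel and
# each value lists channels the algorithm recommends from it. The names and
# edges are hypothetical, chosen only to illustrate the distance measurement.
from collections import deque

recommendation_graph: dict[str, list[str]] = {
    "mainstream_conservative": ["commentary_channel", "cable_news_clips"],
    "cable_news_clips": ["mainstream_conservative", "commentary_channel"],
    "commentary_channel": ["mainstream_conservative", "conspiracy_channel"],
    "conspiracy_channel": ["commentary_channel", "far_right_channel"],
    "far_right_channel": ["conspiracy_channel"],
}

def clicks_between(graph: dict[str, list[str]], start: str, goal: str) -> int:
    """Return the minimum number of recommendation clicks from start to goal,
    found by breadth-first search, or -1 if the goal is unreachable."""
    if start == goal:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor == goal:
                return dist + 1
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return -1

print(clicks_between(recommendation_graph,
                     "mainstream_conservative", "far_right_channel"))
# -> 3 in this invented graph; Kaiser and Rauchfleisch report distances of
#    only one or two clicks in the real recommendation network.
```

The tighter the clustering the researchers observed on the right-leaning side, the more such edges exist between mainstream and extreme channels, and the shorter these click paths become.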
