YouTube Radicalization

Reading reflections on:

  1. The Making of a YouTube Radical in The New York Times. Note: this article mentions rape and violence, and contains offensive racial and other slurs. If you are uncomfortable, you may skip it.
  2. Optional, but recommended: Recommending What Video to Watch Next: A Multitask Ranking System, by many authors at Google. [Google owns YouTube]

Suggestion: In your reflection, you could discuss whether YouTube’s new system would have made a difference to the protagonist of the New York Times article.


  1. Tl;dr for the article: Caleb Cain’s journey through left and far-right politics, tracked against changes in YouTube’s algorithms.
    The article got me thinking about two main things: the wired-in characteristics of human beings that make them vulnerable to YouTube’s radical-politics rabbit hole, and the fact that it’s never solely about whether the creators make sense.

    We’re social animals
    Caleb Cain was pulled into a far-right universe of videos filled with conspiracy theories, misogyny, and racism. He was struggling to find his way in the world (broke, depressed, a college dropout), and he was vulnerable. So when certain people’s arguments embraced his insecurities and issues, he took to them for the sense of belonging they offered. As a liberal himself, being exposed to satirical commentaries that claimed to expose the truths about “social justice warriors” was almost like a lift that made him feel more knowledgeable, as if he had somehow become better through an enlightenment his peers didn’t have. For a person suffering from social isolation and a deprivation of purpose like Cain, this feeling of empowerment certainly compelled him to delve deeper into the radical YouTube rabbit hole.

    Cain’s story reminded me of a TED Talk in which the speaker tells the story of how he became a neo-Nazi, and how he got out.
    He made a very compelling point about why people get radicalized: humans are social creatures, naturally wired to seek a sense of belonging and purpose; without that, they feel insecure and marginalized. They tend to distance themselves from what they don’t understand, gradually come to fear it, and hatred is born when they remain isolated from it.
    What radicalizes people is not the persuasiveness of the ideology or dogma itself, but their desire to belong.
    This aligns well with the circumstances Cain was in when he started to be radicalized.

    Gaming the system matters more, and its implications
    A recurring point in the article is that radical-right content creators are almost always well positioned to capitalize on YouTube’s algorithm, regardless of the tweaks made to it. The 2012 algorithm change, for example, gave more weight to watch time than to views, to encourage creators to make higher-quality content rather than overusing exaggerated titles or salacious thumbnails. Even though it may have been intended to limit radical content that was popular solely for its provocative features, far-right creators had already made long video essays on inflammatory topics, which inherently kept viewers engaged longer. The new “Reinforcement” AI, which was supposed to recommend new ideas that would expand viewers’ tastes over time instead of exposing them to the same kind of content, also fell to the radical right’s benefit, as they experimented with various cross-genre content, like delivering political dogma through analysis of pop culture.
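The shift from view-count to watch-time ranking described above can be illustrated with a toy scorer. To be clear, the field names and the scoring rule here are invented for illustration; YouTube’s real ranker is a large multitask neural model, not a formula like this.

```python
# Toy illustration of why watch-time ranking favors long video essays
# over clickbait. All fields and weights are hypothetical.

def rank_videos(videos, by="watch_time"):
    """Return video ids sorted (best first) by the chosen engagement signal."""
    key = {
        "views": lambda v: v["views"],
        # Total minutes watched: views times average minutes per view.
        "watch_time": lambda v: v["views"] * v["avg_minutes_watched"],
    }[by]
    return [v["id"] for v in sorted(videos, key=key, reverse=True)]

videos = [
    {"id": "clickbait_short", "views": 900, "avg_minutes_watched": 0.5},
    {"id": "hourlong_essay",  "views": 300, "avg_minutes_watched": 40.0},
]

print(rank_videos(videos, by="views"))       # clickbait wins on raw views
print(rank_videos(videos, by="watch_time"))  # the long essay wins on total watch time
```

The point of the sketch is the article’s one: a metric change meant to reward quality also rewards anyone who makes long, engrossing videos, including creators of inflammatory video essays.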

    This algorithm-content dynamic implies a lifestyle-political-profile dynamic, and knowing how to work that dynamic is perhaps much more crucial to a message being heard than what the message itself is. Cain’s transition to left-wing content was largely due to his realization that liberal creators could also speak “the native language of YouTube”, which is essentially eye-rolls and satire aimed at the radical right to make it significantly less persuasive. Their appeal also chiefly relies on their relevance to young audiences; one route Cain mentioned was “edgy humour, edgy memes”.
    This, interestingly, reflects the intricate link between lifestyle and political profile. The algorithm-hijacking that the left wing has managed is mostly based on changing its approach to match young people’s taste for edgy humour and memes. If people like Cain didn’t view creators like Natalie Wynn as relatable, seeing her instead as just a typical white blonde feminist, or Oliver Thorn as just another elitist, carefree about job loss and similar fears of white men, they wouldn’t bother listening to their arguments at all. This also explains why men are much more likely to fall into the hands of white extremism: the content creators are themselves white, so whatever they say likely derives from their own experience and is consequently seen as trustworthy.

    It is perhaps best to end with a saying from Cain:
    “YouTube is the place to put out a message. But I’ve learned now that you can’t go to YouTube and think that you’re getting some kind of education, because you’re not.”
    True: the political culture that people still believe they are active, free-willed participants in is fundamentally shaped by the algorithms of internet platforms like YouTube. That belief is naive, because the things you see daily in your feed are not reflective of your ideology and desires, but rather the result of algorithm tweaks and a bunch of content creators trying to hijack those tweaks.

  2. YouTube’s new algorithm, introduced in 2012, ranked videos and recommendations by watch time instead of clicks. This algorithm was complemented by a new AI system called ‘Reinforcement’, which promised to recommend videos that would broaden users’ content preferences and introduce them to new topics. However, we cannot deny the influence that the parent company, Google, has on its product YouTube. Google has a history of holding and supporting liberal views, and that projection is evident in its products too, with YouTube recommending LGBTQ content to its users. The new AI algorithm is thus somewhat influenced by the left-wing leanings of YouTube HQ.

    Mr. Cain, who saw himself as vulnerable and prone to antisocial behavior, resorted to YouTube to find a way out of his life problems. According to him, he was radicalized by YouTube’s algorithm into watching more right-wing content. Right-wing supporters would tweak the algorithm, or use tricks such as clickbaiting, to gain more views on their videos. But whatever power this right-wing group might have, the power and authority held by YouTube has a much larger impact. YouTube’s new algorithm, supplemented by the Reinforcement AI, can filter out right-wing content and decline to recommend it. Conversely, YouTube can choose to recommend more left-wing content, in line with the company leanings mentioned above. This could have an impact on Mr. Cain. He declined to reduce his watch time on YouTube; instead, he said that “he needed to think critically about the videos he watched”. Whichever way Mr. Cain chooses to change his YouTube habits, he will always be recommended videos that don’t reflect his own tastes and views.

    The videos Mr. Cain sees on his feed will not be an indication of his own views and ideological beliefs, but a reflection of YouTube’s algorithm and recommendation systems. Whether the recommended content promotes right-wing or left-wing behavior, YouTube has a hand in how that content reaches him.

  3. The protagonist of the New York Times article seems to be trapped in a circle of far-right extremism because of the YouTube algorithm. YouTube’s personalized recommendation algorithm serves users high-exposure videos, and provocative videos are very likely to be rewarded with high view counts and advertising dollars. Still, I believe YouTube’s new algorithm has the ability to curb the spread and proliferation of extreme information. First, some topics can be explicitly banned, and YouTube can weed out most videos with extreme effects by reviewing uploads. Second, improvements to the recommendation algorithm could prevent users from falling into a “local minimum”, i.e., a closed circle of recommendations. The new algorithm can divert users and promptly recommend videos and presentations from certified media as extreme-topic videos proliferate. Overall, incidents similar to Caleb’s would be much less likely. However, the definitions of “right” content and certified media will depend on YouTube’s own values; it might spread “wrong” information some day.
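The “local minimum” idea in this reflection, i.e. breaking a closed recommendation loop by mixing content from outside a user’s current cluster into each slate, could be sketched roughly as follows. The topic labels, the fixed exploration fraction, and the function itself are invented for illustration; real systems learn this trade-off rather than hard-coding it.

```python
import random

# Hypothetical sketch: reserve a fraction of each recommendation slate
# for videos outside the user's dominant topic, so the feed cannot
# collapse into a closed circle of one kind of content.

def diversified_slate(candidates, user_topic, slate_size=5,
                      explore_frac=0.4, seed=0):
    """Return a slate mixing in-topic videos with out-of-topic ones."""
    rng = random.Random(seed)
    same = [c for c in candidates if c["topic"] == user_topic]
    other = [c for c in candidates if c["topic"] != user_topic]
    n_explore = int(slate_size * explore_frac)  # slots reserved for exploration
    slate = same[:slate_size - n_explore] + \
        rng.sample(other, min(n_explore, len(other)))
    return [c["id"] for c in slate]

candidates = (
    [{"id": f"politics_{i}", "topic": "politics"} for i in range(10)]
    + [{"id": f"science_{i}", "topic": "science"} for i in range(10)]
)
slate = diversified_slate(candidates, user_topic="politics")
print(slate)  # 3 in-topic videos plus 2 drawn from other topics
```

The same reserved slots could be filled with “certified media” sources as the reflection suggests; the sketch only shows the mechanical part, not who decides which sources qualify.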

  4. Caleb Cain is a clear example of how social media sites, and recommendation systems in particular, can drastically shift a person’s views and personality over a relatively short period of time. While reading the article I was initially shocked that someone could believe what some far-right activists were saying on topics such as feminism, racism, and homophobia. To me, these ideas seemed obviously wrong, and I felt uncomfortable just reading the titles. I couldn’t understand why someone would want to watch these videos, especially if they previously held a more liberal view.

    Further on in the article, I began to understand why someone would be drawn in and hooked by this kind of world. These videos were created to be over the top, with exaggerated titles and thumbnails. As Cain mentions in the article, seeing someone talk about things we normally wouldn’t feel comfortable discussing for moral and social reasons (like claiming the Holocaust was fake, or sharing anti-Semitic views) produces a feeling similar to thrill. It seems dangerous or bold to talk about these things openly, which might attract the attention and admiration of young viewers, especially those without a solid support system. Watching these videos makes it feel like you’re part of an exclusive club that only members smart or privileged enough can join. We are social beings, and when we’re unable to make strong social connections in the real world, many of us shift to the online environment to make friends and build relationships with people we believe to be like-minded.

    Once you get hooked and feel like you’re part of something, you want to do whatever you can to show others you belong. This includes backing them up when they express their opinions, sharing what they do, and criticizing what their counterparts do. At a young age, a person’s belief system isn’t fully developed yet, which makes them more susceptible to external manipulation.

    I’m not sure how effective YouTube’s new system would be in preventing cases like Cain’s, since its main objective hasn’t changed. YouTube wants people to watch more videos for longer, and far-right speakers take advantage of this by creating sensationalized content that manipulates young, susceptible minds. While YouTube claims to understand its responsibility for curbing the spread of misinformation, I don’t think it cares enough to make significant changes to limit bias in its recommendations.
