YouTube’s secret life as an engine for right-wing radicalization

Columbia Journalism Review/September 19, 2018

By Mathew Ingram

For many casual YouTube users, the Google-owned video service is a harmless way to waste time, listen to music, or maybe even learn how to install a new appliance. But if you dig below the surface, as the non-profit research institute Data & Society does in a new report, you quickly start to see odd or even disturbing links to right-wing pundits and conspiracy theories. This is YouTube’s alter ego, what sociologist Zeynep Tufekci has called “one of the most powerful radicalizing instruments of the 21st century.” And it’s not a coincidence, the report says—it’s a deliberate attempt to radicalize users by pulling them into a vortex of reactionary content.

In the Data & Society analysis, “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” researcher Rebecca Lewis looks at 65 political influencers across 81 YouTube channels and identifies what she calls an Alternative Influence Network, or AIN. The AIN uses the same techniques that brands and other social-media influencers use to build followings and garner traffic, but deploys them to sell users on a specific right-wing ideology. The media pundits and internet celebrities in the network, who include Canadian professor Jordan Peterson and white supremacist Richard Spencer, “use YouTube to promote a range of political positions, from mainstream versions of libertarianism and conservatism, all the way to overt white nationalism,” Lewis writes in the report.

Just as Instagram users might market a new brand of alcohol by posting photos and videos of themselves and tagging others to extend their reach, social networking among right-wing influencers on YouTube “makes it easy for audience members to be incrementally exposed to, and come to trust, ever more extremist political positions,” Lewis writes. And Google, of course, happily monetizes all of that engagement and traffic with ads.

It’s not just that Google is taking advantage of the traffic generated by these networks. As I wrote for CJR earlier this year, the problem is exacerbated by Google’s recommendation engine, an algorithm that suggests new videos for users to watch after they have finished with the one they clicked on or searched for. For many younger users, this is the new TV—watching video after video on YouTube. And the site’s algorithm is often gamed by right-wing trolls to get their hoaxes or fake news high up in the recommended list, an example of what the Oxford Internet Institute has called “computational propaganda.”

Google has said it is concerned about misinformation on YouTube (especially after conspiracy theories were among the top recommendations following the school shooting in Parkland, Florida, in February) and that it is trying to implement a number of features that will reduce the likelihood that users will see fake news in the recommended list. But what Lewis describes in her Data & Society report is even harder to root out—a coordinated attempt to expose viewers to right-wing ideologies, not necessarily through the use of conspiracy theories or fakes, but through the kind of brand-building that YouTube and other social tools excel at.
