YouTube still hosts radical and extremist videos
YouTube is overshadowed by Facebook and Twitter in the debate over the harms of social media, but the site has massive reach: 3 in 4 Americans report using it. That reach has been driven in part by YouTube's use of algorithms to recommend more videos to watch, a feature that critics warn can lead people down rabbit holes of conspiracy theories and racism.
In 2018, for example, the sociologist Zeynep Tufekci described how YouTube started suggesting she check out “white supremacist rants, Holocaust denials and other disturbing content” after she started watching videos of Donald Trump rallies in 2016, prompting her to warn about the site “potentially helping to radicalize billions of people.”
Google — YouTube’s parent company — has sought to address these concerns. In 2019, for instance, it announced new efforts to remove objectionable content and reduce recommendations to “borderline” content that raises concerns without violating site policies.
Has YouTube done enough to curb harmful material on the platform? In a new report published by the Anti-Defamation League, my co-authors and I find that alarming levels of exposure to potentially harmful content continue.
When we directly measured the browsing habits of a diverse national sample of 915 participants, we found that more than 9 percent viewed at least one YouTube video from a channel that has been identified as extremist or white supremacist; meanwhile, 22 percent viewed one or more videos from “alternative” channels, defined as non-extremist channels that serve as possible gateways to fringe ideas.
My academic co-authors Annie Y. Chen, Jason Reifler, Ronald E. Robertson, Christo Wilson and I collected this data from April to October 2020 via a browser extension that respondents installed voluntarily to help us track their YouTube habits.
The study revealed that Google’s takedowns have not addressed many channels of potential concern. By combining lists compiled by academic researchers and subject matter experts at groups like the ADL and the Southern Poverty Law Center, we identified 290 channels that could be categorized as extremist or white supremacist and 322 in the “alternative” category. Of these, 515 channels were still active as of January. We found that participants visited 265 of them during the study, including more than 50 of the extremist channels.
Viewership of these channels was highly concentrated. Among the roughly 1 in 10 participants who watched at least one video from an extremist channel, many watched considerably more: the mean number of such videos they viewed was 11.5. In total, 6 percent of participants were responsible for 80 percent of all viewing of videos from these channels. Similarly, the 22 percent of participants who watched at least one video from an alternative channel watched an average of 64.2 such videos.
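To make concrete how a concentration figure like this can be computed, here is a minimal illustrative sketch; it is not the study's actual analysis code, and the per-participant view counts are invented for the example.

```python
# Illustrative sketch only -- not the study's analysis code.
# Hypothetical per-participant counts of views of extremist-channel videos.
views = [0, 0, 3, 0, 41, 0, 12, 0, 0, 1]  # made-up example data

# Mean views among participants who watched at least one such video.
watchers = [v for v in views if v > 0]
mean_among_watchers = sum(watchers) / len(watchers)

# Smallest share of participants accounting for 80% of all views.
total = sum(views)
running, heavy_viewers = 0, 0
for v in sorted(views, reverse=True):
    if running >= 0.8 * total:
        break
    running += v
    heavy_viewers += 1

print(f"Mean views among watchers: {mean_among_watchers:.1f}")
print(f"Share of participants covering 80% of views: {heavy_viewers / len(views):.0%}")
```

With the made-up numbers above, two of the ten hypothetical participants account for 80 percent of all views, the same kind of skew the study reports.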
Rates of viewership were highest among a particular subset of the population: people who express high levels of racial resentment. We measured racial resentment using a standard survey scale often used by political scientists, which respondents had completed earlier; people who score high on it tend to agree, for instance, with the statement that "if blacks would only try harder they could be just as well off as whites."
Participants with levels of racial resentment corresponding to the top third of the distribution watched 5.8 videos from extremist channels on average; the corresponding values for the middle and bottom thirds of the sample in racial resentment levels were 0.38 and 0.03, respectively. Overall, more than 90 percent of views of videos from alternative and extremist channels were from people with high levels of racial resentment. (We took considerable steps to protect the privacy of people in our study, preventing researchers from seeing information that could identify them, for example.)
To what extent does YouTube's recommendation algorithm make this problem worse? Encouragingly, we find that recommendations to videos from alternative and extremist channels are exceptionally rare when people are watching videos from other channels: They made up fewer than 2 percent of the suggestions our participants saw in those cases (and most of those were to alternative channels).
However, YouTube does frequently recommend other videos from alternative and extremist channels when people are already watching content of either type: Thirty-eight percent of the recommendations shown on videos from alternative channels were to other videos from alternative channels, for instance. And when people were watching videos from extremist channels, 29 percent of recommendations were to other extremist content and 14 percent were to videos from alternative channels.
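The tabulation behind these percentages can be sketched simply. The snippet below is hypothetical, not the study's pipeline: it assumes logged (watched category, recommended category) pairs, with invented category labels and data, and counts what share of recommendations on each kind of video point to each category.

```python
# Illustrative sketch only -- not the study's pipeline.
# Hypothetical (watched_category, recommended_category) pairs from browsing logs.
from collections import Counter

pairs = [
    ("extremist", "extremist"), ("extremist", "alternative"),
    ("alternative", "alternative"), ("alternative", "mainstream"),
    ("mainstream", "mainstream"), ("mainstream", "alternative"),
]  # made-up example data

counts = Counter(pairs)
for source in ("mainstream", "alternative", "extremist"):
    total = sum(n for (s, _), n in counts.items() if s == source)
    for (s, target), n in sorted(counts.items()):
        if s == source:
            print(f"{source} -> {target}: {n / total:.0%} of recommendations")
```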
In other words, YouTube does not appear to be sending innocent cat video watchers down a white-supremacist rabbit hole. However, our findings suggest that its design can reinforce exposure patterns among people who are interested in or susceptible to harmful content.
Concerns are growing in Washington about YouTube’s influence on extremism. When our report was released last month, Rep. Anna G. Eshoo (D-Calif.), who represents the district where Google is based, issued a statement calling the findings “damning” and arguing for legislation to regulate the “dangerous algorithms” used by the tech platforms.
Last week, the chair of the House Committee on Energy and Commerce, Frank Pallone Jr. (D-N.J.), went further, sending a public letter to the chief executive of Google, Sundar Pichai, requesting internal data on viewership of videos that the company eventually removed or classified as borderline.
YouTube is also getting fresh attention for its role in disseminating political misinformation. Last week, the final report of the Election Integrity Partnership — a coalition of academic and private sector researchers dedicated to tracking misinformation — highlighted YouTube’s role as “a space for video-format misinformation” about the 2020 election.
The group noted that YouTube content “could be shared easily across platforms,” including on Twitter, where the partnership identified more than 14,000 original tweets linking to YouTube misinformation; these were then retweeted almost 270,000 times.
Sadly, there is all too much demand for harmful online content. YouTube must decide whether it wants to be a platform that offers the supply to meet that demand.
Source: Washington Post