YouTube’s algorithm recommends videos that violate its policies

YouTube’s algorithm recommends videos that violate the company’s own policies on inappropriate content, according to a crowdsourced study.

Not-for-profit organisation Mozilla asked users of its Firefox browser to install a browser extension called RegretsReporter, which tracked the YouTube videos they watched and asked them whether they regretted watching each one.

Between July 2020 and May 2021, 37,380 users flagged 3362 videos they saw as regrettable – a fraction of just 1 per cent of all those they watched. Reports were highest in Brazil, with about 22 videos from every 10,000 viewed being logged as regrettable.


Researchers then watched the reported videos and checked them against YouTube’s content guidelines. They found that 12.2 per cent of the reported videos either shouldn’t be on YouTube at all, or shouldn’t be recommended by its algorithm.

Around a fifth of the reported videos fell into what YouTube’s rules classify as misinformation, and a further 12 per cent spread covid-19 misinformation, say the researchers. Other issues flagged in the survey included violent or graphic content and hate speech.

“Some of our findings, if scaled up to the size of YouTube’s user base, would raise significant questions and become really concerning,” says Brandi Geurkink at Mozilla in Germany. “What we’ve found may be the tip of the iceberg.”


Most of the contentious videos were sent to users through YouTube’s algorithm, which recommends videos from channels that a user may not follow or hasn’t searched for. Seven in 10 of the regret reports were linked to recommended videos, and those recommended by YouTube were 40 per cent more likely to be regretted than videos users actively sought out, say the Mozilla researchers.

Non-English-language videos were 60 per cent more likely to be regretted, which the researchers believe may be because YouTube’s algorithms are trained mostly on English-language videos.

“This highlights the need to tailor moderation decisions on a per-country level, and make sure YouTube has expert moderators that really know what is going on in each country,” says Savvas Zannettou at the Max Planck Institute for Informatics in Germany.

Geurkink says YouTube’s lack of transparency over its algorithm is “unacceptable”, especially as years of research have raised concerns about its impact on society.

A YouTube spokesperson said: “The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone.”

The company added that it had made changes to its recommendation system over the past year that reduced consumption of “borderline content” to less than 1 per cent of all videos.
