YouTube’s “not interested” and “dislike” buttons barely work, according to a new study by Mozilla.
When users use the buttons to tell the video platform they don’t want to watch certain videos, they are often fed similar recommendations regardless, the research found. Other options, such as “stop recommending channel” and “remove from watch history”, were similarly ineffective.
Mozilla used video recommendation data from more than 200,000 YouTube users and found that more than half of recommendations were related or similar to the types of videos users had requested not to see.
YouTube is the second most visited website in the world and, like most platforms, uses an algorithm to drive video views. Previous research from Mozilla showed that users are regularly recommended videos they don’t want to see, including hate speech, political misinformation and violent content.
YouTube claims its recommendation system takes factors such as “user satisfaction” and “time well spent” into account, and doesn’t focus solely on watch time. “However, evidence suggests that the platform continues to prioritize engagement over people’s well-being,” the Mozilla report reads.
Last week YouTube said it would expand its policies on violent extremism to target content that glorifies violence even if the videos are not related to a terrorist organization. In some cases, the company was found not to have applied those rules to videos promoting militia groups involved in the storming of the U.S. Capitol. In May, a report by the Tech Transparency Project found 435 pro-militia videos on YouTube, some of which gave training advice such as how to carry out a guerrilla-style ambush.
The video platform has also come under fire for the misogyny faced by female creators on the site, which has reportedly increased since the Johnny Depp v Amber Heard defamation trial. During the trial, many creators posted misogynistic anti-Heard videos that brought them huge numbers of followers.