YouTube’s dislike button does not have a direct effect on users’ algorithmic recommendations, according to new data.
Mozilla announced Tuesday that it had conducted a quantitative and qualitative analysis of how users' interactions with aspects of YouTube's interface affected its algorithmic recommendations and found that the "dislike" button had minimal effect.
The company found that users believed the platform's user controls did not affect what they saw and that the website failed to prevent "unwanted" recommendations, according to an "experimental audit" of the platform involving more than 22,000 people using RegretsReporter, Mozilla's YouTube algorithmic research tool.
“Nothing changed,” one user said. “Sometimes, I would report things as misleading and spam, and the next day, it was back in. It almost feels like the more negative feedback I provide to their suggestions, the higher bulls*** mountain gets. Even when you block certain sources, they eventually return.”
Mozilla tested this by having those users report their results via RegretsReporter and provide insight into their experiences with YouTube's algorithm. It then ran a randomized controlled experiment asking users to exercise specific YouTube user controls directly, including the "Dislike," "Don't recommend channel," "Not interested," and "Remove from watch history" buttons.
The data showed that none of the options effectively stopped users from seeing content they did not want to see. The most effective option, "Don't recommend channel," prevented only 43% of "bad recommendations," defined as recommended content resembling videos a user had previously rejected or declined to view. Removing content from a user's watch history was only 29% effective.
CLICK HERE TO READ MORE FROM THE WASHINGTON EXAMINER
Mozilla then suggested that YouTube update its algorithms and controls so users could clearly shape what was recommended to them. "YouTube's user controls should be easy to understand and access. People should be provided with clear information about the steps they can take to influence their recommendations, and should be empowered to use those tools," the company said in its recommendations.
“Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” Elena Hernandez, a spokeswoman for YouTube, told the New York Times. “Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights.”