The “Dislike” Button on YouTube Is Not What You Think It Is
Users try to steer the video platform’s algorithm by downvoting content, but Mozilla researchers say those signals do far less than viewers expect.
YouTube creators often implore their viewers to “smash the Like button,” believing that their success on the algorithm-driven site depends on the response. But according to a recent study from the Mozilla Foundation, viewers who click the “Dislike” button on videos to hide content they don’t want to watch are largely wasting their time.
The study drew on data from 22,722 users who installed Mozilla’s RegretsReporter browser extension and were monitored between December 2021 and June 2022. Researchers examined more than 500 million YouTube recommendations generated after viewers clicked one of YouTube’s disapproval tools, such as the “Dislike” or “Don’t Recommend Channel” buttons. Becca Ricks, a senior researcher at Mozilla, questions how much using these tools actually changes which videos are recommended to users, pointing to YouTube’s own help page on managing recommendations and search results.
How much these inputs reduced future recommendations of similar content varied by button. According to Mozilla, the “Don’t Recommend Channel” option prevented only 43 percent of unwanted video recommendations, while the “Dislike” button prevented just 12 percent. “What we discovered is that YouTube’s control methods don’t actually seem to be sufficient for preventing unsolicited recommendations,” says Ricks.
Mozilla’s research was prompted by rising public criticism of YouTube’s recommendation system in recent years. Rather than watch time, “they’ve been talking a lot about measures like time well spent or user pleasure,” says Ricks. “We were really interested in how much some of those signals were being picked up by the algorithm, especially since in the previous YouTube report we worked on, people had expressed that they didn’t feel in control or that taking actions on unwanted videos really translated well to the recommender system.”
For instance, one participant in the Mozilla study reacted negatively to a Tucker Carlson video that Fox News published on February 13. A month later, another episode of Carlson’s TV show from Fox News’ official YouTube channel was recommended to them. Later in February, a separate viewer voiced their disapproval of a video showing webcams pointed at Ukraine’s conflict zones. A month later, they were recommended another video, this time from the WarShock YouTube channel, describing how dead Russian soldiers are removed from Ukraine. Ricks says she has no issue with the videos’ content, which does not violate YouTube’s policies. “But it’s sort of shocking that it still gets recommended if you as a user indicate you don’t want to see it,” she says.
Guillaume Chaslot, a former YouTube employee and the creator of AlgoTransparency, a website that examines the YouTube algorithm, says, “I’m not particularly surprised.” He continues, “In the broader picture, I feel like you should be able to choose and tell the algorithm what you want, and YouTube absolutely doesn’t let you do that.”
YouTube maintains that its systems are working as intended. Elena Hernandez, a spokesperson for YouTube, said that users have control over their recommendations, and that Mozilla’s analysis “doesn’t take into consideration how our algorithms actually work, making it impossible for us to draw many insights.” These controls, she says, let users “stop a video or channel from being recommended to them in the future.”
Where Mozilla and YouTube appear to differ is in how they treat similarity of themes, people, or content when judging whether a “don’t recommend” input succeeded. According to YouTube, telling its algorithm not to promote a particular video or channel only prevents the algorithm from recommending that video or channel again; it has no bearing on a user’s access to a particular subject, viewpoint, or speaker. “Our controls do not block out entire topics or points of view because doing so can have detrimental impacts on viewers, such as forming echo chambers,” Hernandez says.
That isn’t entirely clear from YouTube’s public statements and published research about its recommendation systems, says Jesse McCrosky, a data scientist who worked with Mozilla on the project. He says there are a few “small glimpses into the black box” showing that YouTube primarily weighs two kinds of feedback: on the positive side, engagement, such as how much time users spend on YouTube and how many videos they watch; and explicit feedback, including dislikes. “They have some balance in how much they’re valuing those two sorts of feedback,” McCrosky says. According to the report, engagement is given a lot of weight, while other forms of feedback count for very little.
Robyn Caplan, a senior researcher at Data & Society, a New York nonprofit that has previously studied YouTube’s algorithms, sees a gap between what YouTube believes it is telling users about its algorithms and what Mozilla found. Some of these findings, she says, “don’t contradict what the platform is stating, but they show that consumers don’t really grasp what tools are there so they can manage their experiences versus what features are there to give content creators feedback.” Caplan applauds the study and its conclusions: even if Mozilla’s finding is less of a slam dunk than the researchers may have hoped, it still sheds light on a significant issue, namely that users are unsure how much they can influence YouTube’s recommendations. “This research does point to the greater necessity to survey consumers on site features on a frequent basis,” Caplan says. If these feedback mechanisms aren’t working properly, people may simply stop using them.
Confusion over what these inputs are meant to do was the focus of the second part of Mozilla’s study: a follow-up qualitative survey of roughly one in ten of the people who installed the RegretsReporter extension and took part in the research. Those Mozilla spoke with said they appreciated inputs that targeted channels and videos directly, but they expected them to have a broader effect on YouTube’s recommendation algorithm.
“I felt that was an interesting theme because it shows that people are actually saying, ‘This is not just me telling you I banned this channel. With this, I’m attempting to have more influence over the recommendations I’ll receive in the future,’” says Ricks. Mozilla’s research suggests YouTube should give viewers more options to actively shape their own experiences by stating their content preferences, and that the company could do a better job of explaining how its recommendation systems work.
For McCrosky, the main problem is the gap between what users think YouTube’s algorithmic inputs do and what they actually do. There is a discrepancy, he says, in how much those signals are actually adhered to.