YouTube says its systems are working as they’re meant to. “Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights,” says YouTube spokesperson Elena Hernandez, who added that viewers are given control over their recommendations. This includes “the ability to block a video or channel from being recommended to them in the future.”
Where Mozilla and YouTube differ in judging how successful the “don’t recommend” inputs are is on whether they should extend to similar topics, individuals, or content. YouTube says that asking its algorithm not to recommend a video or a channel simply stops the algorithm from recommending that particular video or channel—and does not affect a user’s access to a specific topic, opinion, or speaker. “Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” says Hernandez.
Jesse McCrosky, a data scientist working with Mozilla on the study, says that isn’t entirely clear from YouTube’s public statements and published research about its recommender systems. “We have some small glimpses into the black box,” he says, which show that YouTube broadly considers two types of feedback: on the positive side, engagement, such as how long users watch YouTube and how many videos they watch; and explicit feedback, including dislikes. “They have some balance, the degree to which they’re respecting those two types of feedback,” says McCrosky. “What we’ve seen in this study is that the weight toward engagement is quite exhaustive, and other sorts of feedback are quite minimally respected.”
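To make the imbalance McCrosky describes concrete, here is a purely illustrative sketch of a ranking score that blends engagement signals with explicit feedback. The feature names and weights are invented for illustration and do not reflect YouTube’s actual recommender; the point is only that when the weight on engagement is close to 1, a dislike barely moves the score.

```python
# Illustrative only: a toy recommendation score mixing engagement signals
# (watch time, expected clicks) with explicit feedback (a dislike on
# similar content). All names and weights are hypothetical.

def toy_recommendation_score(
    predicted_watch_seconds: float,   # engagement: expected watch time
    predicted_clicks: float,          # engagement: expected videos watched
    disliked_similar: bool,           # explicit feedback: user disliked similar content
    engagement_weight: float = 0.95,  # hypothetical: heavy tilt toward engagement
) -> float:
    engagement = 0.8 * predicted_watch_seconds + 0.2 * predicted_clicks
    explicit_penalty = engagement if disliked_similar else 0.0
    # With engagement_weight near 1.0, the dislike term barely changes the
    # score -- the kind of imbalance Mozilla's study argues it observed.
    return engagement_weight * engagement - (1 - engagement_weight) * explicit_penalty


if __name__ == "__main__":
    base = toy_recommendation_score(120.0, 3.0, disliked_similar=False)
    after = toy_recommendation_score(120.0, 3.0, disliked_similar=True)
    print(f"score without dislike: {base:.1f}")
    print(f"score after dislike:   {after:.1f}  (changes only slightly)")
```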
The distinction between what YouTube says its algorithms do and what Mozilla says users expect them to do is important, says Robyn Caplan, senior researcher at Data & Society, a New York nonprofit that has previously investigated YouTube’s algorithm. “Some of these findings don’t contradict what the platform is saying, but demonstrate that users do not have a good understanding of what features are there so they can control their experiences, versus what features are there to give feedback to content creators,” she says. Caplan welcomes the study and its findings, saying that while they may be less of a slam dunk than the researchers had hoped, they nevertheless highlight an important problem: Users are confused about the control they have over their YouTube recommendations. “This research does speak to the broader need to survey users regularly on features of the site,” Caplan says. “If these feedback mechanisms aren’t working as intended, it may drive folks off.”
Confusion over the intended functionality of user inputs is a key theme of the second part of Mozilla’s study: a subsequent qualitative survey of around one-tenth of those who had installed the RegretsReporter extension and participated in the study. Those Mozilla spoke to said that they appreciated that the inputs were directed at specific videos and channels, but that they expected them to more broadly inform YouTube’s recommendation algorithm.
“I thought that was an interesting theme because it reveals that this is people saying: ‘This is not just me telling you I blocked this channel. This is me trying to exert more control over the other kinds of recommendations I’m going to get in the future,’” says Ricks. Mozilla recommends in its research that YouTube allow users more options to proactively shape their own experiences by outlining their content preferences—and that the company do a better job of explaining how its recommendation systems work.
For McCrosky, the key issue is that there’s a gap between what users understand YouTube’s algorithmic inputs to mean and what those inputs actually do. “There’s a disconnect in the degree to which they’re respecting those signals,” he says.