People feel like they don’t have control over their YouTube recommendations…
Our 2021 investigation into YouTube’s recommender system uncovered a range of problems on the platform: an opaque algorithm, inconsistent oversight, and geographic inequalities. We also learned that people feel they don’t have control over their YouTube experience — particularly the videos that are recommended to them.
YouTube says that people can manage their video recommendations through the feedback tools the platform offers. But do YouTube’s user controls actually work? Our study shows that they really don’t.
In the qualitative portion of our study, we learned that people do not feel in control of their experience on YouTube, nor do they have clear information about how to curate their recommendations. Many people take a trial-and-error approach, using YouTube’s hodgepodge of options — “Dislike,” “Not Interested,” and other buttons — to try to shape what they see. It doesn’t seem to work.
We also ran a randomized controlled experiment across our community of RegretsReporter participants to directly test the effectiveness of YouTube’s user controls. We found that these controls somewhat influence what is recommended, but the effect is meager: most unwanted videos still slip through.
Even the most effective feedback methods prevent less than half of bad recommendations.
Our main recommendation is that YouTube should give people real power to shape what they see. Specifically:
YouTube’s user controls should be easy to understand and access. People should be provided with clear information about the steps they can take to influence their recommendations, and should be empowered to use those tools.
YouTube should design its feedback tools in a way that puts people in the driver’s seat. Feedback tools should enable people to proactively shape their experience, with user feedback given more weight in determining what videos are recommended.
YouTube should enhance its data access tools. YouTube should provide researchers with access to better tools that allow them to assess the signals that impact YouTube’s algorithm.
Policymakers should protect public interest researchers. Policymakers should pass and/or clarify laws that provide legal protections for public interest research.