YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest::New research shows that a person’s ideological leaning might affect what videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content.
LiberalGunNut™ here! (Yes, we exist.) I do not experience this. Bear with me a moment…
I consume loads of gun-related content on YouTube. Historical, gunsmithing, basic repair, safety, reviews, testing, whatever. My favorite presenters are apolitical, or at least their presentations are.
My recommendations should be overrun with right-wing bullshit. Yet they are not. They're just more of the same, often with interesting related media mixed in. I may stray off into other fringe areas like prepping, but even that doesn't get radical, and my feed comes back to center in a hurry.
Can someone explain what I’m seeing here?
As a side note, I do experience this with the default "news" tab on Edge. Yes, it's 95% crap, but I sometimes see real news I want to follow up on. But fuck me, one time I clicked on a GenA vs. GenB article and got flooded with it. My Android feed does the same thing. Clicked on a couple of stories about wild pigs? Flooded. Hummingbird story? Flooded.
But I’m not getting this on YouTube. 🤷🏻‍♂️
It is entirely possible that YouTube’s algorithm doesn’t see you as someone interested in right-wing rhetoric (or perhaps you’ve downvoted such videos).
They explain how it works here (without technical details): https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/
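Roughly, the idea is per-user scoring: candidate videos get ranked against your own watch history and explicit feedback like dislikes. Here's a toy sketch of that idea in Python; the topic names, weights, and the `score_video` function are all made up for illustration, not anything from YouTube's actual system:

```python
# Toy illustration of history-based personalized ranking.
# Nothing here reflects YouTube's real signals or weights.
from collections import Counter

def score_video(video_topics, watch_history, downvoted_topics):
    """Score a candidate video by overlap with topics the user watches,
    minus a penalty for topics the user has downvoted."""
    affinity = Counter(watch_history)  # e.g. {"gunsmithing": 12, "firearm history": 5}
    score = sum(affinity[t] for t in video_topics)
    penalty = sum(5 for t in video_topics if t in downvoted_topics)
    return score - penalty

# A user who watches apolitical gun content and has downvoted political videos:
history = ["gunsmithing"] * 12 + ["firearm history"] * 5 + ["safety"] * 3
downvoted = {"political commentary"}

candidates = {
    "Lathe work on a 1911 barrel": ["gunsmithing"],
    "Why the other side is ruining America": ["political commentary", "firearm history"],
    "Range safety basics": ["safety"],
}

ranked = sorted(candidates,
                key=lambda v: score_video(candidates[v], history, downvoted),
                reverse=True)
print(ranked)
# The apolitical videos outrank the political one, which the downvote penalty drags down.
```

The point is just that the ranking is personal: heavy watch time on apolitical channels plus a few downvotes can keep the political stuff out of your feed, even when other users watching similar content get steered toward it.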
They don’t do it to everyone. Some people get put into test groups that see a ‘nicer’ algorithm that doesn’t try to make them angry, so YouTube can measure the effect on revenue.