YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest::New research shows that a person’s ideological leaning might affect what videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content.
I’m not sure it’s just right-leaning users. I’m pretty far to the left, and I keep getting anti-trans, anti-COVID right-wing talking points quite frequently. I keep pressing thumbs down, but they keep coming.
What YouTube sees:
“These videos keep eliciting reactions from users, which means that they prefer to engage with this content. This bodes well for our advertisers.”
Thumbs down actually makes YouTube recommend more of that content, because you’ve engaged with it. As others have said, the best way to avoid recommendations you don’t want is to just ignore the content and not watch it. If it’s being recommended to you, click the three dots on the video thumbnail and choose “Not interested” / “Don’t recommend channel”.
They’re supposed to enrage you so you use their platform longer, hate-share the videos so others use their platform, etc. They know what they’re doing.
Yeah, I’ve gotten similar results too. FWIW, I don’t think downvoting is a good way to change your results; the algorithm seems to key into any interaction at all, as well as watch time. As soon as I see certain people, I just swipe away immediately.
As tempted as I am to reply “Well, duh,” I suppose it’s good that we’re getting research to back up what we already knew.
It’s such a slippery slope. I avoid anything even mildly right-leaning to keep my algorithm clean. I sometimes watch stuff incognito to preserve my account’s algorithm. What a world…
LiberalGunNut™ here! (Yes, we exist.) I do not experience this. Bear with me a moment…
I consume loads of gun related content on YouTube. Historical, gunsmithing, basic repair, safety, reviews, testing, whatever. My favorite presenters are apolitical, or at least their presentations are.
My recommendations should be overrun with right-wing bullshit. Yet they are not. My recommendations are more of the same, and often include interesting and related media. I may stray off into other fringe areas like prepping, but even that doesn’t get radical, and my feed comes back to center in a hurry.
Can someone explain what I’m seeing here?
As a side note, I do experience this with my default “news” tab on Edge. Yes, it’s 95% crap, but I sometimes see real news I want to follow up on. But fuck me, one time I clicked on a GenA vs. GenB article, and I was flooded with it. My Android feed does the same. Clicked on a couple of stories about wild pigs: flooded. Hummingbird story? Flooded.
But I’m not getting this on YouTube. 🤷🏻‍♂️
They don’t do it to everyone. Some people get put in test groups that get ‘nice’ algorithms that don’t try to make you angry, so they can measure the effect on their revenue.
It is entirely possible that YouTube’s algorithm doesn’t see you as someone interested in right-wing rhetoric (or perhaps you’ve also downvoted such videos).
They explain how it works here (without technical details): https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/