YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest

New research shows that a person’s ideological leaning might affect which videos YouTube’s algorithms recommend to them. For right-leaning users, recommendations are more likely to come from channels that share political extremism, conspiracy theories, and otherwise problematic content.

  • @Stovetop@lemmy.world

    What YouTube sees:

    “These videos keep eliciting reactions from users, which means that they prefer to engage with this content. This bodes well for our advertisers.”

    • @Steve

      Exactly. If you don’t want to see them, the best thing is to ignore them.

      • @xangadix@lemmy.world

        I actually report every single one of them, usually the channel too, for hate speech. That seems to keep them out of my feed.