• StarServal@kbin.social · 10 upvotes · 1 year ago

      I’ve watched someone I know who only gets their news through Facebook descend into Q-dom over the last 5 years. Whenever I hear about a new thing conservatives are doing or saying, I can be sure that person will be doing or saying it within a week… which then feeds right back into Facebook for others.

      • NaoPb@beehaw.org · 2 upvotes · 1 year ago

        Thanks for the confirmation. I bet they don’t even notice it happening. Though this could happen to anyone on any side of the spectrum. It’s sad that this is what the internet has become.

    • FIash Mob #5678@beehaw.org · 6 upvotes · 1 year ago

      Yeah, doesn’t seem complicated to me at all. Their algorithm is programmed to keep people angry, engaged, and 100% convinced that their opinion is right (no matter what that opinion is).

      Keeps people clicking on shitty ads and buying stupid crap.

    • fmstrat@lemmy.nowsci.com · 6 upvotes · 1 year ago

      Sure it is. Is it Meta’s algorithm, is it user reach, is it paid ads, is it channels, is it memes, is it leaning, is it…

      Meta is participating in a pretty big study with actual researchers here. I’m no Meta fan, and I’m sure this is partly for PR, but it’s a really good thing that more social media companies should do.

      • realslef@fedia.io · 1 upvote · 1 year ago

        Those seem like “how” or “why” questions to me. More complicated. The big one is “how do we prevent it,” and I bet we won’t get an honest answer from big social themselves. That’s why there should be independent public research.

  • fmstrat@lemmy.nowsci.com · 7 upvotes · 1 year ago (edited)

    “The US 2020 Facebook and Instagram Election Study is a joint collaboration between a group of independent external academics from several institutions and Meta.”

    “Now we have the first results from this unusual collaboration, detailed in four separate papers—the first round of over a dozen studies stemming from the project.”

    “We also find that popular proposals to change social media algorithms did not sway political attitudes.”

    “In other words, pages and groups contribute much more to segregation than users.”

    “Finally, the vast majority of political news that Meta’s third-party fact-checker program rated as false was viewed by conservatives, compared to liberals. That said, those false ratings amounted to a mere 0.2 percent, on average, of the full volume of content on Facebook. And political news in general accounts for just 3 percent of all posts shared on Facebook, so it’s not even remotely the most popular type of content.”

    This last bit is key. Rated-false content was 0.2 percent of all content and political news was 3 percent of all posts, so even if every rated-false post were political, misinformation would amount to at most about 7% of political posts (0.2 ÷ 3 ≈ 6.7%), and less than that if some of the false content was nonpolitical. It was mostly viewed by conservatives, and they do not state which way this misinformation leans.
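    A quick back-of-the-envelope check of that bound (a sketch in Python; the 0.2 percent and 3 percent figures come from the quoted article, and treating all rated-false content as political is the worst-case assumption):

```python
# Upper bound on misinformation as a share of political posts,
# using the two figures quoted from the article above.
false_share_of_all_content = 0.002   # 0.2% of all Facebook content rated false
political_share_of_all_posts = 0.03  # 3% of all posts are political news

# Worst case: assume every rated-false post was political news.
upper_bound = false_share_of_all_content / political_share_of_all_posts
print(f"At most {upper_bound:.1%} of political posts were rated false")
# -> At most 6.7% of political posts were rated false
```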