• abhibeckert@lemmy.world · 10 months ago

    They’re not worried about CSAM. They’re worried about TikTok users being influenced during an election campaign.

    And yes, it is a moderation issue. Specifically, the US doesn’t want the current moderation team to be in charge of moderation.

    Disclosure: I don’t use Facebook, Instagram, Twitter, or TikTok.

    To put it in perspective, about a quarter of the US population uses TikTok. And politics is a major discussion point, with the political content you’re exposed to selected by an algorithm that is opaque and constantly changing.

    It absolutely can be used to change the result of an election. And China has meddled in elections in the past (not least of all their own elections… but also foreign ones):

    “China has been interfering with every single presidential election in Taiwan since 1996, either through military exercises, economic coercion, or cognitive warfare, including disinformation or the spread of conspiracies”

    https://www.afr.com/world/asia/taiwan-warns-of-disturbing-election-interference-by-china-20240102-p5eunf

    • umami_wasabi@lemmy.ml · 10 months ago

      It’s not uncommon to see misinformation, and even outright fabricated information, appear on many SNS platforms, including Facebook and Twitter. It’s also not unheard of for Russia to use popular US-based social media to influence elections. All SNS are subject to the same problem; TikTok just has more active users and therefore greater reach. But again, this is a content moderation problem, not an inherent fault of TikTok itself. Who performs content moderation is a business decision. It should not be dictated by law, though lawmakers can set moderation standards that companies need to comply with. I think targeting only TikTok is a bit unfair; the rules should be universal.


      EDIT:

      “political content you’re exposed to selected by an algorithm that is opaque and constantly changing”

      Hasn’t TikTok opened access to its algorithm for review?

      Actually, it is not solely a content moderation problem. While some dumb and physically harmful content should be subject to moderation, speech should be protected. Isn’t America all about the word “Freedom”? People should be free to say what they believe, right?

      However, the recommendation algorithms might need some regulations that categorize content and have relevant display policies. For example, political content, user generated and advertisement, should be distributed equally for all views (i.e. a user will see content for all candidates for roughly same amount of time). The “addictive” thing shouldn’t be regulated as that the point of the algorithm: maximize user engagement. However, there could be a rating system similar to game ratings that affect who at what age can use which platform. Otherwise, it should be free for one to addict to something, as long as it doesn’t cause a physical harm to himself and others.