Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were overwhelming. Those images are burnt into my mind and I would love to get rid of them, but I don’t know how, or if it is even possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images. A software platform that makes it nearly impossible for Beehaw to host, in any way, CSAM.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

  • snowe@programming.dev · 1 year ago

    The solution is to use an already existing software product that solves this, like CloudFlare’s CSAM Detection. I know people on the fediverse hate big companies, but they’ve solved this problem numerous times before. They’re the only ones allowed access to CSAM hashes; Lemmy devs and platforms will never get access to the hashes (for good reason).

    • thySatannic@beehaw.org · 1 year ago

      Wait… why is no access to csam hashes a good thing? Wouldn’t it make it easier to detect if hashes were public?! I feel like I’m missing something here…

      • snowe@programming.dev · 1 year ago

        Giving public access to CSAM hashes means anyone wanting to avoid detection simply has to check what they’re about to upload against the database. If it matches, they modify the image until it doesn’t. It’s practically guaranteed to make the problem worse, not better.
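[Editor's note] To illustrate the mechanics snowe is describing: hash-based scanning only flags content whose hash is already in the database, which is why an attacker with read access to that database could keep tweaking a file until it no longer matches. A minimal sketch, using a hypothetical blocklist and plain SHA-256 (real scanners such as Cloudflare's use secret perceptual hashes so that small edits still match, but the matching principle is the same):

```python
import hashlib

# Hypothetical private blocklist of known-bad hashes (illustrative value only).
BLOCKLIST = {hashlib.sha256(b"known-bad-bytes").hexdigest()}

def is_flagged(data: bytes) -> bool:
    """Return True if the upload's hash appears in the blocklist."""
    return hashlib.sha256(data).hexdigest() in BLOCKLIST

print(is_flagged(b"known-bad-bytes"))   # exact match is caught
print(is_flagged(b"known-bad-bytes!"))  # any change to the bytes defeats a plain hash
```

This also shows why the hashes themselves are kept secret: if the blocklist were public, it would act as an oracle that tells an uploader exactly when their modifications have succeeded in evading detection.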