cross-posted from: https://lemmy.blahaj.zone/post/7992691

There are some straightforward opportunities for short-term safety improvements, but this is only the start of what’s needed to change the dynamic more completely.

This is a draft, so feedback welcome!

  • Scrubbles@poptalk.scrubbles.tech · 10 months ago

    As an instance admin, I would be all in favor of some sort of group blocklist I could subscribe and unsubscribe to. Maintained lists of things like CSAM, loli, or other material would be extremely helpful, and would encourage more people to run their own servers.

    While I get that individuals want free and open and a choice, the fact is that they aren’t the ones running these servers. It’s admins who have to take on the risk of hosting for other users - and giving admins an easy safety net out of the gate would really help.

    I’ve imagined something like an API through Fediseer that my instance could hook into, then select specific lists that I want to block.
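
    A minimal sketch of that idea is below. It is purely illustrative: the base URL, endpoint path, list names, and response fields are invented placeholders, not Fediseer’s actual API.

    ```python
    # Hypothetical sketch of an admin tool that subscribes to curated blocklists
    # from a Fediseer-like aggregator. All endpoint paths, parameters, and field
    # names here are invented for illustration -- not the real Fediseer API.

    import requests

    AGGREGATOR = "https://fediseer.example"   # hypothetical aggregator base URL
    SUBSCRIBED_LISTS = ["csam", "spam"]       # lists this admin opts into

    def fetch_blocked_domains(list_name: str) -> set[str]:
        """Fetch the set of domains on one curated list."""
        resp = requests.get(f"{AGGREGATOR}/api/v1/blocklists/{list_name}", timeout=30)
        resp.raise_for_status()
        return {entry["domain"] for entry in resp.json()["instances"]}

    def build_defederation_list() -> set[str]:
        """Union of all subscribed lists; the admin still reviews before applying."""
        blocked: set[str] = set()
        for name in SUBSCRIBED_LISTS:
            blocked |= fetch_blocked_domains(name)
        return blocked

    if __name__ == "__main__":
        for domain in sorted(build_defederation_list()):
            print(domain)   # could be fed into the instance's defederation config
    ```

    The point of the sketch is the subscribe/unsubscribe model: the admin picks which lists to trust, and the tooling only aggregates them rather than imposing a single global list.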

    • density@kbin.social · 10 months ago

      Imagining for a second that I had the technical ability to do so: the thought of running a Lemmy server and letting random people make accounts sounds scary to me, especially a “general purpose” one. I would feel responsible for the crappy stuff posted by users. How do people cope with that?

      I also would not be able to conduct the “investigations” required to determine if an instance hosts CSAM etc., because that means you have to go and check it out! We can’t have a system where every admin is basically required to view CSAM. That’s crazy.

      • Scrubbles@poptalk.scrubbles.tech · 10 months ago

        Not just “feel responsible”: under the law, we admins are in fact legally responsible. I never thought I’d be a mandatory reporter and have contacts at the National Center for Missing and Exploited Children - but here I am.

        People love to say the fediverse should be completely anonymous and open to free speech - but only a very tiny percentage of them are willing to actually spin up an instance to prove it.

        • haui@lemmy.giftedmc.com · 10 months ago

          That’s why I dismiss free speech absolutists on sight. It’s also why I have an invite-only instance. The people on my instance are peeps I’ve known for a long time and trust.

    • rglullisA · 10 months ago

      From previous interactions with the author, I am convinced he is not really interested in the growth of the fediverse and is more than willing to sacrifice anything if it keeps it small and on the fringes. As much as I try to steelman his arguments, I cannot find good reasoning behind them. At best, it is just a reactionary attempt to keep the fediverse exclusive to some minority. At worst, it becomes a way to subject everyone to an ESG-compliance racket: “Nice instance you have over there, it would be a shame if it was marked as the home of nazis…”

      • The Nexus of Privacy@lemmy.blahaj.zone (OP) · 10 months ago

        No, as the article says at the very beginning, it’s that I think a big reason the fediverse isn’t growing is its failure to deal with safety.

        • rglullisA · 10 months ago

          This is the type of argument that makes you less credible, because even if I take what you are saying at face value, it shows how all your logic is biased. If “failure to deal with safety” were such a big impediment to mass adoption, how come the Big Tech alternatives still attract billions of users?

    • xigoi@lemmy.sdf.org · 10 months ago

      Who is the judge of these blocklists?

      The Ministry of Truth, of course.

      • density@kbin.social · 10 months ago

        Who is the judge of the server-side code? What about the terrible green-on-white default Lemmy color scheme? Who is the judge of HTTPS? Who is the judge of the physical infrastructure of the internet? Who is the judge of WiFi 6?

        omg it goes so deep judges everywhere judging me!!!

    • density@kbin.social · 10 months ago

      global blocklists

      Good thing nobody suggested that… And if they did, it would be completely unenforceable.

      • The Nexus of Privacy@lemmy.blahaj.zone (OP) · 10 months ago

        Yep. But even though I didn’t suggest it, I also didn’t explicitly say that it didn’t mean global blocklists. So I clarified it, and added a footnote with more detail.

        Since instance-level federation decisions reflect norms, policies, interpretations, and (sometimes) strategy, opinions differ on the definition of “bad actor.” So the best approach is probably to present the admin of a new instance with a range of recommendations to choose between based on their preferences. Software platforms should provide an initial vetted list (along with enough information for a new admin to do something sensible), and hosting companies and third-party recommenders should also be able to provide alternatives.
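
        To make the “range of recommendations” concrete, here is a rough sketch of what a setup wizard could present to a new admin. Every name, maintainer, and URL below is an invented placeholder, not a real list or organization.

        ```python
        # Hypothetical sketch: how install-time software could offer several
        # blocklist recommendations for a new admin to choose from.
        # All names and URLs are invented placeholders.

        from dataclasses import dataclass

        @dataclass
        class BlocklistRecommendation:
            name: str          # human-readable label shown to the admin
            maintainer: str    # who curates it: platform, host, or third party
            source_url: str    # where the list is published
            description: str   # enough information for the admin to decide

        RECOMMENDATIONS = [
            BlocklistRecommendation(
                name="Platform default (minimal)",
                maintainer="software project",
                source_url="https://platform.example/blocklists/minimal.json",
                description="Only instances widely reported for CSAM or spam.",
            ),
            BlocklistRecommendation(
                name="Hosting provider baseline",
                maintainer="hosting company",
                source_url="https://host.example/blocklists/baseline.json",
                description="The host's own moderation baseline for customers.",
            ),
            BlocklistRecommendation(
                name="Third-party strict",
                maintainer="independent recommender",
                source_url="https://recommender.example/strict.json",
                description="Broader list reflecting stricter community norms.",
            ),
        ]

        def present_choices() -> None:
            """Print the options a first-run setup wizard might show."""
            for i, rec in enumerate(RECOMMENDATIONS, start=1):
                print(f"{i}. {rec.name} ({rec.maintainer})")
                print(f"   {rec.description}")
                print(f"   source: {rec.source_url}")
        ```

        The key design point is that the new admin chooses among clearly described, independently maintained options rather than inheriting one mandatory global list.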

    • haui@lemmy.giftedmc.com · 10 months ago

      You can immediately see who is an admin and who is not.

      If you have an open instance and someone puts CSAM on it and reports you, you’re toast. That’s what blocklists are for, and they aren’t new. Mastodon instances already have blocklists, which every sane admin uses. They even come in different flavors.

      The way this works is that certain instances “vote” on blocks by applying blocks to particular instances, and if one, multiple, or all “trusted” instances block an address, the block “federates” through these updated lists (see the sketch below).
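
      A minimal sketch of that voting/threshold idea, assuming a hard-coded set of trusted instances and an invented threshold parameter; the instance and domain names are placeholders.

      ```python
      # Hypothetical sketch of the "voting" mechanism described above: a domain
      # joins the shared blocklist once enough trusted instances already block it.
      # Trusted-instance names, domains, and the threshold are illustrative only.

      TRUSTED_INSTANCES = {
          "trusted-a.example": {"badsite.example", "spamhub.example"},
          "trusted-b.example": {"badsite.example"},
          "trusted-c.example": {"badsite.example", "other.example"},
      }

      def consensus_blocklist(votes_required: int) -> set[str]:
          """Return domains blocked by at least `votes_required` trusted instances."""
          counts: dict[str, int] = {}
          for blocked in TRUSTED_INSTANCES.values():
              for domain in blocked:
                  counts[domain] = counts.get(domain, 0) + 1
          return {domain for domain, n in counts.items() if n >= votes_required}

      # votes_required=1 mirrors "any trusted instance blocks it";
      # votes_required=len(TRUSTED_INSTANCES) mirrors "all of them do".
      print(sorted(consensus_blocklist(votes_required=2)))   # -> ['badsite.example']
      ```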