• CanadaPlus@lemmy.sdf.org · edited 36 minutes ago

      If it were identifying Lemmy users, it definitely would be. But it’s a tool that reveals the identities of a small, supposedly accountable group during real-life interactions, and we’re just mentioning it, so there seems to be at least an argument for allowing it.

    • Gaywallet (they/it)@beehaw.org · 3 hours ago

      I would love to hear what has you concerned about a tool that provides a piece of information which, by law (California Penal Code Section 830.10), is supposed to be accessible to all individuals interacting with the officer: their name and/or badge number.

      • Boomkop3@reddthat.com · 2 hours ago

        The concern was that I didn’t know this information was public. I see now that it’s in the article; I must have read past it.

        • Gaywallet (they/it)@beehaw.org · 1 hour ago

          Even if an officer’s name and badge number were not public (which would be weird, because both are part of a police officer’s uniform), what is the concern about a tool that provides them?

            • Gaywallet (they/it)@beehaw.org · 22 minutes ago

              You believe that a police officer, who is doing public actions, in a public role, should be given privacy while performing public actions? Say more

    • Vodulas [they/them]@beehaw.org · edited 4 hours ago

      A. No, this is an article talking about the tool.

      B. Cops are public figures. Name and badge number are public information, hence the first sentence in the article stating that it uses public records. It does not give their address or phone number. It is not doxxing.

      • Boomkop3@reddthat.com · 2 hours ago

        A. A gun is a tool as well; that doesn’t mean you should make them publicly available

        B. That makes a lot of sense. I’m not from around there, sorry for the misunderstanding

        • icelimit@lemmy.ml · edited 52 minutes ago

          In your example for (A), you’ve unnecessarily compared a controversial item to a tool built on information that is publicly available and cannot be used to do harm beyond holding individuals accountable, something law enforcement needs doubly so.

          A more comparable example for (A) would have been something like the location of a police station, which of course needs to be public and publicly announced.

        • Vodulas [they/them]@beehaw.org · 2 hours ago

          But an article about how guns are used, and that they exist, is not the same as selling them. I can see the argument that you shouldn’t even report on them because it makes them more popular, but at least in the US, guns have pretty thoroughly permeated society.

          • Boomkop3@reddthat.com · 54 minutes ago

            I think it’s more akin to a “get guns ez pz” article. Even if most people can get them, a lot of people don’t because it’s a hassle. But to be fair, if it’s public information then heck, it was only a matter of time until there was a website making it ez pz.

            That’s not this article’s fault. And some important context I managed to miss at first :/

            • Vodulas [they/them]@beehaw.org · 13 minutes ago

              Still very US centric, but guns are incredibly easy to get here. I live in a “progressive” state, and I don’t even have to take a single class to get one or to get a concealed carry permit.

  • Megaman_EXE@beehaw.org · edited 24 hours ago

    I find this funny because the police have been doing this with civilians. My main concern is that this tech is not 100% accurate. I feel like it shouldn’t be used on its own.

    I guess if it were used as a supplementary tool and not as the main piece of evidence, it could maybe be okay? But I would be scared of it targeting an innocent individual, which could have very negative or dangerous consequences. The main thing would be accuracy. I don’t know if that was addressed, as part of the article is paywalled.

    Edit: I think anything that forces those in power to take accountability for their actions is great though. More tools should be in place to prevent abuse of power

    • BurningRiver@beehaw.org · 18 hours ago

      I find this funny because this:

      > My main concern is that this tech is not 100% accurate. I feel like it shouldn’t be used on its own.

      is generally the least of their concerns.

    • Scrubbles@poptalk.scrubbles.tech · 20 hours ago

      That’s the nuance of AI that anyone who has done actual work with ML has known for decades now. ML is amazing, but it’s not perfect. It’s actually pretty far from perfect. So you should never, ever use it as a solo check, but it can be great as a double check.

      Take cancer: AI can be a wonderful tool for detecting a melanoma, if used correctly. Such as:

      • a doctor has already cleared a mole, but you want to know whether it warrants a second opinion from another doctor. You could require the model to report a confidence of, say, 80% that the first doctor is correct and the mole is fine.

      • if you do not have immediate access to a doctor, it can be a fine check, again only up to a certain confidence. Say you are worried but cannot easily see a doctor. A patient could snap a photo, and a very high confidence rating would say it is probably fine, with a disclaimer that it is just an AI model and that if the mole changes, or you are still worried, you should get it checked.

      Unfortunately, all of that nuance (that it is all just probabilities) is completely lost on the creators of these AI tools, and the risks are not actually communicated to the users, so blind trust is the number one problem.

      We see it here with police too. “It said it’s them.” No, it only said, to a specific confidence, that it might be them. That’s a very different thing. You should never use it to find someone, only to verify someone.

      I actually really like how airport security implemented it because it’s actually using it well. Here’s an ID, it has a photo of a person. Compare it to the photo taken there in person, and it should verify to a very high confidence that they are the same person. If in doubt, there’s a human there to also verify it. That’s good ML usage.
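The verify-don’t-identify pattern described above can be sketched in a few lines. This is a minimal illustration, not any real system’s code; the 0.90 threshold and function names are made-up assumptions:

```python
def verify(similarity: float, threshold: float = 0.90) -> str:
    """1:1 verification: one probe photo against one claimed identity.

    The model never answers "it is them" -- it reports a similarity
    score, and anything below the threshold is escalated to a human,
    as in the airport ID check described above.
    """
    if similarity >= threshold:
        return "match"
    return "refer to human"

# ID photo vs. photo taken at the gate (illustrative scores):
print(verify(0.97))  # clear agreement -> "match"
print(verify(0.71))  # ambiguous -> "refer to human"
```

The key design point is that the function’s low-confidence branch is a handoff to a person, never an automatic "no match" or "found them."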

    • Powderhorn@beehaw.org · 18 hours ago

      I’m miles away from AI, so this may be me talking out of my ass, but shouldn’t searching a smaller database (thousands) be more accurate than searching anything orders of magnitude larger?
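That intuition actually checks out for one-to-many search. A back-of-the-envelope sketch (assuming independent comparisons and an illustrative, made-up per-comparison false-positive rate) shows how the chance of at least one false match grows with gallery size:

```python
def p_any_false_match(fpr: float, gallery_size: int) -> float:
    """Probability that at least one of N comparisons is a false match,
    assuming each comparison is independent with false-positive rate fpr."""
    return 1 - (1 - fpr) ** gallery_size

# Illustrative per-comparison false-positive rate of 0.01%:
print(p_any_false_match(1e-4, 10_000))     # ~0.63
print(p_any_false_match(1e-4, 1_000_000))  # ~1.00
```

So with the same per-comparison accuracy, a gallery of thousands produces far fewer spurious hits than one of millions, which is one reason an officer database is an easier problem than a whole-population one.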

  • Powderhorn@beehaw.org · 1 day ago

    Up next: LAPD starts covering their faces and requests an emergency budget boost for masks.