• unexposedhazard@discuss.tchncs.de
    1 year ago

    This is neither novel, nor morally acceptable. People who do this work usually end up traumatized for life because of the fucked up shit they often have to look at. Prisoners are not really in a position to negotiate, meaning this work can be pushed on them in a nonconsensual way that falls below what modern society should strive for.

    • SCB@lemmy.world
      1 year ago

      If you actually read the article, she’s parsing real estate news articles.

      Most AI jobs do not involve CP.

      • Hamartiogonic@sopuli.xyz
        1 year ago

        Rule number 1 on Reddit is: “never read the article”.

        I guess that still applies here.

        Rule 2: “disagree with everyone”

        Rule 3: “you’re always right”

        Rule 4: “everyone else is always wrong”

        I’m sure there are lots of other rules, but that should get anyone started on modern social media.

    • hh93@lemm.ee
      1 year ago

      Training an AI is not traumatizing. What you’re thinking of is moderating public networks.

      • unexposedhazard@discuss.tchncs.de
        1 year ago

        Unfortunately, one major sector of image machine learning is CSAM scanning, which was also recently revealed as one of the major funding parties behind the planned EU legislation intended to allow scanning of all private communication. Generally I agree that most of what they will see might not be too bad by itself, but it’s still a job no human really wants to do of their own free will. Those who do decide to do it are acting either out of a lack of choice or because they don’t know what they are getting themselves into.

      • 30mag@lemmy.world
        1 year ago

        It depends.

        Around the world, millions of so-called “clickworkers” train artificial intelligence models, teaching machines the difference between pedestrians and palm trees, or what combination of words describe violence or sexual abuse.

    • Vipsu@lemmy.world
      1 year ago

      Prisoners are not really in a position to negotiate, meaning you can push this work on them in a sort of nonconsensual way that is below what modern society should strive for

      Well, the article does mention that the prisoner “Marmalade” was not forced to do any of this.
      In fact, it mentions that she could have spent her time in her cell, doing online courses, or doing chores for the prison for a little cash. The fact that Wired managed to simply book an interview with the prisoner also makes it quite risky for the company to subject the prisoners to any traumatizing material.

      The only problem I really see with this is that it doesn’t really prepare the prisoners for life outside the prison in any way.

      • unexposedhazard@discuss.tchncs.de
        1 year ago

        Forgive me for not trusting the investigative journalism capabilities of fucking “Wired”. How much of a choice that person really had is not something you can judge from an outside perspective. If they made this the highest-paying job, then there probably is no real choice.

    • SCB@lemmy.world
      1 year ago

      Idk man, if you read the article, it’s a pretty good system.

      • woodcroft@lemmy.world
        1 year ago

        Expecting people to read the article before they form an opinion is asking quite a bit - I won’t do more than point out the hypocrisy here.

        You are right - it’s an interesting system being tested in Finland’s unique prison system. Marmalade seemed to enjoy it!

  • Sibbo@sopuli.xyz
    1 year ago

    The article gives a really nice perspective on how morally questionable this is.

    The company gets cheap, Finnish-speaking workers, while the prison system can offer inmates employment that, [the company] says, prepares them for the digital world of work after their release.

    Yeah sure, doing more data labelling? I highly doubt data labelling gives anyone any skill besides data labelling. Luckily this article doesn’t just accept the company’s statement, but questions it very critically.

    • eltimablo@kbin.social
      1 year ago

      Hey, that’s not fair. What if there’s a boom in companies that need to know if it’s a hot dog or not in the next 10 years?

    • boredtortoise@lemm.ee
      1 year ago

      “Fun” fact: one Finnish content moderation AI service was founded by a far-right party member (and an ex-cop to boot). Not surprisingly, the forums and tabloid comment fields the system “moderates” are known to be filled with racism and other hate speech.

      (Utopia Analytics and Tom Packalén if someone wants to search for more)