• DicJacobus@lemmy.world · 7 days ago · +12/−1

    I'm not going to comment on whatever he's commenting on.

    I'm just going to reaffirm that Tim Sweeney is a fucking moron, in any and all cases.

  • deltaspawn0040@lemmy.zip · 7 days ago · +10/−1

    That’s not even what gatekeeping means. Unless he’s trying to stand up for the universal right to participate in the child porn fandom.

      • deltaspawn0040@lemmy.zip · 7 days ago (edited) · +3/−1

        Oh that’s an even worse (and probably accurate) interpretation.

        “How are we supposed to do business if there are consequences for our actions?!”

  • Soapbox@lemmy.zip · 7 days ago · +9/−2

    Just look at that guy. If you were to ask 1000 people to describe what they thought a typical CSAM viewer looked like and averaged their responses together, you would get something like this photo of Tim Sweeney.

  • nowwhernews@lemmy.world · 7 days ago · +5

    If my political opponents are actually sexual predators and their speech is sexual harassment, I’m down with censoring them. That should be the least of their problems.

  • Rose@slrpnk.net · 7 days ago · +4

    There’s this old adage, “never attribute to malice that which can be explained by stupidity”.

    Tim Sweeney is very ignorant. However, he’s also pretty malicious. His fedora’d waffling should probably be taken exactly for what it is.

    • JcbAzPx@lemmy.world · 7 days ago · +2

      I absolutely hate Hanlon’s razor. It is only ever used to try to protect obviously malicious people.

  • termaxima@slrpnk.net · 7 days ago · +4

    If there is one gate that definitely needs keeping, it is the kindergarten’s gate. Don’t let those creeps get away with it…

  • yermaw@sh.itjust.works · 7 days ago · +2

    I bet he’s kind of right. Here in the UK we just lost a whole bunch of rights and privacies online under the guise of “protect the kids”, but it’s kind of weird to be piping up against it when there are actually protections needed.

    • Rose@lemmy.zip · 7 days ago (edited) · +2

      It would be weird if it were the only time he’s called out the Google and Apple monopolies and their control over apps, but it’s been a running theme for him (and his legal battles). Two examples from a quick lookup:

      • His tweet where he calls out Apple for removing privacy apps at the request of Russia, and for allegedly threatening to remove Twitter in 2024.
      • His tweet on Apple removing the Russian social media app VK following the US sanctions related to the Russian invasion of Ukraine in 2022.

      Personally, I see no issue with platforms removing content they deem to be problematic, and I’m sure Sweeney agrees, given that the Epic store prohibits pornography, for example. However, as he’s said repeatedly, Apple in particular is unique in that its removing an app means there’s practically no way for an iPhone user to access it, since there’s no sideloading.

      If it were an app dedicated to CSAM, I don’t think anyone would take issue, but his argument is that removing the app would deplatform all of its 500M users, most of whom are probably not pedos. I’m critical of people being on X, but it’s also undeniable that despite the far-right leaning and CEO, there are still leftists and people belonging to minority groups who are on it, for whatever reason. Are they pedophile and Nazi enablers too? I’m inclined to say yes, but I don’t know how many people would agree.

      Edit: Format and details.

    • JcbAzPx@lemmy.world · 7 days ago · +1/−1

      This isn’t really a change, though, I’m pretty sure. People have been able to make photo-realistic depictions for a lot longer than AI has existed, and those have rightfully been held to be illegal in most places because the confusion they cause makes it harder to stop the real thing.

      • chiliedogg@lemmy.world · 7 days ago · +7

        I think the difference here is that Twitter has basically installed a “child porn” button. If their reaction had been to pull the product and install effective safeguards, it wouldn’t be as bad. It’s a serious fuckup, but people screw up every day.

        Instead, they’ve made it so you can pay them to have access to the child porn generator.

      • yermaw@sh.itjust.works · 7 days ago · +2

        It’s not really a change so much as it’s suddenly incredibly easy for anyone of any ability to do it as much as they want, with near-seamless results.

        Every year it’s gotten easier and easier to do it more and more believably, but suddenly all you have to do is literally ask the computer and it happens. The line has to be drawn somewhere.

  • Cruel@programming.dev · 7 days ago · +1/−5

    How is he wrong?

    What images can I make in Grok that can’t be done with Gemini or GPT?

    • Rose@lemmy.zip · 7 days ago · +3/−1

      He’s wrong because he’s not Gabe Newell. On a more serious note, the 404 report cited by the PCGamer article basically supports your point, though with the caveats that X and Musk are bad for other reasons and that those generated images make it into people’s feeds:

      The major, uhh, downside here is that people are using Grok for the same reasons they use AI elsewhere, which is to nonconsensually sexualize women and celebrities on the internet […]

      The situation on other platforms is better because there are fewer Nazis and because the AI-generated content cannot be created natively in the same feed, but essentially every platform has been polluted with this sort of thing, and the problem is getting worse, not better.

  • pyre@lemmy.world · 7 days ago · +2/−3

    Bold words from someone who looks like the stock photo for a pedophile.

  • WatDabney@sopuli.xyz · 8 days ago · +161/−1

    If you can be effectively censored by the banning of a site flooded with CSAM, that’s very much your problem and nobody else’s.

    • mindbleach@sh.itjust.works · 8 days ago · +9/−64

      Nothing made-up is CSAM. That is the entire point of the term “CSAM.”

      It’s like calling a horror movie murder.

      • ryper@lemmy.ca · 8 days ago (edited) · +45/−1

        It’s too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.

        • greenskye@lemmy.zip · 8 days ago · +12/−4

          I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.

          I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones less universally hated.

          I’m not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

          • Kanda@reddthat.com · 7 days ago · +1

            It was already a thing in several places. In my country it’s legal to sleep with a 16 year old, but fiction about the same thing is illegal.

          • shani66@ani.social · 7 days ago (edited) · +15

            Sure, I think it’s weird to really care about loli or furry or any other niche the way a lot of people do around here, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can’t have effective safeguards against that harm, it makes sense to restrict it legally.

            • greenskye@lemmy.zip · 8 days ago · +2

              Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.

              But generating images of adults that don’t exist? Or even clearly drawn images that aren’t even realistic? I’ve seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky.

              Like let’s take bestiality for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone’s fucked up fantasy. Yet lots of people want to make that into a thought crime.

              I’ve always thought that if there isn’t speech out there that makes you feel icky or gross then you don’t really have free speech at all. The way you keep free speech as a right necessarily requires you to sometimes fight for the right of others to say or draw or write stuff that you vehemently disagree with, but recognize as not actually causing harm to a real person.

              • CileTheSane@lemmy.ca · 7 days ago (edited) · +1

                Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.

                That is also drawing a certain arrangement of lines and colours, and an example of “free speech” that you don’t think should be absolute.

                • greenskye@lemmy.zip · 7 days ago (edited) · +1

                  Yes, sorry. My original statement was too vague. I was talking specifically about scenarios where there is no victim and the action was just a drawing/story/etc.

                  I’m not a free speech absolutist. I think that lacks nuance. There are valid reasons to restrict certain forms of speech. But I do think the concept is core to a healthy democracy and society and should be fiercely protected.

              • azertyfun@sh.itjust.works · 8 days ago · +6

                Drawings are one conversation I won’t get into.

                GenAI is vastly different, though. Those models are known to sometimes regurgitate people or things from their dataset (mostly) unaltered, like how you can get Copilot to spit out valid secrets that people accidentally committed just by typing NPM_KEY=. You can’t have any guarantee that if you ask it to generate a picture of a person, that person does not actually exist.

                • greenskye@lemmy.zip · 7 days ago · +1

                  Totally fair stance to take. I’m 100% on board with extra restrictions and scrutiny over anything that is photo realistic.

                  To me, those aren’t necessarily victimless crimes, even if the person doesn’t actually exist, because they poison the well with realistic-looking fakes. That is actively harmful to others; it just becomes another form of misinformation.

          • SorteKanin@feddit.dk · 8 days ago · +3

            I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime

            I’m sorry to break it to you, but this has been illegal for a long time and it doesn’t need to have anything to do with CSAM.

            For instance, drawing certain copyrighted material in certain contexts can be illegal.

            To go even further, numbers and maths can be illegal in the right circumstances. For instance, it may be illegal where you live to break the encryption of a certain file, depending on the file and encryption in question (e.g. DRM on copyrighted material). “Breaking the encryption of a file” essentially translates to “doing maths on a number” when you boil it down. That’s how you can end up with the concept of illegal numbers.

            • greenskye@lemmy.zip · 7 days ago · +1

              To clarify further: it’s specifically about thought crimes in scenarios where there is no victim being harmed.

              If I’m distributing copyrighted content, that’s harming the copyright holder.

              I don’t actually agree with breaking DRM being illegal either, but at least in that case, doing so is supposedly harming the copyright holder because presumably you might then distribute it, or you didn’t purchase a second copy in the format you wanted or whatever. There’s a ‘victim’ that’s being harmed.

              Doodling a dirty picture of a totally original character doing something obscene harms absolutely no one. No one was abused. No reputation (other than my own) was harmed. If I share that picture with other consenting adults in a safe fashion, again no one was harmed or had anything done to them that they didn’t agree to.

              It’s totally ridiculous to outlaw that. It’s punishing someone for having a fantasy or thought that you don’t agree with and ruining their life. And that’s an extremely easy path to expand into other thoughts you don’t like as well. And then we’re back to stuff like sodomy laws and the like.

        • mindbleach@sh.itjust.works · 8 days ago · +4/−19

          You can insist every frame of Bart Simpson’s dick in The Simpsons Movie should be as illegal as photographic evidence of child rape, but that does not make them the same thing. The entire point of the term CSAM is that it’s the actual, real evidence of child rape. It is nonsensical to use the term for any other purpose.

          • deranger@sh.itjust.works · 8 days ago (edited) · +14/−3

            The *entire point* of the term CSAM is that it’s the actual real evidence of child rape.

            You are completely wrong.

            https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/

            “CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.”

            “Any content that sexualizes or exploits a child for the viewer’s benefit” <- AI goes here.

            • mindbleach@sh.itjust.works · 8 days ago (edited) · +5/−14

              RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.

              We will inevitably develop some other term like LPEOAEWACWR, and confused idiots will inevitably misuse that to refer to drawings, and it will be the exact same shit I’m complaining about right now.

              • deranger@sh.itjust.works · 8 days ago · +5/−4

                Dude, you’re the only one who uses that strict definition. Go nuts with your course of prescriptivism, but I’m pretty sure it’s a lost cause.

          • VeganBtw@piefed.social · 8 days ago (edited) · +9/−3

            Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or depicts persons under the designated age of majority.
            […]
            Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and *computer-generated images that appear to involve them*.
            (Emphasis mine)

            https://en.wikipedia.org/wiki/Child_pornography

            • mindbleach@sh.itjust.works · 8 days ago · +3/−11

              ‘These several things are illegal, including the real thing and several made-up things.’

              Please stop misusing the term that explicitly refers to the real thing.

              ‘No.’

      • ruuster13@lemmy.zip · 8 days ago · +18/−3

        The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content”.

        Were you too busy fapping to read the article?

      • baguettefish@discuss.tchncs.de · 8 days ago (edited) · +5/−4

        AI CSAM was generated from real CSAM

        AI being able to accurately undress kids is a real issue in multiple ways

          • rainwall@piefed.social · 8 days ago (edited) · +11/−3

            It used real images of Shrek and the moon to do that. It didn’t “invent” or “imagine” either.

            The child porn it’s generating is based on literal child porn, if not itself just actual child porn.

            • mindbleach@sh.itjust.works · 8 days ago (edited) · +3/−8

              You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?

              Like combining unrelated concepts isn’t the whole fucking point?

              • mcv@lemmy.zip · 8 days ago · +12

                No, I think these billion dollar companies are incredibly sloppy about curating the content they steal to train their systems on.

                • mindbleach@sh.itjust.works · 8 days ago · +1/−1

                  True enough - but fortunately, there are approximately zero such images readily available on public websites, for obvious reasons. There certainly is not some well-labeled training set on par with all the images of Shrek.

              • CerebralHawks@lemmy.dbzer0.com · 8 days ago · +4

                Yes, and they’ve been proven to do so. Meta (Facebook) recently made the news for pirating a bunch of ebooks to train its AI.

                Anna’s Archive, a site associated with training AI, recently scraped some 99.9% of Spotify songs. They say at some point they will make torrents so the common people can download it, but for now they’re using it to teach AI to copy music. (Note: Spotify uses lower quality than other music currently available, so AA will offer nothing new if/when they ever do release these torrents.)

                So, yes, that is exactly what they’re doing. They are training their models on all the data, not just all the legal data.

                • mindbleach@sh.itjust.works · 8 days ago · +2

                  It’s big fucking news when those datasets contain, like, three JPEGs. Because even one such JPEG is an event where the FBI shows up and blasts the entire hard drive into shrapnel.

                  Y’all insisting there’s gotta be some clearly-labeled archive with a shitload of the most illegal images imaginable, in order for the robot that combines concepts to combine the concept of “child” and the concept of “naked,” are not taking yourselves seriously. You’re just shuffling cards to bolster a kneejerk feeling.

              • stray@pawb.social · 8 days ago · +5/−2

                It literally can’t combine unrelated concepts, though. Not too long ago there was the issue where one model (DALL-E?) couldn’t make a picture of a full glass of wine, because every glass of wine it had been trained on was half full; that’s generally how we prefer to photograph wine. It has no concept of “full” the way actual intelligences do, so it couldn’t connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.

    • andybytes@programming.dev (banned) · 8 days ago · +8/−1

      Maybe we are the only people that don’t f kids. Maybe this is “H”, “E”, double hockey sticks.

    • fonix232@fedia.io · 8 days ago · +16/−1

      It is when one side of the political spectrum is “against” it but keeps supporting people who think CSAM is a-okay, while the other side finds it abhorrent regardless of who’s pushing it.

    • andybytes@programming.dev (banned) · 8 days ago · +1

      I mean, the capitalists are the ones calling the shots, since the imperial core is no democracy. This is their battle; we are their dildos.