Robin Williams’ daughter Zelda says AI recreations of her dad are ‘personally disturbing’: ‘The worst bits of everything this industry is’

  • Blapoo@lemmy.ml · +143/-5 · 1 year ago

    Disturbing is an understatement. I’d call them repulsive. Relatives should be the only ones with this power, if at all.

    Sure as shit not corporations. Fuck.

    • whatwhatwhatwhat@lemmy.world · +50/-2 · 1 year ago

      Agreed, we desperately need regulations on who has the right to reproduce another person’s image/voice/likeness. I know that there will always be people on the internet who do it anyway, but international copyright laws still mostly work in spite of that, so I imagine that regulations on this type of AI would mostly work as well.

      We’re really in the Wild West of machine learning right now. It’s beautiful and terrifying all at the same time.

        • TwilightVulpine@lemmy.world · +20 · 1 year ago

          Copyright IS too strong, but paradoxically artists’ rights are too weak. Everything is aimed at boosting the profits of media companies, not at protecting the people who make the work. Now they are under threat of being replaced by AI trained on their own works, no less. Is it really worth it to defend AI if we end up with less novel human works because of it?

          • lloram239@feddit.de · +1/-7 · edited · 1 year ago

            Now they are under threat of being replaced by AI trained on their own works, no less.

            And they themselves trained on the work of other artists too. It’s just the circle of life. AI just happens to be better at learning than humans.

            Is it really worth it to defend AI if we end up with less novel human works because of it?

            AI doesn’t need defending; it will steamroll us all just by itself. We don’t really have a choice in this.

            • TwilightVulpine@lemmy.world · +7 · 1 year ago

              The “circle of life” except that it kills the artists’ careers rather than creating new ones. Even fledgling ones might find that there’s no opportunity for them because AIs are already gearing up to take entry-level jobs. However efficient AI may be at replicating the work of artists, the same could be said of a photocopier, and we have laws to define how those get to be used so that they don’t undermine creators.

              I get that AI output is not identical and doesn’t fall foul of existing laws, but the principles behind them are still important. Not only culture but even AI itself will be lesser for it if human artists are not protected, because art AIs quickly degrade when AI art is fed back into them en masse.

              Don’t forget that the kind of AI we have doesn’t do anything by itself. We don’t have sentient machines, we have very elaborate auto-complete systems. It’s not AI that is steamrolling artists, it’s companies seeking to replace artists with AIs trained on their works that are threatening them. That can’t be allowed.

              • lloram239@feddit.de · +2/-6 · 1 year ago

                The “circle of life” except that it kills the artists’ careers rather than creating new ones.

                It will kill all the ones that are stuck on old technology. Those that can make the best use of AI will prevail, at least for a little while, until AI replaces the whole media distribution chain and we’ll just have our own personal Holodeck with whatever content we want, generated on demand.

                However efficient AI may be at replicating the work of artists, the same could be said of a photocopier

                It doesn’t copy existing works and never did. Even if you explicitly instruct it to copy something existing, it will create its own original spin on the topic. It’s really no different than any artist working on commission.

                Don’t forget that the kind of AI we have doesn’t do anything by itself.

                You are free to ignore the reality of it, but that’s simply not the case. AI systems are getting filled with essentially all of human knowledge and they can remix it freely to create something new. This is the kind of stuff AI can produce just by itself, within seconds; the idea is from the AI and so is the actual image. Sentience is not necessary for creativity.

                It’s not AI that is steamrolling artists, it’s companies seeking to replace artists with AIs trained on their works that are threatening them.

                When the artists are that easy to replace, their work can’t have been all that meaningful to begin with.

                • TwilightVulpine@lemmy.world · +7 · 1 year ago

                  It’s sad to see how AI advocates strive to replicate the work of artists while being incredibly dismissive of their value. No wonder so many artists are incensed enough to want to get rid of everything AI.

                  Besides, it’s nothing new that media companies and internet content mills are willing to replace quality with whatever is cheaper and faster. To try to use that as an indictment against those artists’ worth is just… yeesh.

                  This is the kind of stuff AI can produce just by itself, within seconds; the idea is from the AI and so is the actual image.

                  You realize that even this had to be set up by human beings, right? Piping random prompts through an art AI is impressive, but it’s not intelligent. Don’t let yourself get caught up in sci-fi dreams; I made this mistake too. When you say “AI will steamroll humans” you are assigning awareness and volition to it that it doesn’t have. AIs may be filled with all human knowledge, but they don’t know anything. They simply repeat patterns we fed into them. An AI could give you a description of a computer, it could generate a picture of a computer, but it doesn’t have an understanding of one. Like I said before, it’s like a very elaborate auto-complete. If it could really understand anything, the situation would be very different, but the fact that even its most fierce advocates use it as a tool shows that it’s still lacking capabilities that humans have.

                  AI will not steamroll humans. AI-powered corporate industries, owned by flesh-and-blood people, might steamroll humans, if we let them. If you think that you will get to just enjoy a Holodeck, you are either very wealthy or you don’t realize that it’s not just artists who are at risk.

            • wanderingmagus@lemm.ee · +1 · 1 year ago

              Steamroll us? Not if I have anything to say about it. I look forward to setting condition 1SQ for strategic launch of thermonuclear weapons. Hooyah navy. If this is to be our end, then let it be SUCH an end, so as to be worthy of remembrance. I would rather this entire planet and all things upon it burn in radioactive fire than be sacrificed on the altar of technology.

      • lloram239@feddit.de · +8 · 1 year ago

        but international copyright laws still mostly work in spite of that, so I imagine that regulations on this type of AI would mostly work as well.

        The thing is, people still don’t grasp the ease with which this will be possible, and to a large degree already is. This doesn’t need hours of training anymore: you can clone voices with three seconds of audio and faces from a single image. Simple images can be clicked together in seconds with zero effort. Give it a few more years and video can be created with equal ease.

        You can regulate commercial use of somebody’s likeness, which largely already is the case, but people doing it for fun is unstoppable. This stuff is here today and it will get a whole lot more powerful going forward.

        • vidarh@lemmy.world · +2 · 1 year ago

          Just a few years back, Vernor Vinge’s sci-fi novels still seemed reasonably futuristic in how they dealt with the issue of fakes, with several bits where the resolution of the imagery was a factor in being able to verify with sufficient certainty that you were talking to the right person. Now that notion already seems dated, and certainly not enough for a setting far into the future.

          (At least they still don’t seem as dated as Johnny Mnemonic’s plot of erasing a chunk of your memories to transport an amount of data that would be easier, and less painful, to fit in your head by stuffing a microSD card up your nose.)

      • _number8_@lemmy.world · +5/-1 · 1 year ago

        yeah i don’t think it should be legislated against, especially for private use [people will always work around it anyway], but using it for profit is really, viscerally wrong

        • TwilightVulpine@lemmy.world · +6 · 1 year ago

          You know I’m not generally a defender of intellectual property, but I don’t think in this case “not legislating because people will work around it” is a good idea. Or ever, really. It’s because people will try to work around laws to take advantage of people that laws need to be updated.

          It’s not just about celebrities, or even just about respect towards dead people. In this case, what if somebody takes the voice of a family member of yours to scam your family or harass them? This technology can lead to unprecedented forms of abuse.

          In light of that, I can’t even mourn the loss of being able to make an AI Robin Williams talk to you just because it’s fun.

      • banneryear1868@lemmy.world · +5/-1 · edited · 1 year ago

        IMO people doing it on their own for fun/expression is different than corporations doing it for profit, and there’s no real way to stop that. I think if famous AI constructs become part of big media productions, it will come with a constructed moral justification. The system will basically internalize and commodify the repulsion to itself exploiting the likeness of dead (or alive) actors. This could be media that blurs the line and purports to ask “deep questions” about exploiting people, while exploiting people as a sort of intentional irony. Or it will be more of a moral appeal to sentimentality: “in honor of their legacy we are exploiting their image, some proceeds will support causes they cared about, we are doing this to spread awareness, the issues they represent are too important, they would have loved this project, we’ve worked closely with their estate.” Eventually there’s going to be a film like this, complete with teary-eyed behind-the-scenes interviews about how emotional it was to reproduce the likeness of the actor and what an honor it was. As soon as the moral justification can be made and the actor’s image can be constructed just well enough, people will go see it so they can comment on what they thought about it and take part in the cultural moment.

      • assassin_aragorn@lemmy.world · +3 · 1 year ago

        We need something like the fair use doctrine coupled with identity rights.

        If you want to use X’s voice and likeness in something, you have to purchase that privilege from X or X’s estate, and they can tell you to pay them massive fees or to fuck off.

        Fair use would be exclusively for comedy, but it would still face regulation. There are plenty of hilarious TikToks that use AI to make characters say stupid shit, and we can find a way to protect voice actors and creators without stifling creativity. Fair use would still require the person’s permission; you just wouldn’t need to pay to use it for such a minor thing – a meme of Mickey Mouse saying fuck, for example.

        At the end of the day though, people need to hold the exclusive and ultimate right to how their likeness and voice are used, and they need to be able to shut down anything they deem unacceptable. Too many people are more concerned with what is possible than with whether they’re acting like an asshole. It’s just common kindness to ask someone if you can use their voice for something, and to respect their wishes if they don’t want it.

        I don’t know if this is a hot take or not, but I’ll stand by it either way – using AI to emulate someone without their permission is a fundamental violation of their rights and privacy. If OpenAI or whoever wants to claim that makes their product unusable, tough fucking luck. Every technology has faced regulations to maintain our rights, and if a company can’t survive without running unregulated, it deserves to die.

    • Jaded@lemmy.dbzer0.com · +16/-15 · 1 year ago

      What about the third option, everyone gets to have the power?

      I’ve seen what Marvin Gaye’s and Conan Doyle’s relatives have done with the power. Dump it in the creative commons. Nobody should own the tonalities of a voice anyway; there would quickly be none left.

      • Puzzle_Sluts_4Ever@lemmy.world · +26/-6 · 1 year ago

        Considering the internet is already a hellscape of deepfake porn, let’s not take the libertarian approach to this, 'kay?

        Also, there are two major issues at hand that you are conflating.

        People aren’t doing AI recreations of Robin Williams because they love the way he said “zucchini”. They are doing it because of the novelty of hearing Robin perform their material or making him say “Happy Birthday Fred” or “Jewish Space Lizards Control Kansas” or whatever. Much like with deepfake porn, the appeal is using someone against their will for your own pleasure.

        The other aspect, and what the SAG and WGA strikes have been about (and which Robin famously preempted over twenty years ago), is training data. It is the idea of using past footage and performances to make a super actor (similar to what Square tried with FF The Spirits Within). So you might have Tom Cruise’s gait coupled with Ryan Reynolds’s chin and Hugh Jackman’s nipples and so forth. And, that is still a huge mess.

          • Puzzle_Sluts_4Ever@lemmy.world · +11/-5 · 1 year ago

            If your bad faith requirement is complete eradication, sure.

            If the goal is to vastly diminish the amount of content out there by preventing monetization and providing a legal means to pull said content? As well as to vilify the concept? Then yeah, it works.

              • Puzzle_Sluts_4Ever@lemmy.world · +4/-3 · 1 year ago

                Just to check: Vilifying deepfake porn and child porn is “not a positive moral outcome”?

                Holy shit. Most libertarians at least say the quiet part quietly.

                • zurneyor@lemmy.dbzer0.com · +1 · 1 year ago

                  Deepfake porn is certainly debatable. Are you against rule 34 of celebrities? Of photoshopping celebrity images to make them nude? Deepfaking is just extending that idea, and if it gets popular enough no one will take nude leaks seriously anymore.

                  Child porn you definitely would want to be faked. So long as it’s faked, real children aren’t being hurt.

          • wanderingmagus@lemm.ee · +4/-1 · 1 year ago

            Prohibition of CSAM seems to be universally accepted as a thing we should keep doing. What say you to that?

      • brsrklf@jlai.lu · +4 · 1 year ago

        In the context of close relatives being very disturbed by what is made with the person’s image, I really don’t think legally allowing absolutely everyone to do as they please with it will help.

  • OprahsedCreature@lemmy.ml · +56/-4 · 1 year ago

    Capitalism literally Weekend at Bernie’s-ing the corpse of Robin Williams for profit.

    This is fine

  • Case@unilem.org · +39/-1 · 1 year ago

    Imagine losing your father in a tragic fashion, only for Hollywood execs to make a marketable facsimile of his appearance and voice. If they could store his corpse and make it dance like a marionette, they would.

    Talk about retraumatizing the poor lady.

  • AllonzeeLV@lemmy.world · +43/-8 · edited · 1 year ago

    Hate it all you want. There’s a buck to be made by our owners, so it will proceed.

    Humanity at large is literally letting humanity’s owner class destroy our species’ only habitat, Earth, in the name of further growing their ego scores in the form of short term profit.

    Who gives a shit about them stealing a dead celebrity’s voice in the face of that? The hyper-rich stealing IP from the regular rich is wrong and should be illegal, but is clearly pretty far down the totem pole. Let’s say we put all our effort into stopping them from doing that and win. We’re still terraforming the planet to be less hospitable to human life, Zelda Williams included.

    Priorities, can we have them? And no we can’t “do both,” because we have had no success stopping the owner class from doing anything that hurts others to further enrich themselves. I’m for putting all our effort into our species still being able to feed itself and having enough fresh water.

    • daemoz@lemmy.world · +3/-1 · 1 year ago

      Extremely anti-post-modern-organic bias you seem to have. If we don’t fill space with plastic and heat it enough, then HOW exactly do you propose we encourage establishing an entire Carbon-Polyethylene-based evolutionary tree?? 🌳

  • scarabic@lemmy.world · +33/-11 · 1 year ago

    This is such a random thought and I don’t mean to conclude anything by it:

    I’ll bet people felt this way about the very first audio recordings.

    How creepy to hear your sibling’s voice when that sibling is not even in the room!

    …and moving pictures:

    It looks like your mother is right there but she’s been dead for 10 years! Gah!

      • scarabic@lemmy.world · +2 · 1 year ago

        You tend to consent to a photo or video

        I’m not sure what you mean. There’s nothing more consensual about photography necessarily. Paparazzi are a thing, for example.

        I think the real difference here is that we understand video and audio recordings, we even have some laws governing when you can record someone. So we are comfortable with those technologies. Above all, we’re used to them.

        AI isn’t the exact same thing but I think the main source of discomfort is its newness and mysteriousness. We don’t have laws governing it. We don’t understand it very well. This makes it creepy.

        • Julius_Seizures@lemmy.world · +6/-1 · 1 year ago

          I think consent is the most important discussion here. The people that continue to profit (monetarily or otherwise) off dead creators are often looked down upon, e.g. Brian Herbert’s Dune continuation, the continuation of SpongeBob (and its spin-offs) after Stephen Hillenburg’s death, etc. Terry Pratchett had it in his will that a steamroller be used to destroy all his unfinished works, as he knew that otherwise they would likely be used for profit after his death, without him.

          I’m a proponent of the recent advances in machine learning, I use machine learning in my field and I write and use models for hobby level things. I’m also fully a proponent of using these things ethically, and consent here is the most important thing.

          If I created a doctored photograph of Robin Williams (even doing something innocuous) that was clearly not something he did, and plastered it around the internet, it would be in bad taste. If Robin Williams consented to people doing that, then sure, whatever, it’s nbd. Photographs and recordings should be used with consent, and things like the paparazzi taking non-consensual photos are not looked upon as particularly ethical endeavors.

          • scarabic@lemmy.world · +2/-2 · 1 year ago

            Let’s say one of your parents dies and years later you stumble upon a voice recording that your sibling made of them. Your heart would probably be warmed just to hear their voice. That wouldn’t change if you realized that your sibling had recorded them from behind without their knowledge. You’d still be comfortable with that representation of your parent.

            Another example: there are services which can take an old photo of a dead relative and turn it into a sort of “Harry Potter moving picture” kind of deal, using deepfake technology. Most people are amazed and touched in a positive way when they see these.

            I think someday, when AI is much more mundane to us, someone out there will take old voice recordings of their long-lost father, train an AI bot on them, and present it as a gift to their sibling. That sibling will have a conversation with it, and their eyes will mist up, and they’ll say thank you, this is so touching and wonderful.

            It’s merely a question of being comfortable with the technology itself.

    • Something_Complex@lemmy.world · +21 · edited · 1 year ago

      To be honest, it is a bit creepy, because it isn’t actually coming from Robin Williams’ personality.

      Hearing a message your brother left you is one thing. But listening to him talk when someone else is faking his voice and saying whatever they want is another.

      That’s the only difference: those video recordings were of your brother.

      These deep-fake things are someone else speaking in your brother’s voice. A corporation using your brother to sell products and services.

      Nothing to do with him and his personality

      • Comment105@lemm.ee · +12/-1 · 1 year ago

        Yeah, there’s a significant difference between a recording and generating.

        • MimicJar@lemmy.world · +1/-2 · 1 year ago

          What are your thoughts on things like Photoshop? As an example https://old.reddit.com/r/Damnthatsinteresting/comments/psqkz6/how_you_remove_a_ex_from_your_family_picture_with/

          That is generating an image, and in turn an event that didn’t happen.

          There are also cases of “repairing” old photos. Sometimes an old black-and-white photo is torn or faded, but we can restore it; we can even add back things that aren’t there.

          After someone passes away you often hear people say “I wish I could hear their voice again” or “I wish I could have one last conversation”.

          I’m not going to deny that there are A LOT of terrible things that could be done with this technology. I just wonder what positives exist and how we might improve things.

          • Comment105@lemm.ee · +2 · 1 year ago

            I guess the two categories didn’t cover the whole space of possibilities.

            But many cases are clear-cut.

        • scarabic@lemmy.world · +2/-5 · 1 year ago

          I think the difference is that people understand recordings now and do not understand genAI yet. Therefore the former is “fine” and the latter is “creepy.” You could make many arguments about recordings that someone from the 1800s would be concerned about: Taking my words out of context. Editing my words to change what I said. Am I accountable for what I said when it is heard as a recording? Is my permission required for recording?

          As you’ll notice, we even have laws for some of this now. Those no doubt came from people flipping their shit about the new technology, just as we’re doing now.

    • eumesmo@lemmings.world · +5/-1 · 1 year ago

      It’s not just a matter of discomfort with something new, but with something highly dangerous. Deepfakes have several bad and disturbing use cases, like identity theft, sexual exploitation, marketing abuse, political manipulation, etc. In fact, I find it hard to think of a significant good use for such technology.

      • stevedidWHAT@lemmy.world · +1 · edited · 1 year ago

        OP’s point remains: this is exactly what everyone said about photos, then videos, then video with sound, etc.

        You’ve always been told you can’t believe what you see on the internet; now that’s even more true.

        There are ways we process and handle new tech, there’s a grace period to figure out issues and solutions.

        Part of the problem is regressionist ideals holding everyone back from making real changes. Being able to generate nudies of your crush is the tip of the iceberg; it demonstrates our ability to create teachable models that perform well and reliably reconstruct images from noise. There are lots of applications, but ultimately making images is just art, and it’s sorta hard to break out of that sphere easily.

  • WuTang @lemmy.ninja · +13/-4 · 1 year ago

    You don’t need to be the son or daughter of a celebrity; just think about it for 5 freaking seconds.

  • banneryear1868@lemmy.world · +5/-1 · edited · 1 year ago

    Seeing Tupac’s hologram perform to a cheering crowd was when it crossed the line into creepy for me. A lot of people seem turned off by this at least, and it’s really exposing how these studios think of people. I think this could turn into a thing where the studios really push these personality constructs, while many actors and the public will be morally opposed to it. So the studios might have to appeal to a moral justification for when it’s appropriate to use these AI constructs, like, “we really wanted to honor Robin with this project that we felt carried on his legacy, and a percentage of proceeds will go to the good foundation to help others who suffer like Robin did, so seeing Robin’s personality construct perform for you is really a moral duty and helps make the world a better place.” They’ll also use AI anywhere it isn’t noticeable to the viewer, for the cost savings and to avoid the negative reaction to it.

    I think there will be studios producing fully AI-driven content though. It’ll be low-budget and corny, a diarrhea level of quantity and quality. Not unlike those campy dramatized skits on YouTube now where it’s like, “homeless girl steals a rich man’s heart, will make you cry.” They’ll be these ultra-niche AI-generated shorts that are a mix of advertisement and generic story arc. The AI spam is already pretty hilarious: “Elon has an invention that can make anyone a millionaire in 30 days.” I think we’re about to witness a deluge of content so shitty that no present-day comparison could describe it.

    • BillMurray@lemmy.ca · +3 · edited · 1 year ago

      Hold on, 50 cent had a hologram? Wouldn’t it be easier and cheaper to just hire him, since he’s still alive… when was this?

      edit: see OP changed his comment from 50 cent to Tupac 🙄

  • ShittyRedditWasBetter@lemmy.world · +6/-3 · 1 year ago

    Get used to it. Best case stuff like this gets covered commercially. Nobody is going to be able to regulate what individuals can do.

  • _number8_@lemmy.world · +9/-7 · 1 year ago

    imaginary scenario:

    you love good will hunting, you’re going thru a tough time, and you use AI to have robin williams say something gentle and therapist-y that directly applies to you and your situation – is this wrong?

    • Naz@sh.itjust.works · +13/-3 · 1 year ago

      I’ve asked extremely high end AI questions on ethics of this nature and after thinking for exactly 14.7 seconds it responded with:

      • The ethics of generating images, sound, or other representations of real people is considered no different than active imagination when done for fun and in privacy.

      • However, spreading those images to others, without the original person’s consent is considered a form of invasion of privacy, impersonation, and is therefore unethical.

      Basically, you’re fine with imagining Robin Williams talking to you, but if you record that and share it with others/disseminate the content, then it becomes unethical.

      • TwilightVulpine@lemmy.world · +7/-5 · 1 year ago

        • The ethics of generating images, sound, or other representations of real people is considered no different than active imagination when done for fun and in privacy.

        That doesn’t sound right at all. Copying and processing somebody’s works for the sake of creating a replica is completely different from imagining it yourself. Depending on how it’s done, even pretending that it’s being done solely for yourself is incorrect. Many AI-based services take feedback from what their users do, even if the users don’t actively share it.

        Just as looking at something, memorizing it and imitating it is allowed while taking a picture may not be, AI would not necessarily get the right to engage with media the way people do. It’s not an independent actor with personal rights. It’s not an extension of the user. It’s a tool.

        Then again, I shouldn’t be surprised that an AI used and trained by AI users describes its own use as basically a natural right.

        • JackbyDev@programming.dev · +3 · 1 year ago

          Please see the second point. Essentially you cannot commit copyright violation if you don’t distribute anything. Same concept.

          • TwilightVulpine@lemmy.world · +2 · 1 year ago

            These AIs are not being produced by the rights owners so it seems unlikely that they are being built without unauthorized distribution.

            • JackbyDev@programming.dev · +2 · 1 year ago

              I get your point, but for the purpose of the thought exercise I think having built the model yourself gets closer to the crux of “I am interested in making an image of a dead celebrity say nice things to me,” especially since the ethics of building and sharing models of copyrighted content is a totally different question with its own can of worms.

    • Empricorn@feddit.nl · +4 · 1 year ago

      I wouldn’t apply morality, but I bet it isn’t healthy. I would urge this theoretical person to consult with an actual licensed therapist.