After being scammed into thinking her daughter had been kidnapped, an Arizona woman testified before the US Senate about the dangers of artificial intelligence technology in the hands of criminals.

  • DarkThoughts@kbin.social · 1 year ago

    Is no one questioning how the alleged kidnappers managed to create a voice profile from a random 15-year-old girl to produce such a convincing AI voice? The only source claiming this was potentially an AI scam was, in fact, just another parent:

    But another parent with her informed her police were aware of AI scams like these.

    Isn’t it more likely that dad & daughter did this and it backfired?

    • davidhun@lemmy.sdf.org · 1 year ago

      Given the prevalence of social media platforms where you post videos of yourself, it seems pretty easy to get enough voice sampling to generate a convincing clone. Depending on how much personal info she and her family members put out on social media, it’s trivial to connect all the dots to concoct a plausible scenario to scam someone.

      Now whether or not it was “just a prank, bro” from family or whomever, I don’t know.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago

        Correct, it does not take much anymore to train up a voice model, especially a hysterical-sounding one that would trick a mother. Teens post enough on social media that this could be done.

    • Stumblinbear@pawb.social · 1 year ago

      It’s pretty easy to create voice clones now. As long as you tailor the speech you want it to say and don’t have it speak for too long, it can get pretty good even with very little input.

  • carnha@lemmy.one · 1 year ago

    I’ve only been thinking about the implications of faking a celebrity’s voice - personalizing it like this makes me sick to my stomach. Had no idea it’s already that easy. I don’t think the voice would even have to be that realistic - if they’re faking a life threatening situation, my first thought isn’t going to be “Hey, their voice sounded a little off”. Absolutely horrifying.

  • Plume (She/Her)@beehaw.org · 1 year ago

    I hate everything about AI and this is not helping. It feels like we opened wide a door that we should never have touched.

  • WorseDoughnut@kbin.social · 1 year ago

    When DeStefano tried to file a police report after the ordeal, she was dismissed and told this was a “prank call”.

    Why am I not surprised.

  • pokexpert30@lemmy.pussthecat.org · 1 year ago

    Teach your grandparents about those scams. Insist on the “nowadays a computer can replicate a voice over the phone well enough that it will really sound like the real thing,” as well as the “the number that calls you will look like it’s from me. Just hang up and call me back if you receive such a call.”

  • TruthButtCharioteer@kbin.social · 1 year ago (edited)

    Soooo… holdup.

    1. Take out kidnapping insurance
    2. “Go to Mexico”
    3. Get “kidnapped”
    4. Run the scam with your fancy schmancy AI
    5. Get the payout and get “rescued”