• stevedidWHAT@lemmy.world · 1 year ago

    Yeah this makes a lot of sense considering the vastness of language and its imperfections (English, I’m mostly looking at you, ya inbred fuck)

    Are there any other detection techniques that you know of? What about forcing AI models to have a signature that is guaranteed to be identifiable, permanent, and unique for each tuning produced? It’d have to be not directly noticeable, but easy to calculate, in order to prevent any “distractions” for the users.

    • Grimy@lemmy.world · 1 year ago

      The output is pure text, so you would have to hide the signature in the response itself. On top of being useless, since most users slightly modify the text after receiving it, it would probably have a negative effect on the quality. It’s also insanely complicated to train that kind of behavior into an LLM.

      • stevedidWHAT@lemmy.world · 1 year ago

        Your implementation of my concept might be useless, but that doesn’t mean the concept is.

        One possible solution would be to look at how responses are structured, letter frequencies, etc. The flexible/ambiguous nature of natural language means you can word things in many, many different ways, which allows for some creative meta techniques to accomplish a fingerprint.
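        For what a crude version of that might look like: the sketch below compares letter-frequency profiles of two texts with cosine similarity. This is purely illustrative — real stylometric fingerprinting uses far richer features (word lengths, punctuation, n-grams), and nothing here comes from any actual detection tool.

```python
from collections import Counter
import math

def letter_freqs(text: str) -> dict:
    # Normalized frequency of each letter a-z; everything else is ignored.
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters) or 1
    return {c: counts.get(c, 0) / total for c in "abcdefghijklmnopqrstuvwxyz"}

def profile_similarity(f1: dict, f2: dict) -> float:
    # Cosine similarity between two frequency profiles;
    # values near 1.0 mean the letter distributions look alike.
    dot = sum(f1[c] * f2[c] for c in f1)
    n1 = math.sqrt(sum(v * v for v in f1.values()))
    n2 = math.sqrt(sum(v * v for v in f2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0
```

        In practice a single 26-bin histogram is far too coarse to distinguish one model (or one human) from another; it only shows the shape of the idea.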

        • Terrasque@infosec.pub · 1 year ago

          It is a valid idea, and not impossible. When generating text, a language model gives a list of possible tokens… or more correctly, it gives a weight to every possible token, where most would have near-zero weight. Then there are multiple ways to pick the next token, from always picking the top one, to selecting randomly from the top X tokens, to mirostat, and so on. You could probably do some extra weighting to embed a sort of signature. At some quality loss

    • bioemerl@kbin.social · 1 year ago

      forcing AI models to have a signature that is guaranteed to be identifiable, permanent, and unique for each tuning produced

      Either AI remains entirely in the hands of fucks like OpenAI, or this is impossible and easily removed. AI should be a free, common-use tool, not an extension of corporate control.

      • stevedidWHAT@lemmy.world · 1 year ago

        Agreed, such power should belong to everyone or to no one at all. Even Oppenheimer knew: once the cat’s out of the bag…

        • bioemerl@kbin.social · 1 year ago

          It’s no different than owning your computer. Something as absolutely central and productivity-boosting as artificial intelligence should not be kept in the hands of the few.

          The only way that it could be is through government intervention; you don’t need to be an anarchist to be against an AI monopoly.