• DarkThoughts@kbin.social · 1 year ago

    I think KoboldAI runs locally, but like many current AI tools it’s a pain in the ass to install, especially if you’re on Linux, and especially if you’re using an AMD GPU. I wonder if we’ll see specialized AI accelerator cards to slot into our PCIe slots or something; there aren’t a whole lot of other cards you need to fill them with nowadays anyway. I’d also be interested in local AI voice changers. Maybe even packaged like a Roland VT-4 voice transformer that sits between your mic and whatever other audio interface you might be using, where you just load the trained voice models onto the device and it does all the real-time computing for you.
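    For what it’s worth, the usual stumbling block on AMD isn’t KoboldAI itself but getting a ROCm build of PyTorch that actually sees the GPU. A minimal sanity check, assuming you installed PyTorch from the ROCm wheels, looks something like this (on ROCm builds the GPU is still exposed through the torch.cuda API):

    ```python
    # Quick check that the installed PyTorch is a ROCm/HIP build and can see an AMD GPU.
    import torch

    print("HIP/ROCm build:", torch.version.hip is not None)   # None on CUDA-only builds
    print("GPU available:", torch.cuda.is_available())        # ROCm devices show up under torch.cuda
    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))
    ```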

    I’m sure things will get more refined over the next few years though.

    • off_brand_@beehaw.org · 1 year ago

      It would actually be pretty cool to see TPUs you can just plug in. They come stock in a lot of Google products now, I think.
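      Plug-in USB Edge TPU sticks along those lines do already exist. A rough sketch of driving one from Python with the TensorFlow Lite runtime, assuming the Edge TPU delegate library (libedgetpu) is installed and using a placeholder model filename:

      ```python
      # Sketch: run inference on a USB Edge TPU accelerator via the TFLite runtime.
      # "model_edgetpu.tflite" is a placeholder for a model compiled for the Edge TPU.
      from tflite_runtime.interpreter import Interpreter, load_delegate

      interpreter = Interpreter(
          model_path="model_edgetpu.tflite",
          experimental_delegates=[load_delegate("libedgetpu.so.1")],  # offload ops to the TPU
      )
      interpreter.allocate_tensors()
      print(interpreter.get_input_details())  # inspect expected input shape/dtype
      ```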