• Amaltheamannen@lemmy.ml · 16 points · 11 months ago

      Check out /r/localllama. Preferably you want an Nvidia GPU with >= 24 GB of VRAM, but it also works on a CPU with loads of normal RAM if you can wait a minute or two for a lengthy answer. Loads of models to choose from, many with no censorship at all. They won’t be as good as GPT-4, but many are close to GPT-3.
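
      If you want a sense of how simple the CPU route can be, here’s a rough sketch using the llama-cpp-python bindings (the model path, thread count, and prompt are just placeholders; swap in whatever quantized model your RAM can hold):

      ```python
      # Rough sketch: CPU-only inference with llama-cpp-python
      # (pip install llama-cpp-python). The model path below is a placeholder.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./models/some-7b-model.Q4_K_M.gguf",  # placeholder path
          n_ctx=2048,      # context window
          n_threads=8,     # CPU threads, tune to your machine
          n_gpu_layers=0,  # raise this to offload layers to a GPU if you have one
      )

      out = llm("Q: Name three uses for a local LLM. A:", max_tokens=128)
      print(out["choices"][0]["text"])
      ```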

    • DarkThoughts@kbin.social · 3 points · 11 months ago

      I think KoboldAI runs locally, but like many current AI tools it’s a pain in the ass to install, especially if you’re on Linux, and especially if you’re using AMD GPUs. I wonder if we’ll see specialized AI cards to slot into our PCIe slots or something; there aren’t a whole lot of necessary options to fill them nowadays anyway. I’d also be interested in local AI voice changers, maybe even packaged like a Roland VT-4 voice transformer that sits between your mic and whatever other audio interface you might be using, where you just throw the trained voice models onto the device and it does all the real-time computing for you.

      I’m sure things will get more refined over the next few years though.

      • off_brand_@beehaw.org · 2 points · 11 months ago

        It would actually be pretty cool to see TPUs you can just plug in. They come stock in a lot of Google products now, I think.

    • CanadaPlus@lemmy.sdf.org · 2 points · 11 months ago

      By design, because they don’t want some guy in a basement launching Skynet.

      I have to agree. I trust a handful of big shops, some of which could actually be killed off by ethics people against the wishes of investors, far more than the entire internet. It still might not be enough, but there’s no applying the brakes at all if anyone can take the next step.

    • DavidGarcia@feddit.nl · 1 point · 11 months ago

      It won’t take long until cheap special-purpose chips hit the market; then you’ll have your offline model. There are already models that run on consumer hardware, but at the moment it’s for enthusiasts and not quite the same quality (though almost). But if you want to spend thousands on a PC that can handle the largest models, go ahead.