• tupalos@lemmy.world · 2 days ago

    What do you use to run it locally? If there were something that could do speech-to-text reliably, I'd consider switching to an open-source option.

    • silverlose@lemm.ee · 2 days ago (edited)

      FWIW speech to text works really well on Apple stuff.

      I’m not exactly sure what info you’re looking for, but: my gaming PC is headless and sits in a closet. I run ollama on it and connect using a client called “ChatBox”. It’s got an RTX 3060, which fits the whole model in VRAM, so it’s reasonably fast. I’ve tried the 32b model and it does work, but slowly.

      Honestly, ollama was so easy to set up that if you have any experience with computers, I recommend giving it a shot. (Could be a great excuse to get a new GPU 😉)
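
      If you'd rather skip a GUI client like ChatBox, here's a minimal sketch of talking to a headless ollama box over its default REST port (11434). The hostname and model tag are assumptions — swap in your own machine and whatever model you've pulled:

      ```python
      import json
      import urllib.request

      # Assumptions: hostname and model tag are placeholders, not from the thread.
      HOST = "http://gaming-pc.local:11434"   # your headless box
      MODEL = "deepseek-r1:14b"               # whatever `ollama pull` fetched

      def build_request(prompt: str) -> urllib.request.Request:
          """Build a non-streaming request against ollama's /api/generate endpoint."""
          payload = json.dumps(
              {"model": MODEL, "prompt": prompt, "stream": False}
          ).encode()
          return urllib.request.Request(
              f"{HOST}/api/generate",
              data=payload,
              headers={"Content-Type": "application/json"},
          )

      req = build_request("Why is the sky blue?")
      # resp = urllib.request.urlopen(req)        # uncomment with a live server
      # print(json.load(resp)["response"])
      ```

      With `"stream": False` the server returns one JSON object whose `response` field holds the full completion, which keeps the client code trivial.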