I am talking about this particular model:

https://huggingface.co/TheBloke/goliath-120b-GGUF

I specifically use: goliath-120b.Q4_K_M.gguf

I can run it on runpod.io on this A100 instance at a “humane” speed, but it is still way too slow for generating long-form text.

https://preview.redd.it/fz28iycv860c1.png?width=350&format=png&auto=webp&s=cd034b6fb6fe80f209f5e6d5278206fd714a1b10

These are my settings in text-generation-webui:

https://preview.redd.it/vw53pc33960c1.png?width=833&format=png&auto=webp&s=0fccbeac0994447cf7b7462f65d79f2e8f8f1969

Any advice? Thanks

    • Worldly-Mistake-8147@alien.topB · 10 months ago

      I’m sorry for a little side-track, but how much context are you able to squeeze into your 3 GPUs with Goliath’s 4-bit quant?
      I’m considering adding another 3090 to my own double-GPU setup just to run this model.

      • panchovix@alien.topB · 10 months ago

        I tested 4K and it worked fine at 4.5 bpw. The max will probably be about 6K. I didn’t use the 8-bit cache.

        That said, 4.5 bpw is kind of overkill; ~4.12 bpw is roughly equivalent to 4-bit 128g GPTQ, and that would let you use a lot more context.
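
        As a rough back-of-the-envelope (assuming Goliath is ~118B parameters and the setup is 3x 24 GB cards, i.e. ~72 GB of VRAM total):

            118e9 * 4.5 bits / 8  ≈ 66 GB of weights → ~6 GB left for KV cache and activations
            118e9 * 4.12 bits / 8 ≈ 61 GB of weights → ~11 GB left, hence room for noticeably more context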

  • whtne047htnb@alien.topB · 10 months ago

    The GGUF one has 140 layers, more than what the textgen UI supports (128). So the slowness may be because you are using CPU for some layers (check your terminal output when loading the model). But you can manually change the source code and set the max value of the n_gpu_layers slider to a higher value (just grep for it).
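
    For example, text-generation-webui builds that slider with Gradio. A minimal sketch of the kind of change meant here — the actual file and variable names may differ between versions, so grep the repo for "n_gpu_layers":

        import gradio as gr

        # Hypothetical slider definition, roughly what the UI code looks like;
        # raising the hard-coded maximum lets all 140 GGUF layers be offloaded.
        n_gpu_layers = gr.Slider(
            label="n-gpu-layers",
            minimum=0,
            maximum=256,  # was 128
            value=0,
        )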

    • kruk2@alien.topB · 10 months ago

      Or open the UI, go to the model page, right-click on the layers slider -> Inspect Element,
      and update the max value of the input field from 128 to 256.

        • MINIMAN10001@alien.topB · 10 months ago

          I mean, it makes sense. The values were simply chosen as a reasonable range at the time.

          There was nothing hard-coded about them; they were simply a range of values that had been set for the UI.

          It certainly is interesting though.