Is this accurate?

  • ReturningTarzan@alien.top · 10 months ago

    I’m a little surprised by the mention of chatcode.py, which was merged into chat.py almost two months ago. Also, it doesn’t really require flash-attn-2 to run “properly”; it just runs a little better that way. But it’s perfectly usable without it.

    Great article, though. Thanks. :)

    • mlabonne@alien.top · 10 months ago

      Thanks for your excellent library! That makes sense, because I started writing this article about two months ago (chatcode.py is still mentioned in the README.md, by the way). I had very low throughput using ExLlamaV2 without flash-attn-2. Do you know if that’s still the case? I’ve updated these two points; thanks for your feedback.
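
The point raised in the thread about flash-attn-2 being optional is easy to check in practice. The sketch below is not part of the original exchange; it simply tests whether the flash_attn package is importable, which is a reasonable proxy (under the assumption that ExLlamaV2 picks the faster attention path up automatically when the package is installed, and otherwise falls back to standard attention).

```python
# Hedged sketch: detect whether flash-attn-2 is installed in the current
# environment. This only checks importability; it does not verify that the
# installed build matches your CUDA/PyTorch versions.
import importlib.util


def flash_attn_available() -> bool:
    """Return True if the flash_attn package (flash-attn-2) is importable."""
    return importlib.util.find_spec("flash_attn") is not None


if __name__ == "__main__":
    if flash_attn_available():
        print("flash-attn-2 found: the faster attention kernels should be used.")
    else:
        print("flash-attn-2 not found: falling back to standard attention "
              "(slower, but still usable).")
```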
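On the throughput question, a small timing harness makes the comparison concrete. This is a hedged sketch, not ExLlamaV2 code: `generate` is a hypothetical stand-in for whatever generation call you actually use (for example, the examples/chat.py script or the library's generator API), and the numbers are only meaningful when the same model and prompt are run once with flash-attn-2 installed and once without.

```python
# Hedged sketch: a tiny tokens-per-second harness for comparing runs with and
# without flash-attn-2 installed. `generate` is a hypothetical callable that
# produces `n_tokens` new tokens from `prompt`; swap in your real call.
import time
from typing import Callable


def tokens_per_second(generate: Callable[[str, int], str],
                      prompt: str,
                      n_tokens: int = 128,
                      n_runs: int = 3) -> float:
    """Average generation throughput over a few runs (new tokens per second)."""
    # One untimed warm-up run so CUDA kernels and caches are initialised.
    generate(prompt, n_tokens)
    total = 0.0
    for _ in range(n_runs):
        start = time.perf_counter()
        generate(prompt, n_tokens)
        total += time.perf_counter() - start
    return n_tokens * n_runs / total


# Usage: run this in an environment with flash-attn-2 installed, then in one
# without it, and compare the two results.
# print(tokens_per_second(my_generate, "Hello, world", n_tokens=128))
```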