Hi all,

Just curious whether anybody knows the hardware required to build a llama server that can serve multiple users at once.

Any discussion is welcome:)

  • SupplyChainNext@alien.top · 1 year ago

    Figure out the model size and speed you need, then buy 20-50 Nvidia pro GPUs (A series) plus the server cluster hardware and network infrastructure needed to make them run efficiently.

    Think in the several-hundred-thousand-dollar range. I've looked into it.
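
    To make the "figure out the size and speed you need" step concrete, here is a rough back-of-envelope sketch for estimating how many GPUs the memory alone demands. All numbers are assumptions for illustration: fp16 weights (2 bytes/param), a 70B model with 80 layers, 8 KV heads of dim 128 (grouped-query attention), a 4096-token context per user, and 80 GB cards. Real deployments also need headroom for activations, batching, and fragmentation, so treat the result as a floor.

    ```python
    import math

    def vram_needed_gb(params_b, users, ctx_len=4096, layers=80,
                       kv_heads=8, head_dim=128, bytes_per=2):
        """Estimate VRAM (GB) for model weights plus per-user KV cache.

        All architecture numbers are illustrative assumptions, not a
        specific model's real config.
        """
        weights = params_b * 1e9 * bytes_per  # fp16 weights
        # KV cache per user: 2 (K and V) * layers * heads * head_dim * ctx * bytes
        kv_per_user = 2 * layers * kv_heads * head_dim * ctx_len * bytes_per
        return (weights + users * kv_per_user) / 1e9

    def gpus_needed(params_b, users, gpu_gb=80):
        """Minimum GPU count for memory alone (no activation headroom)."""
        return math.ceil(vram_needed_gb(params_b, users) / gpu_gb)

    # Example: a 70B model serving 50 concurrent users.
    total = vram_needed_gb(70, 50)   # ~140 GB weights + ~67 GB KV cache
    print(f"{total:.0f} GB total -> {gpus_needed(70, 50)} x 80 GB GPUs minimum")
    ```

    Memory is only half the story: throughput (tokens/sec under batched decoding) is usually what pushes the count from a handful of cards toward the 20-50 range the comment above mentions.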