Hi all,
Just curious if anybody knows the power required to make a llama server which can serve multiple users at once.
Any discussion is welcome:)
Unless you're doing this as a business, it's going to be massively cost-prohibitive: hundreds of thousands of dollars of hardware. If it is a business, you'd better start talking to cloud vendors, because GPUs are an incredibly scarce resource right now.
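For what it's worth, if the goal is a handful of concurrent users rather than production scale, llama.cpp's `llama-server` can do this on a single machine. A rough sketch, with the model path and flag values as placeholders to adjust for your hardware:

```shell
# Serve a GGUF model to multiple concurrent clients with llama.cpp's llama-server.
# --parallel splits the context into N slots, one per in-flight request, and
# continuous batching lets the slots share the GPU between requests.
# The model file and sizes below are placeholders -- adjust for your hardware.
llama-server \
  -m ./models/llama-3-8b-instruct.Q4_K_M.gguf \
  --ctx-size 16384 \
  --parallel 4 \
  --n-gpu-layers 99 \
  --host 0.0.0.0 \
  --port 8080
```

With `--ctx-size 16384` and `--parallel 4`, each of the 4 slots gets a 4096-token context. Clients can then talk to the server's OpenAI-compatible endpoint (e.g. `/v1/chat/completions`) on port 8080.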