Was wondering if there's any way to use a bunch of old equipment like this to build an at-home crunch center for running your own LLM, and whether it would be worth it.

  • croholdr@alien.topB · 1 year ago
    I tried it. Got something like 1.2 tokens/s inference on Llama 70B with a mix of cards (four of them 1080s). The process would crash occasionally. Ideally every card would have the same amount of VRAM.

    Going to try it with 1660 Tis next. I think that may be the 'sweet spot' of power to price to performance.
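
    A rough sketch of the sizing math behind the VRAM point above. The per-weight sizes and the ~20% overhead factor are assumptions for illustration, not measured values for any particular runtime:

    ```python
    # Back-of-envelope VRAM estimate for splitting a quantized model
    # across several old GPUs. Assumed numbers: quantized weight size in
    # bits, plus ~20% overhead for KV cache and activations.

    def model_vram_gb(n_params_b, bits_per_weight, overhead=1.2):
        """Approximate VRAM (GB) needed for n_params_b billion parameters."""
        return n_params_b * (bits_per_weight / 8) * overhead

    def fits(cards_gb, n_params_b, bits_per_weight):
        """True if the cards' combined VRAM covers the estimate."""
        return sum(cards_gb) >= model_vram_gb(n_params_b, bits_per_weight)

    # Four GTX 1080s (8 GB each) vs. a 70B model quantized to 4 bits:
    cards = [8, 8, 8, 8]                 # 32 GB total
    print(model_vram_gb(70, 4))          # ~42 GB needed
    print(fits(cards, 70, 4))            # False -> expect offloading, slow tokens/s
    ```

    On this estimate, four 8 GB cards fall short for a 4-bit 70B model, which lines up with the slow, crash-prone runs described above.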