ihmoguy@alien.top to Data Hoarder@selfhosted.forum · English · 2 years ago
It is time to keep hoarding AI models as Chinese censorship hits NYC-based Huggingface, the biggest AI library
www.404media.co
Cross-posted to: localllama@poweruser.forum
Masark@alien.top · English · 2 years ago
You simply download the file and keep it. We’re talking about models you can run on your own computer, not hosted services like ChatGPT.
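A minimal sketch of what “download and keep” can look like, using the huggingface_hub Python package; the repo ID and target folder here are just examples, not anything from the article:

```python
# pip install huggingface_hub
from huggingface_hub import snapshot_download

# Example repo ID and local folder -- swap in whatever model you want to archive.
snapshot_download(
    repo_id="mistralai/Mistral-7B-Instruct-v0.2",
    local_dir="./hoard/mistral-7b-instruct-v0.2",
)
```

This pulls every file in the model repo (weights, tokenizer, config) into a plain folder you can back up like any other data.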
Highly variable; it depends on model size and format. For LLMs, the range runs from under 1 GB up to a current maximum of roughly 240 GB.
It’s a big file of numbers (“weights” or “parameters”), stored as either integers or floating-point values depending on the format.
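If you want to see those numbers for yourself, here is a rough sketch assuming the model ships as .safetensors shards (the file path below is hypothetical and framework="pt" requires PyTorch to be installed):

```python
# pip install safetensors torch
from safetensors import safe_open

# Hypothetical path -- point this at any downloaded .safetensors shard.
path = "./hoard/mistral-7b-instruct-v0.2/model-00001-of-00003.safetensors"

with safe_open(path, framework="pt") as f:
    for name in f.keys():
        tensor = f.get_tensor(name)
        # Each entry is a named array of weights; dtype shows whether it is
        # stored as floating point (e.g. bfloat16) or integer (quantized).
        print(name, tuple(tensor.shape), tensor.dtype)
```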