ihmoguy@alien.top to Data Hoarder@selfhosted.forum · English · 1 year ago
It is time to keep hoarding AI models as Chinese censorship hits NYC-based Huggingface, the biggest AI library (www.404media.co)
cross-posted to: localllama@poweruser.forum
Masark@alien.top · English · 1 year ago
You simply download the file and keep it. We’re talking about stuff you can run on your own computer, not services like ChatGPT.
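As a minimal sketch of what "download and keep it" looks like in practice, assuming you have the huggingface_hub package installed (the repo ID and folder name below are just examples):

```python
from huggingface_hub import snapshot_download

# Pulls every file in the model repo (weights, tokenizer, config)
# into a local folder you control, so it survives any takedown upstream.
local_dir = snapshot_download(
    repo_id="mistralai/Mistral-7B-v0.1",   # example repo ID; swap in the model you want to archive
    local_dir="./mistral-7b-backup",       # example destination folder
)
print(f"Model archived at: {local_dir}")
```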
Highly variable. It depends on model size and format. For LLMs, the range is from under 1 GB up to a current maximum of roughly 240 GB.
It’s a big file of numbers (“weights” or “parameters”), either integers or floating point depending on the format.
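A rough back-of-the-envelope estimate of file size is just parameter count times bytes per weight. A quick sketch (numbers are illustrative; real files add tokenizer, config, and format overhead):

```python
def approx_size_gb(n_params_billion: float, bits_per_weight: int) -> float:
    """Approximate on-disk size of a model: parameters x bits per weight."""
    total_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9  # decimal GB

print(approx_size_gb(7, 16))   # ~14 GB for a 7B model stored as fp16
print(approx_size_gb(70, 4))   # ~35 GB for a 70B model quantized to 4-bit integers
```

This is why quantized (integer) formats are so popular for hoarding and local use: the same model shrinks to a quarter or less of its original floating-point size.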