NDBellisario@alien.top to LocalLLaMA@poweruser.forum • What are your thoughts on the future of LLMs running mobile? • 1 year ago
Latency is one advantage over the internet. Any model that runs locally doesn't need a round trip to a datacenter. This of course depends on how much compute power the device has.
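As a rough illustration (a minimal sketch, not from the original comment): timing a local llama.cpp generation next to a hosted API call shows where the network round trip disappears. The model path, endpoint URL, and the use of llama-cpp-python are all assumptions here, just one of many ways to run a model on-device.

```python
# Rough latency comparison: local inference vs. a hosted API round trip.
# Assumptions: llama-cpp-python is installed, a GGUF model exists at MODEL_PATH,
# and API_URL points at some hosted completion endpoint (placeholder values).
import time
import requests
from llama_cpp import Llama

MODEL_PATH = "models/llama-3-8b-instruct.Q4_K_M.gguf"  # placeholder path
API_URL = "https://example.com/v1/completions"          # placeholder endpoint
PROMPT = "Explain why on-device inference avoids network latency, in one sentence."

# Local: no network round trip; speed depends entirely on the device's compute.
llm = Llama(model_path=MODEL_PATH, n_ctx=2048, verbose=False)
start = time.perf_counter()
llm(PROMPT, max_tokens=64)
print(f"local generation: {time.perf_counter() - start:.2f}s")

# Remote: the same request also pays for the trip to the datacenter and back.
start = time.perf_counter()
requests.post(API_URL, json={"prompt": PROMPT, "max_tokens": 64}, timeout=60)
print(f"remote round trip: {time.perf_counter() - start:.2f}s")
```

On a phone-class chip the local number can easily be the slower of the two, which is the "depends on compute power" part.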