rs137@lemmy.world to Technology@lemmy.world • Smaug-72B-v0.1: The New Open-Source LLM Roaring to the Top of the Leaderboard • 9 months ago
Llama 2 70B with 8-bit quantization takes around 80GB of VRAM, if I remember correctly. I tested it a while ago.
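That figure checks out as a back-of-the-envelope estimate: at 8 bits per weight, 70B parameters is ~70GB for the weights alone, plus runtime overhead. A rough sketch (my own arithmetic, not from the comment; the ~20% overhead factor for KV cache, activations, and framework buffers is an assumption):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 0.2) -> float:
    """Rough VRAM estimate: weight bytes plus a fixed overhead fraction.

    The 20% default overhead (KV cache, activations, buffers) is an
    assumed ballpark, not a measured value.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9  # decimal GB

# Llama 2 70B at 8-bit quantization:
print(round(estimate_vram_gb(70, 8)))  # ~84 GB, consistent with the ~80GB figure
```

The exact number depends on context length and the inference stack, so treat this as an order-of-magnitude check rather than a precise requirement.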