this post was submitted on 14 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

founded 10 months ago

(title)

[–] Susp-icious_-31User@alien.top 1 points 10 months ago

I store all my models on slow drives because no matter where you load one, RAM or VRAM, it gets fully read into memory and the original file is never touched again. And sequential reads of huge files aren't terrible, even on a spinning disk. Even if you overfill your RAM and swap to disk, the OS uses your designated pagefile/swap drive, not the drive holding your LLM files.
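A minimal sketch of the point about the original file being forgotten: once a file's bytes are read into memory, the source drive (and even the file itself) no longer matters. The 16 MB dummy file here is a stand-in for a model file, not a real GGUF; note that loaders which memory-map the file (e.g. llama.cpp's default mmap mode) behave differently and do keep the on-disk copy relevant.

```python
import os
import tempfile

# Create a stand-in "model file" (pretend this lives on a slow drive).
with tempfile.NamedTemporaryFile(delete=False, suffix=".gguf") as f:
    f.write(b"\x00" * (16 * 1024 * 1024))  # 16 MB of dummy weights
    path = f.name

# Loading = one sequential read; fine even on a spinning disk.
with open(path, "rb") as f:
    weights = f.read()

os.remove(path)  # the original file is "forgotten about"

# The in-memory copy is all that's used from here on.
print(len(weights))
```

Deleting the file after the read is just to make the point vivid: inference only ever touches the in-memory (or in-VRAM) copy.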