Looks like I'm backing up the models I like to use. Call me paranoid, but this is what happens when you put all your eggs in one basket.
That isn't paranoid at all. I back up everything to two 18TB HDDs every time I make a new SD LoRA or train an SVC/RVC voice clone.
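For what it's worth, here's a minimal sketch of the kind of mirrored backup I mean. The paths are hypothetical, and it assumes rsync is installed:

```python
import subprocess

# Hypothetical paths: a local models directory mirrored to two backup drives.
SOURCE = "/home/me/models/"  # trailing slash: copy contents, not the dir itself
BACKUP_DRIVES = ["/mnt/backup1/models/", "/mnt/backup2/models/"]

for dest in BACKUP_DRIVES:
    # rsync -a preserves permissions/timestamps; --delete keeps the mirror exact.
    subprocess.run(["rsync", "-a", "--delete", SOURCE, dest], check=True)
    print(f"Mirrored {SOURCE} -> {dest}")
```

Two independent mirrors means one drive dying doesn't take the backup with it.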
As an aside, these huge HDD sizes are cool and all, but that's so much data to trust to just one or two drives; it's wild.
Generally, larger disks have failure rates similar to smaller ones, which means that if you're running a storage array, increasing the number of drives actually increases your chances of experiencing a failure. That holds even in the best case where drive failures are statistically independent events, which in practice they are not.
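To put a rough number on that: if each drive independently fails in a given year with probability p, the chance that at least one of n drives fails is 1 - (1-p)^n. A back-of-envelope sketch (the ~1.5% annual failure rate is an assumed figure, not a stat for any specific drive):

```python
# Chance of at least one drive failing in a year, assuming independent
# failures at an assumed annual failure rate p. Real arrays are worse,
# since failures correlate by age, batch, and workload.
def p_any_failure(n_drives: int, p: float = 0.015) -> float:
    return 1 - (1 - p) ** n_drives

for n in (1, 2, 4, 8, 12):
    print(f"{n:>2} drives: {p_any_failure(n):.1%}")
# 1 drive ~1.5%, 12 drives ~16.6%: more spindles, more chances to fail.
```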
Larger drives also make keeping full backups much easier: the entire backup fits on a single disk, which simplifies both making and storing it.
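And it's cheap to sanity-check that the backup really does fit on one disk before starting (a minimal sketch; the paths are hypothetical):

```python
import shutil
from pathlib import Path

SOURCE = Path("/home/me/models")   # what we'd back up
DEST = Path("/mnt/backup1")        # the single target disk

# Total size of everything under SOURCE, in bytes.
needed = sum(f.stat().st_size for f in SOURCE.rglob("*") if f.is_file())
free = shutil.disk_usage(DEST).free

print(f"Need {needed / 1e12:.2f} TB, have {free / 1e12:.2f} TB free")
assert needed <= free, "Backup won't fit on a single disk"
```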