this post was submitted on 23 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

[–] ex0thrmic@alien.top 1 points 2 years ago (1 children)

Looks like I'm backing up the models I like to use. Call me paranoid, but this is what happens when you put all your eggs in one basket.

[–] Prince_Noodletocks@alien.top 1 points 2 years ago (2 children)

That isn't paranoid at all. I back up everything to two 18TB HDDs every time I make a new SD LoRA or train an SVC/RVC voice clone.
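
If you want to script that habit, a minimal Python sketch of mirroring a working directory to two backup drives might look like the below (the paths are hypothetical, and in practice a tool like rsync is the better choice):

    import shutil
    from pathlib import Path

    SOURCE = Path("~/models").expanduser()  # hypothetical working directory
    # hypothetical mount points for the two backup drives
    BACKUPS = [Path("/mnt/backup_a"), Path("/mnt/backup_b")]

    def mirror(src: Path, dest_root: Path) -> None:
        """Copy any file that is missing from the backup or differs in size."""
        for file in src.rglob("*"):
            if not file.is_file():
                continue
            dest = dest_root / file.relative_to(src)
            if not dest.exists() or dest.stat().st_size != file.stat().st_size:
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(file, dest)  # copy2 preserves timestamps

    for root in BACKUPS:
        mirror(SOURCE, root)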

[–] Rememberrmyname@alien.top 1 points 2 years ago

I appreciate people like you

[–] parasocks@alien.top 1 points 2 years ago (2 children)

As an aside, these huge HDD sizes are cool and all, but it's soooooo much data to trust to one or two drives; it's wild.

[–] alraban@alien.top 1 points 2 years ago

Generally, larger disks have failure rates similar to smaller ones, which means that if you're running a storage array, increasing the number of drives actually increases your chance of experiencing a failure, even in the best case where drive failures are statistically independent events (which they are not).
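
Back-of-the-envelope: if each drive independently fails with, say, a ~1.5% annual rate (an illustrative figure, not a measured one), the chance of at least one failure grows quickly with drive count, and real-world correlated failures only make it worse:

    def array_failure_probability(n: int, p: float = 0.015) -> float:
        """Chance of at least one failure among n drives in a year,
        assuming independent failures at an illustrative ~1.5% annual rate."""
        return 1 - (1 - p) ** n

    for n in (1, 2, 4, 8, 12):
        print(f"{n:2d} drives: {array_failure_probability(n):.1%} "
              "chance of at least one failure per year")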

Larger drives also make keeping full backups much easier (i.e., the entire backup fits on a single disk, which makes it simpler to create and store).

[–] teleprint-me@alien.top 1 points 2 years ago

I'm not alone. 🫠