this post was submitted on 23 Nov 2023

LocalLLaMA

Community to discuss about Llama, the family of large language models created by Meta AI.

[–] Prince_Noodletocks@alien.top 1 points 10 months ago (2 children)

That isn't paranoid at all. I back up everything to two 18TB HDDs every time I make a new SD LoRA or train an SVC/RVC voice clone.
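If anyone wants to script something similar, here's a rough Python sketch. The paths and folder layout are just placeholders for illustration, not a real setup:

```python
import shutil
from pathlib import Path

# Hypothetical locations; adjust to your own layout.
SOURCE = Path("/data/training-output")                  # where new LoRAs / voice models land
BACKUPS = [Path("/mnt/backup1"), Path("/mnt/backup2")]  # the two backup HDDs

def mirror(source: Path, destination: Path) -> None:
    """Copy any file missing from destination, skipping files that already exist."""
    for src_file in source.rglob("*"):
        if src_file.is_file():
            dst_file = destination / src_file.relative_to(source)
            if not dst_file.exists():
                dst_file.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src_file, dst_file)

for backup in BACKUPS:
    mirror(SOURCE, backup)
```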

[–] Rememberrmyname@alien.top 1 points 10 months ago

I appreciate people like you

[–] parasocks@alien.top 1 points 10 months ago (2 children)

As an aside, these huge HDD sizes are cool and all, but that's so much data to trust to just one or two drives; it's wild

[–] alraban@alien.top 1 points 10 months ago

Generally, larger disks have failure rates similar to smaller ones, which means that, if you're running a storage array, increasing the number of drives actually increases your chance of experiencing a failure, even under the best-case assumption that drive failures are statistically independent events, which they are not.
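To put rough numbers on it, here's a quick sketch assuming independent failures and an illustrative 1.5% annualized failure rate per drive (real-world rates vary by model):

```python
# Probability of at least one drive failing in a year, assuming independent
# failures and an illustrative 1.5% annualized failure rate per drive.
p = 0.015

for n in (2, 4, 8, 16):
    at_least_one = 1 - (1 - p) ** n
    print(f"{n:>2} drives: {at_least_one:.1%} chance of at least one failure")
```

With those assumptions, two drives give roughly a 3% chance of at least one failure per year, while sixteen drives push it past 20%, which is why fewer, larger drives can be the safer array.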

Larger drives also make keeping full backups much easier (i.e. entire backup fits on a single disk, which makes it easier to make and store).

[–] teleprint-me@alien.top 1 points 10 months ago

I'm not alone. 🫠