this post was submitted on 23 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

[–] ex0thrmic@alien.top 1 points 11 months ago (5 children)

Looks like I'm backing up the models I like to use. Call me paranoid, but this is what happens when you put all your eggs in one basket.
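
For anyone doing the same, here's a minimal sketch of one way to keep full local copies of the models you use, assuming the huggingface_hub package and a hypothetical repo id and target directory:

```python
# Sketch: pull a complete local copy of a model repo so it survives upstream removal.
# The repo id and destination path below are placeholders, not a recommendation.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="TheBloke/some-model-GGUF",              # hypothetical repo id
    local_dir="/mnt/backup/models/some-model-GGUF",  # hypothetical destination
    local_dir_use_symlinks=False,                    # write real files, not cache symlinks
)
```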

[–] Prince_Noodletocks@alien.top 1 points 11 months ago (4 children)

That isn't paranoid at all. I back up everything to two 18TB HDDs every time I make a new SD LoRA or train an SVC/RVC voice clone.
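
Not their exact setup, but as a sketch of that kind of two-drive mirroring, here's a small Python script that copies a models/LoRA directory to two backup drives; all paths are hypothetical, and something like rsync would handle incremental copies better:

```python
# Sketch: mirror one source tree to two backup drives.
# Paths are hypothetical; shutil.copytree(dirs_exist_ok=True) needs Python 3.8+.
import shutil
from pathlib import Path

SOURCE = Path("/data/models")  # LoRAs, voice models, checkpoints, etc.
BACKUPS = [Path("/mnt/hdd_a/models"), Path("/mnt/hdd_b/models")]

for dest in BACKUPS:
    dest.mkdir(parents=True, exist_ok=True)
    # Copies the whole tree each run (no incremental skip of unchanged files).
    shutil.copytree(SOURCE, dest, dirs_exist_ok=True)
    print(f"mirrored {SOURCE} -> {dest}")
```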

[–] parasocks@alien.top 1 points 11 months ago (2 children)

As an aside, these huge HDD sizes are cool and all, but that is soooooo much data to trust to just one or two drives. It's wild.

[–] teleprint-me@alien.top 1 points 11 months ago

I'm not alone. 🫠
