this post was submitted on 25 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Is it possible to run large models on a 3070 Ti with 32 GB of RAM? If so, what's the best way to run them without quality loss?

Speed isn't an issue; I just want to be able to keep such models running in the background.
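One common approach for this hardware class is partial GPU offload (e.g. with llama.cpp's `n_gpu_layers`): as many transformer layers as fit in the 3070 Ti's 8 GB of VRAM go to the GPU, and the rest run on the CPU from system RAM. As a rough sketch of the arithmetic involved, here is a small hypothetical helper; the model size, layer count, and VRAM-overhead figures are illustrative assumptions, not measurements:

```python
def estimate_gpu_layers(vram_gb: float, model_size_gb: float,
                        n_layers: int, overhead_gb: float = 1.5) -> int:
    """Estimate how many model layers fit in VRAM.

    Assumes layers are roughly equal in size and reserves
    `overhead_gb` for the KV cache, CUDA context, etc. (a guess).
    """
    per_layer_gb = model_size_gb / n_layers
    usable_gb = max(vram_gb - overhead_gb, 0.0)
    return min(n_layers, int(usable_gb // per_layer_gb))


# Illustrative numbers: a ~40 GB quantized model with 80 layers
# on a 3070 Ti (8 GB VRAM) -> offload roughly 13 layers to the GPU.
print(estimate_gpu_layers(vram_gb=8, model_size_gb=40, n_layers=80))
```

The remaining layers would then sit in the 32 GB of system RAM, which is why speed suffers but the model still runs.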

[–] vikarti_anatra@alien.top 1 points 11 months ago

!remindme 7 days