this post was submitted on 24 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
At full precision, as in FP16, you are not going to be able to fit it in a 4090. So if that's your goal, between the choices you are giving, there is only one choice. But it won't be speedy.
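For a rough sense of why FP16 doesn't fit: FP16 uses 2 bytes per parameter, and a 4090 has 24 GB of VRAM. Here's a minimal back-of-the-envelope sketch; the parameter counts below are just illustrative examples, not the specific models being compared in the thread, and it only counts the weights (no KV cache or activations):

```python
# Rough VRAM estimate for loading a model's weights at a given precision.
# A 4090 has 24 GB of VRAM; FP16 is 2 bytes per parameter, ~4-bit quantization
# is roughly 0.5 bytes per parameter.

def weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate GB needed just for the weights."""
    return params_billion * bytes_per_param

GPU_VRAM_GB = 24  # RTX 4090

# Example sizes only -- substitute the model you're actually considering.
for name, params_b in [("7B", 7), ("13B", 13), ("34B", 34)]:
    fp16 = weight_vram_gb(params_b, 2.0)
    q4 = weight_vram_gb(params_b, 0.5)
    print(f"{name}: FP16 ~{fp16:.0f} GB (fits: {fp16 < GPU_VRAM_GB}), "
          f"4-bit ~{q4:.1f} GB (fits: {q4 < GPU_VRAM_GB})")
```

Anything much past ~11B parameters blows past 24 GB at FP16 once you add the KV cache and activations on top of the weights, which is why quantized formats are the usual answer on a single 4090.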