this post was submitted on 29 Nov 2023

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.

I've got some server hardware. One machine I use for games: 18 cores / 36 threads @ 3.2 GHz with 128 GB RAM (GTX 970, so GPU processing is a no-go, I assume). The other is similar but will have 256 GB. What's best for these?

I'm only starting out and don't understand the terms and measurements yet, but I'm in the process of preparing the software to try. I'd like to focus on the best options available to me.
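To get a rough feel for what those RAM sizes can hold, here's a back-of-envelope sketch. The bits-per-weight figures are assumptions based on common GGUF quantization levels, not exact numbers, and real files need extra headroom for context and the OS:

```python
# Back-of-envelope RAM estimate for running a quantized model on CPU.
# Assumes GGUF-style quantization; actual files vary a bit, and you also
# need headroom for the KV cache, context, and the OS itself.

def model_ram_gb(params_billions: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Approximate resident size in GB for a quantized model."""
    bytes_per_weight = bits_per_weight / 8
    # billions of params * bytes per param ~= gigabytes
    return params_billions * bytes_per_weight * overhead

for name, params in [("7B", 7), ("13B", 13), ("70B", 70), ("120B", 120)]:
    q4 = model_ram_gb(params, 4.5)  # roughly a 4-bit quant
    q8 = model_ram_gb(params, 8.5)  # roughly an 8-bit quant
    print(f"{name}: ~{q4:.0f} GB at 4-bit, ~{q8:.0f} GB at 8-bit")
```

By that estimate, 128 GB is enough for a 70B model at 8-bit or a 120B at 4-bit, and 256 GB leaves room for a 120B even at 8-bit.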

Thanks

[–] uti24@alien.top 1 points 9 months ago (1 children)

On CPU it's not that the biggest models are automatically the best choice, so there is no single "best model for CPU" - only the best model whose answers you're ready and willing to wait for.

Goliath-120B, for example, is great. I run it on an i5-12400 at about 0.4 tokens/second, and now I don't want anything less.
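To put that 0.4 tokens/second in context: CPU generation is usually limited by memory bandwidth, because roughly the whole model has to be streamed from RAM for every token. Here's a quick sketch - the bandwidth figures are my ballpark assumptions, not measurements:

```python
# Rough ceiling on CPU generation speed for a dense model:
# each generated token reads (roughly) every weight from RAM once,
# so tokens/s <= memory bandwidth / model size in bytes.

def max_tokens_per_sec(model_size_gb: float, mem_bandwidth_gb_s: float) -> float:
    """Theoretical upper bound on tokens/second for memory-bound inference."""
    return mem_bandwidth_gb_s / model_size_gb

goliath_q4_gb = 70         # ~Goliath-120B at 4-bit (approximate file size)
desktop_ddr4 = 40          # GB/s, dual-channel desktop RAM (ballpark)
server_many_channel = 150  # GB/s, multi-channel server board (ballpark)

print(f"Desktop ceiling: ~{max_tokens_per_sec(goliath_q4_gb, desktop_ddr4):.1f} tok/s")
print(f"Server ceiling:  ~{max_tokens_per_sec(goliath_q4_gb, server_many_channel):.1f} tok/s")
```

That ~0.6 tok/s desktop ceiling lines up with the 0.4 tok/s I see in practice; a server board with more memory channels should do somewhat better on the same model.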

[–] andromedians@alien.top 1 points 9 months ago

All right - so the wait is the price of quality, then. Will have a go - thanks!