this post was submitted on 28 Nov 2023

Machine Learning


I have a good PC, but the problem is my RAM. The model is an LLM, so the memory requirement is understandable, but I can't afford that much RAM right now, so I'm looking for a workaround. Any help?

[–] MachineZer0@alien.top 1 points 9 months ago

You can pick up an old Xeon-based server preconfigured with 512 GB to 1 TB of RAM for $350-1000. The RAM will be slower, in the DDR3-1066 to DDR4-2400 range. AVX should be there by default; AVX2/AVX-512 is even better. AVX2 is on the E5-2600 v3 series. The setup won't rival an eight-way SXM4 A100 box, but you can load some big models and get slow responses.
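Since the advice above hinges on which AVX level the CPU supports, here is a minimal sketch (mine, not from the thread) of how you might check the advertised instruction-set flags on a Linux box before buying into a particular inference setup. The `avx512f` flag name and the `/proc/cpuinfo` parsing are assumptions based on how Linux typically reports these features; CPU-only inference engines such as llama.cpp (my example, not mentioned in the thread) pick their fastest code path from these flags.

```python
# Sketch: report which AVX instruction sets the CPU advertises on Linux.
# Assumption: flags appear on a "flags" line in /proc/cpuinfo, as is
# standard on x86 Linux; returns an empty dict elsewhere.
def avx_support(cpuinfo_path="/proc/cpuinfo"):
    try:
        with open(cpuinfo_path) as f:
            text = f.read()
    except OSError:
        return {}  # not Linux, or /proc unavailable
    flags = set()
    for line in text.splitlines():
        if line.startswith("flags"):
            # "flags : fpu vme ... avx avx2 ..." -> set of flag names
            flags.update(line.split(":", 1)[1].split())
            break
    # avx512f is the foundational AVX-512 flag on Linux
    return {name: name in flags for name in ("avx", "avx2", "avx512f")}

if __name__ == "__main__":
    print(avx_support())
```

On an E5-2600 v3 system you would expect `avx` and `avx2` to be true and `avx512f` false, which matches the comment's point that these servers are usable but not top-tier for CPU inference.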