this post was submitted on 28 Nov 2023

Machine Learning
I have a good PC, but the problem is my RAM. The model is an LLM, so the memory requirement is understandable, but I can't afford that much RAM right now, so I'm looking for a workaround. Any help?

top 11 comments
[–] PaulCalhoun@alien.top 1 points 9 months ago

Could put a swap file on LTFS. Or just load more RAM via your local MicroCenter's IPoAC.
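
Joke aside (LTFS is a tape filesystem, and IPoAC is the avian-carrier RFC), the non-joke version of this is a plain swap file on a fast SSD, so the OS can page the model out when RAM runs short. A minimal sketch, assuming Linux and root; the path and size are placeholders, not from the thread:

    # Sketch: create and enable a 64 GB swap file on Linux (needs root).
    # Path and size are placeholders; swap-backed inference will be slow.
    import subprocess

    SWAP_PATH = "/swapfile"
    SWAP_SIZE = "64G"

    # Reserve the file, lock down permissions, format as swap, enable it.
    subprocess.run(["fallocate", "-l", SWAP_SIZE, SWAP_PATH], check=True)
    subprocess.run(["chmod", "600", SWAP_PATH], check=True)
    subprocess.run(["mkswap", SWAP_PATH], check=True)
    subprocess.run(["swapon", SWAP_PATH], check=True)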

[–] KingsmanVince@alien.top 1 points 9 months ago (1 children)
[–] theonewhoask11@alien.top 1 points 9 months ago (2 children)

That is exactly the reason. I'm literally looking at my options, and from the comments it definitely isn't looking good 😭

Let's say I want to work with Noon (https://huggingface.co/Naseej/noon-7b). How much RAM would I actually need?

[–] KingsmanVince@alien.top 1 points 9 months ago (1 children)

Paste your model name into this HF Space: https://huggingface.co/spaces/Vokturz/can-it-run-llm

https://imgur.com/a/Ednemii (the result)

It seems you need less than 32 GiB of VRAM.
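
For a rough sanity check without the Space: weight memory is roughly parameter count times bytes per parameter, plus overhead for activations and the KV cache. A minimal sketch; the 7B count comes from the model name, and the 1.2x overhead factor is an assumption, not a figure from the Space:

    # Back-of-envelope memory estimate for a 7B-parameter model.
    # 7e9 comes from "noon-7b"; the 1.2x overhead factor (activations,
    # KV cache) is a loose assumption, not an exact figure.
    PARAMS = 7e9
    BYTES_PER_PARAM = {"float32": 4, "float16": 2, "int8": 1, "int4": 0.5}

    for dtype, nbytes in BYTES_PER_PARAM.items():
        gib = PARAMS * nbytes * 1.2 / 2**30
        print(f"{dtype}: ~{gib:.1f} GiB")

At float32 that lands just under 32 GiB, which matches the Space's result; float16 or a quantized version needs far less.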

[–] theonewhoask11@alien.top 1 points 9 months ago

And thanks to you, it now works. I knew exactly what to do and what I needed to get it running. Thanks, man!

[–] The-Protomolecule@alien.top 1 points 9 months ago

Not trying to be rude, but this is like saying you want to enter a car race without having a fast car. The fast car is kind of a prerequisite, unfortunately.

You could look at things that offload your model to disk, but they're going to be slow as hell.
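
For example, Hugging Face Accelerate can spill layers to disk for you. A minimal sketch, assuming transformers and accelerate are installed; the offload folder name is arbitrary:

    # Sketch: let Accelerate place layers on GPU, then CPU RAM, then disk.
    # Any layers that land on disk will make generation very slow.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Naseej/noon-7b"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        device_map="auto",         # fill GPU first, then CPU, then disk
        offload_folder="offload",  # arbitrary folder for spilled weights
        torch_dtype=torch.float16, # halves memory vs. float32
    )

    inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
    print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))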

[–] DeliciousFriedPanda@alien.top 1 points 9 months ago (1 children)
[–] theonewhoask11@alien.top 1 points 9 months ago

Let's go, I'm saved!

[–] MachineZer0@alien.top 1 points 9 months ago

You can pick up an old Xeon-based server preconfigured with 512 GB to 1 TB of RAM for $350-1000. The RAM will be slower (1066-2400 MT/s). AVX should be there by default; AVX2/AVX-512 is even better. AVX2 starts with the E5-2600 v3 series. The setup won't rival an eight-way SXM4 A100 box, but you can load some big models, with slow responses.
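
If you go this route, llama.cpp-style CPU inference is the usual fit. A minimal sketch with llama-cpp-python; the GGUF file path is a placeholder and assumes a quantized conversion of your model exists:

    # Sketch: CPU-only inference with llama-cpp-python on a many-core Xeon.
    # The model path is a placeholder; set n_threads to your core count.
    from llama_cpp import Llama

    llm = Llama(model_path="./noon-7b.Q4_K_M.gguf", n_threads=16)
    out = llm("Write a short greeting.", max_tokens=64)
    print(out["choices"][0]["text"])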

[–] Smallpaul@alien.top 1 points 9 months ago

/r/learnmachinelearning

[–] Alittlebitanalytical@alien.top 1 points 9 months ago

Sell your old RAM and use the money to upgrade. If you have an extra slot, search for donated RAM and drop it in (it needs to be the same make/model/density). Or use a flash drive as ReadyBoost, a server to do the heavy lifting, VRAM, etc.