this post was submitted on 28 Nov 2023

Machine Learning


I have a good PC, but the problem is with my RAM. The model is LLM, which makes the memory requirement understandable, but I can't afford to get that much RAM right now so I'm looking for a roundabout. Any help?

[–] KingsmanVince@alien.top 1 points 11 months ago (4 children)
[–] theonewhoask11@alien.top 1 points 11 months ago (3 children)

That is exactly the reason. I'm literally looking at my options, and from the comments, it definitely isn't looking good 😭

Let's say I want to work on Noon (https://huggingface.co/Naseej/noon-7b) — how much memory would I actually need?

[–] KingsmanVince@alien.top 1 points 11 months ago (1 children)

Paste the model name into this HF space: https://huggingface.co/spaces/Vokturz/can-it-run-llm

https://imgur.com/a/Ednemii (the result)

It seems you need less than 32 GiB of VRAM.
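For a rough sanity check alongside that HF space, you can estimate the memory needed just for a model's weights from its parameter count and precision (a back-of-the-envelope sketch only — real usage adds activations, KV cache, and framework overhead on top):

```python
def weight_memory_gib(n_params: float, bytes_per_param: float) -> float:
    """Approximate memory for model weights alone, in GiB."""
    return n_params * bytes_per_param / 2**30

# noon-7b has roughly 7 billion parameters
n = 7e9
for name, bpp in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name:>9}: ~{weight_memory_gib(n, bpp):.1f} GiB")
```

So at fp16 a 7B model wants roughly 13 GiB for weights, which is why quantized (int8/int4) variants are the usual workaround when RAM or VRAM is tight.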

[–] theonewhoask11@alien.top 1 points 11 months ago

And thanks to you, it now works. I knew exactly what to do and what I needed to get it to work. Thanks, man!
