
LocalLLaMA


A community for discussing Llama, the family of large language models created by Meta AI.


Looking for any model that can run in 20 GB of VRAM. Thanks!

[–] YuriWerewolf@alien.top 1 points 10 months ago (1 children)

How did you configure the memory split (layers) between GPUs? I have two GPUs, a 3060 Ti and a 3060, and it seems to load everything onto the first one and runs out of memory.
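
For reference, if the loader is llama-cpp-python, the per-GPU split can be set with the `tensor_split` parameter, which assigns a proportion of the layers to each CUDA device in order. A minimal sketch, assuming that backend; the model path is hypothetical, and the 8:12 ratio simply mirrors the cards' VRAM (8 GB on the 3060 Ti, 12 GB on the 3060):

```python
from llama_cpp import Llama

llm = Llama(
    model_path="models/model.gguf",  # hypothetical path to a GGUF model
    n_gpu_layers=-1,                 # offload all layers to GPU
    # Relative proportions per CUDA device, in device order.
    # Device 0 (3060 Ti, 8 GB) gets 8 parts, device 1 (3060, 12 GB) gets 12,
    # i.e. a 40/60 split, so device 0 is no longer asked to hold everything.
    tensor_split=[8, 12],
)

out = llm("Q: What is 2+2? A:", max_tokens=8)
print(out["choices"][0]["text"])
```

The values are normalized internally, so `[8, 12]` and `[0.4, 0.6]` behave the same; the idea is just to weight each device by how much free VRAM it actually has.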