this post was submitted on 22 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


From what I’ve read, Macs somehow use system RAM while Windows machines use the GPU’s VRAM? It doesn’t make any sense to me. Any help appreciated.

ModeradorDoFariaLima@alien.top · 10 months ago

IMO, running any model on anything other than VRAM makes it so slow it’s unusable. So stick to models that fit in your VRAM; quantized 13B models are a good place to start playing around.
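As a rough sanity check (my own back-of-the-envelope math, not from the thread), you can estimate whether a quantized model fits in a given amount of VRAM from its parameter count and bits per weight; the 20% overhead factor below is an assumption covering the KV cache and runtime buffers, not a measured figure:

```python
def model_size_gb(n_params_billion: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    """Rough memory estimate for a quantized model.

    overhead (assumed ~20%) approximates KV cache, activations,
    and runtime buffers on top of the raw weights.
    """
    bytes_per_weight = bits_per_weight / 8
    return n_params_billion * 1e9 * bytes_per_weight * overhead / 1e9

# A 13B model at 4-bit quantization lands around ~7.8 GB,
# which is why it fits comfortably on a 12 GB consumer GPU.
print(f"{model_size_gb(13, 4):.1f} GB")
```

The same arithmetic explains the Mac side of the question: on Apple Silicon the GPU draws from the same unified memory pool as the CPU, so the budget is total system RAM rather than a discrete card’s VRAM.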