This post was submitted on 22 Nov 2023

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.

From what I’ve read, a Mac somehow runs the model from system RAM, while on Windows it runs on the GPU? That doesn’t make any sense to me. Any help appreciated.

[–] Fluboxer@alien.top 1 points 10 months ago

7B and 13B fit fully in VRAM. 7B models have 35 layers and 13B have 43, IIRC.

70B involves system RAM as well, since not all of its layers fit in VRAM and the rest have to be offloaded.
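
For reference, here is a minimal sketch of how that layer split is usually set with llama-cpp-python. The model path, layer count, and context size below are assumptions; adjust them to your own GGUF file and available VRAM.

```python
# Minimal sketch, assuming llama-cpp-python is installed with GPU support.
from llama_cpp import Llama

# 13B reportedly has 43 layers, so offloading all 43 keeps the model fully in VRAM.
# Lower n_gpu_layers if you run out of VRAM; the remaining layers stay in system RAM.
llm = Llama(
    model_path="./models/llama-2-13b.Q4_K_M.gguf",  # hypothetical path
    n_gpu_layers=43,  # -1 would offload every layer the backend can handle
    n_ctx=4096,
)

out = llm("Q: Why does a Mac use system RAM for LLMs?\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```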