this post was submitted on 19 Nov 2023

LocalLLaMA

Community to discuss about Llama, the family of large language models created by Meta AI.

Basically, that's my question. The caveat is that I would like to avoid a Mac Mini, and I wonder if some of Minisforum's mini PCs can handle LLMs.

[–] CasimirsBlake@alien.top 1 points 10 months ago

The issue is that a lot of them have either Intel CPUs with onboard graphics, or AMD CPUs... with onboard graphics. Mini PCs with Nvidia GPUs are uncommon.

Zotac did make some small PCs with Nvidia GPUs, I think, but I doubt any of them have much VRAM.

If you get a mini PC with Thunderbolt and connect it to an eGPU, that could be a setup that would work...
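As a rough back-of-envelope check for whether a given GPU's VRAM is enough, you can estimate memory for the model weights from parameter count and quantization bit width. This is a minimal sketch, not a measured figure; the fixed overhead for KV cache and activations is an assumption and varies with context length and runtime:

```python
def estimate_vram_gb(n_params_billion: float, bits_per_weight: int,
                     overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate in GB: quantized weights plus a fixed overhead.

    The 1.5 GB overhead for KV cache/activations is an assumption,
    not a benchmark; real usage depends on context length and backend.
    """
    # n_params_billion * 1e9 params * (bits / 8) bytes ~= GB when divided by 1e9
    weight_gb = n_params_billion * bits_per_weight / 8
    return weight_gb + overhead_gb

# A 7B model at 4-bit quantization needs ~3.5 GB for weights alone:
print(estimate_vram_gb(7, 4))   # 5.0 GB with the assumed overhead
print(estimate_vram_gb(13, 4))  # 8.0 GB, beyond most iGPU-class setups
```

By this estimate, even a modest 8 GB eGPU could hold a 4-bit 7B model, which is why the Thunderbolt-plus-eGPU route is attractive compared to integrated graphics.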