this post was submitted on 28 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

founded 1 year ago

I know there are a bunch of threads here for picking up your Macbook Pro, but I can't find exactly what I'm looking for.

I'm (finally) replacing my laptop, a 2019 MBP with 16GB RAM, and I want to run a lot of AI models on the new one without destroying my wallet. Ideally I'd stay under 3k (after a 10% discount) on an MBP.

I'm hesitating between an M3 Max with 32 GB (https://www.apple.com/shop/buy-mac/macbook-pro/14-inch-space-black-apple-m3-pro-with-12-core-cpu-and-18-core-gpu-18gb-memory-1tb) and a slightly older M2 Max that's decently cheaper and potentially has more RAM.

What do you think is better here? I'm inclined towards the M3 just because I want a very recent model so I can keep it for at least 3 years.

I found https://github.com/ggerganov/llama.cpp/discussions/4167#user-content-fn-3-99cf8a0cea6a624f72b056d3ccbe1b51, which shows that an M3 Max and an M2 Max with similar specs (GPU cores) perform similarly, which suggests I should just take the M2 Max. But what would be the cons of doing that?

top 2 comments
[–] watkykjynaaier@alien.top 1 points 11 months ago (1 children)

I'm on an M1 Max with 32GB; with GGUF in LM Studio you can run the 34B Yi finetunes well, but that's as high as you can go for now. The 3-bit 70B quants will technically run, but not in any useful way. As others have noted, RAM is the make-or-break factor here. Get as much as you can; the processor generation is much less important.
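The RAM limits above can be sanity-checked with back-of-the-envelope arithmetic: a GGUF model's weight size is roughly parameter count × bits-per-weight / 8, and only part of unified memory is available to the GPU. This is a rough sketch, not llama.cpp's actual accounting; the 75% Metal memory budget and 2GB overhead for KV cache and runtime are assumptions for illustration:

```python
def model_size_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate GGUF weight size in GB: params (billions) * bpw / 8 bytes."""
    return params_b * bits_per_weight / 8

def fits(ram_gb: float, params_b: float, bpw: float, overhead_gb: float = 2.0) -> bool:
    """Rough check whether a model fits in the GPU-visible share of unified memory.

    Assumes ~75% of unified RAM is usable by Metal (an approximation of the
    macOS default limit) and a flat overhead for KV cache and runtime buffers.
    """
    budget = ram_gb * 0.75
    return model_size_gb(params_b, bpw) + overhead_gb <= budget

# 34B Yi at ~4.5 bpw (Q4_K_M-ish) on a 32GB machine: fits
print(fits(32, 34, 4.5))   # True
# 70B at ~3 bpw on 32GB: over budget, which matches the comment above
print(fits(32, 70, 3.0))   # False
# 70B at ~4.5 bpw on 64GB: fits, which is why 64GB is the safer buy
print(fits(64, 70, 4.5))   # True
```

The numbers line up with the experience reported here: a 32GB machine tops out around 34B, while 64GB comfortably runs 4-bit 70B quants.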

[–] palpapeen@alien.top 1 points 11 months ago

Thanks! I may go for an M1 Max with 64GB then. It seems very promising.