palpapeen

joined 1 year ago
[–] palpapeen@alien.top 1 points 11 months ago

Thanks! I may go for an M1 Max with 64 GB then. It seems very promising.

 

I know there are a bunch of threads here for picking up your Macbook Pro, but I can't find exactly what I'm looking for.

I'm (finally) replacing my laptop, a 2019 MBP with 16 GB of RAM, and I want to run a lot of AI models on the new one without destroying my wallet. Ideally I'd like to stay under $3k on an MBP (after a 10% discount).

I'm hesitating between an M3 Max with 32 GB (https://www.apple.com/shop/buy-mac/macbook-pro/14-inch-space-black-apple-m3-pro-with-12-core-cpu-and-18-core-gpu-18gb-memory-1tb) and a slightly older M2 Max, which is noticeably cheaper and could come with more RAM.

What do you think is better here? I'm inclined towards the M3 just because I want a very recent model so that I can keep it for at least three years.

I found https://github.com/ggerganov/llama.cpp/discussions/4167#user-content-fn-3-99cf8a0cea6a624f72b056d3ccbe1b51, which shows that an M3 Max and an M2 Max with similar specs (GPU cores) perform similarly, which suggests I should just take the M2 Max. But what would be the downsides of doing that?
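Not from the thread, but for the RAM question a rough rule of thumb is that weight memory scales as parameter count times bytes per parameter; the quantization byte counts below are my own assumptions (fp16 = 2 bytes, 8-bit ≈ 1, 4-bit ≈ 0.5), and real usage adds overhead for the KV cache and activations:

```python
# Back-of-envelope estimate of RAM needed just to hold model weights.
# Assumed bytes per parameter by format; actual llama.cpp quant formats
# differ slightly, and runtime overhead (KV cache, activations) comes on top.
def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight size in GB for a model with the given parameter count."""
    return params_billion * bytes_per_param  # billions of params * bytes each = GB

for name, bpp in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f" 7B @ {name}: ~{weights_gb(7, bpp):.1f} GB")
    print(f"70B @ {name}: ~{weights_gb(70, bpp):.1f} GB")
```

By this estimate a 4-bit 70B model (~35 GB of weights) fits on a 64 GB machine but not a 32 GB one, which is one concrete argument for more RAM over a newer chip.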

[–] palpapeen@alien.top 1 points 11 months ago

There are already a couple of startups working on similar things; see https://withmartian.com for example (not a reason not to do it, of course). Interested to see what it becomes!

[–] palpapeen@alien.top 1 points 1 year ago

I mean, yeah, but it's not done training AFAIK, and it's not fine-tuned either.

[–] palpapeen@alien.top 1 points 1 year ago (1 children)

Thanks! But I'm not looking for one that does coding, rather one that's good at detecting fallacies and reasoning. Phi-1.5 seems a better fit for that.

 

I've been playing with a lot of models around 7B, but I'm now prototyping something that I think would be fine with a 1B model. Phi-1.5 is the only model of that size I've seen, and I haven't found a way to run it efficiently so far; llama.cpp still hasn't implemented support for it, for instance.

Does anyone have an idea of what to use?