this post was submitted on 04 Dec 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 1 year ago
Hi there, I'm looking to buy an Apple laptop and I saw a MacBook Pro M1 Max with 64GB RAM and a 2TB SSD for 2400 USD. Will this computer be able to run the big models at a reasonable speed?

I was going to buy the basic MacBook Air M1 with 8GB RAM for 700 USD, but I saw this and I've always wanted to play with LLMs but never could.

Any advice is appreciated, thanks.

[–] fallingdowndizzyvr@alien.top 1 points 1 year ago

Yes, that M1 Max should run LLMs really well, including 70B models with decent context. An M2 won't be much better, and an M3, other than the 400GB/s model, won't be as good, since every M3 configuration below the top one had its memory bandwidth cut relative to the equivalent M1/M2 models.
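To see why memory bandwidth is the number that matters here, a common back-of-the-envelope estimate (my own sketch, not from this thread) is that each generated token has to read all model weights from RAM once, so decode speed is roughly bandwidth divided by model size. The figures below are illustrative approximations, not benchmarks:

```python
def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on decode speed, assuming generation is
    memory-bandwidth-bound (each token streams all weights once)."""
    return bandwidth_gb_s / model_size_gb

# M1 Max: ~400 GB/s unified memory bandwidth.
# A 70B model quantized to ~4 bits per weight is roughly 40 GB in RAM.
print(f"70B @ 4-bit: ~{est_tokens_per_sec(400, 40):.0f} tok/s ceiling")

# Same chip with a 7B model at ~4 bits (~4 GB):
print(f"7B @ 4-bit: ~{est_tokens_per_sec(400, 4):.0f} tok/s ceiling")
```

Real throughput lands below these ceilings (compute, cache effects, and context length all take a cut), but the ratio explains why the 400GB/s parts run big models so much better than the bandwidth-reduced configurations.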

Are you seeing that $2400 at B&H? It was $200 cheaper there a couple of weeks ago, so it might be worth waiting to see if the price drops again.