this post was submitted on 31 Oct 2023

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.

In terms of AI use, especially LLMs:

$5,000 USD for the 128GB RAM M3 MacBook Pro is still much cheaper than an A100 80GB.
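For scale, here is a rough cost-per-GB-of-memory sketch; the A100 price is an assumed late-2023 street price (these varied widely by vendor), not a figure from the post:

```python
# Rough $/GB-of-memory comparison. The A100 price below is an assumption
# (late-2023 street prices were roughly $15k and up, varying by vendor);
# the MacBook figure comes from the post above.
macbook_price, macbook_gb = 5000, 128
a100_price, a100_gb = 15000, 80  # assumed street price

print(f"MacBook Pro 128GB: ${macbook_price / macbook_gb:.0f}/GB")  # ~$39/GB
print(f"A100 80GB:         ${a100_price / a100_gb:.0f}/GB")        # ~$188/GB
```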

top 5 comments
[–] FlishFlashman@alien.top 1 points 1 year ago (1 children)

Apple has further segmented the Apple silicon lineup with the M3.

With the M1 and M2 Max, all the GPU variants had the same memory bandwidth (400GB/s for the M2 Max). The top-of-the-line M3 Max (16 CPU / 40 GPU cores) is still capped at 400GB/s, but the lower-spec variant (14 CPU / 30 GPU cores) is now only 300GB/s.

Inference is generally bound by memory bandwidth, so the M3 generation may not be much of an improvement. Apple claims improvements in the cache, but that may not mean much for inference. We'll know more once people have them in hand, which shouldn't take too long.
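For a sense of scale, a back-of-the-envelope sketch of why bandwidth is the ceiling: each generated token has to stream essentially all of the model's weights from memory, so bandwidth divided by model size bounds tokens per second. The model size and the A100 bandwidth are round-number assumptions:

```python
# Back-of-the-envelope ceiling for memory-bandwidth-bound inference:
# each generated token streams (roughly) all model weights from memory,
# so tokens/sec cannot exceed bandwidth / model size. Illustrative only.

def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Theoretical upper bound: bandwidth (GB/s) over GB read per token."""
    return bandwidth_gb_s / model_size_gb

model_gb = 40.0  # assumed: a 70B model at ~4-bit quantization

for chip, bw in [
    ("M3 Max, top spec (400GB/s)", 400.0),
    ("M3 Max, lower spec (300GB/s)", 300.0),
    ("A100 80GB, HBM2e (~2000GB/s)", 2000.0),
]:
    print(f"{chip}: <= {max_tokens_per_sec(bw, model_gb):.1f} tok/s")
```

Real throughput lands below these ceilings, but the ratios track why the 300GB/s vs 400GB/s split matters.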

[–] TheRealDatapunk@alien.top 1 points 11 months ago

But what's the effect of using multiple graphics cards connected via relatively low-bandwidth PCI Express?
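The thread doesn't answer this, but one way to size it: with a layer-wise (pipeline-style) split, only the hidden-state activations cross PCIe per generated token, not the weights. A sketch with assumed figures (hidden size, precision, link speed):

```python
# Hypothetical sizing of PCIe traffic for layer-split inference across
# two GPUs: per generated token, one hidden-state vector crosses the
# link at each split point. All figures below are assumptions.

hidden_dim = 8192      # assumed hidden size of a 70B-class model
bytes_per_value = 2    # fp16 activations
split_points = 1       # one boundary between the two GPUs

per_token_bytes = hidden_dim * bytes_per_value * split_points  # 16 KiB

pcie_bytes_per_s = 32e9  # ~PCIe 4.0 x16 theoretical bandwidth
link_ceiling = pcie_bytes_per_s / per_token_bytes

print(f"{per_token_bytes / 1024:.0f} KiB/token -> "
      f"link allows ~{link_ceiling:,.0f} tok/s before PCIe binds")
```

Under those assumptions the link is nowhere near the bottleneck for token-by-token generation; tensor-parallel splits and prompt processing move far more data per step.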

[–] Distinct-Expression2@alien.top 1 points 1 year ago (1 children)

It’s Apple, so you buy a subscription to something that will break and cannot be fixed.

[–] HipsterCosmologist@alien.top 1 points 1 year ago

Ah yes, I'm sure an A100 is extremely serviceable! Apple's customer support is pretty great, if you've never had the experience.

[–] Danielanish@alien.top 1 points 1 year ago

In their keynote, Apple mentioned that it's up to 128GB of on-chip memory with 92 billion transistors. I wonder how much of that is SRAM and how much is DRAM. Has anyone been able to find any breakdowns?