this post was submitted on 31 Oct 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

[–] Monkey_1505@alien.top 1 points 10 months ago

It won't be long before there are cheaper PCs with wide memory buses, from AMD or Intel with LPDDR5. Probably 150-200 GB/s. AMD is probably the better option (they also have their own AI accelerator now).
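For context on where a figure like 150-200 GB/s comes from: theoretical peak bandwidth is just transfer rate times bus width in bytes. A quick sketch (the LPDDR5X speed grade and bus widths below are illustrative configurations, not announced products):

```python
def mem_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s.

    transfer_rate_mts: memory transfer rate in megatransfers/second
    bus_width_bits: total memory bus width in bits
    """
    bytes_per_transfer = bus_width_bits / 8
    return transfer_rate_mts * 1e6 * bytes_per_transfer / 1e9

# Hypothetical LPDDR5X-8533 on a common 128-bit laptop bus
print(round(mem_bandwidth_gbs(8533, 128), 1))  # 136.5 GB/s

# Same memory on a wider 192-bit bus lands in the 150-200 GB/s range
print(round(mem_bandwidth_gbs(8533, 192), 1))  # 204.8 GB/s
```

A 192-bit LPDDR5X-8533 configuration gives roughly 205 GB/s peak, so a bus somewhere between 128 and 192 bits wide would land in the quoted range.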

For AI, that will make these Pro configurations considerably less compelling. That's not a bad thing; Apple is overpriced.