this post was submitted on 22 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
you are viewing a single comment's thread
One practical consideration: two cheaper GPUs leave you with a much worse upgrade path. If six months or a year from now you decide you want more memory, and you've already used up all your PCIe slots, you basically have to scrap those cards and start over, whereas with a single high-memory card you still have the option of adding a second one later.
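To make the arithmetic concrete, here's a minimal sketch of the slot-vs-memory trade-off. The 2-slot board, 12 GB cheaper cards, and 24 GB high-memory card are illustrative assumptions, not figures from the comment:

```python
# Illustrative upgrade-path comparison; card sizes and the 2-slot limit are assumptions.
PCIE_SLOTS = 2

def max_vram_without_scrapping(installed_gb, addon_gb):
    """Total VRAM reachable by only adding cards to free slots (no replacements)."""
    free_slots = PCIE_SLOTS - len(installed_gb)
    return sum(installed_gb) + free_slots * addon_gb

# Start with two cheaper 12 GB cards: both slots are used, so 24 GB is the ceiling.
print(max_vram_without_scrapping([12, 12], addon_gb=24))  # 24

# Start with one 24 GB card: a second 24 GB card can still be added later.
print(max_vram_without_scrapping([24], addon_gb=24))      # 48
```

Under those assumed numbers, the dual-cheap-card build tops out at 24 GB unless you discard hardware, while the single-card build can later double to 48 GB just by filling the remaining slot.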