this post was submitted on 27 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.

founded 10 months ago

Hi all,

Just curious whether anybody knows what kind of hardware (and compute power) is required to build a Llama server that can serve multiple users at once.

Any discussion is welcome:)

[–] SupplyChainNext@alien.top 1 points 9 months ago

Figure out the size and speed you need, then buy 20-50 Nvidia pro GPUs (A series), plus the server cluster hardware and network infrastructure needed to make them run efficiently.

Think in the several hundred thousand dollar range. I’ve looked into it.
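To put some rough numbers on "figure out the size and speed you need": the dominant VRAM costs when serving are the model weights plus a KV cache slot per concurrent user. Below is a minimal sizing sketch, assuming fp16 weights and the publicly documented Llama-2 70B config values (80 layers, 8 KV heads, head dim 128); the function name and the 50-user scenario are just illustrative.

```python
# Back-of-envelope VRAM estimate for serving a Llama-style model to N
# concurrent users. This is a rough sizing sketch, not a vendor quote:
# it counts weights + KV cache only, ignoring activations and framework
# overhead. Model figures used in the example are Llama-2 70B's public
# config values; vram_gib is a hypothetical helper, not a library API.

def vram_gib(params_b, n_layers, n_kv_heads, head_dim,
             ctx_len, n_users, bytes_per_weight=2, bytes_per_kv=2):
    # Model weights: parameter count (in billions) times bytes per weight.
    weights = params_b * 1e9 * bytes_per_weight
    # KV cache per token: 2 tensors (K and V) * layers * KV heads * head dim.
    kv_per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_kv
    # One full-context cache slot per concurrent user.
    kv_total = kv_per_token * ctx_len * n_users
    return (weights + kv_total) / 2**30

# Llama-2 70B in fp16, 4k context, 50 concurrent users:
print(round(vram_gib(70, 80, 8, 128, 4096, 50), 1))  # ~192.9 GiB
```

At ~193 GiB you are already past two 80 GB A100s before batching overhead, which is why the multi-GPU cluster (and the price tag) in the comment above is in the right ballpark.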