this post was submitted on 17 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
Text Generation WebUI for general chatting, and vLLM for processing large amounts of data with an LLM.
On an RTX 3090, vLLM is 10~20x faster than llama.cpp for 13B AWQ models.
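For the batch-processing use case, here is a minimal sketch of vLLM's offline inference API with an AWQ-quantized 13B model; the checkpoint name, prompts, and memory setting are placeholder assumptions, not taken from the comment above:

```python
from vllm import LLM, SamplingParams

# Example prompts standing in for a larger dataset.
prompts = [
    "Summarize the following ticket: ...",
    "Extract the product names from: ...",
]

sampling_params = SamplingParams(temperature=0.7, max_tokens=256)

# quantization="awq" loads AWQ weights; gpu_memory_utilization can be lowered
# if the 3090's 24 GB is shared with other processes.
# The model repo below is just an example AWQ checkpoint.
llm = LLM(
    model="TheBloke/Llama-2-13B-chat-AWQ",
    quantization="awq",
    gpu_memory_utilization=0.90,
)

# generate() runs all prompts as one batch with continuous batching,
# which is where the throughput advantage over sequential calls comes from.
outputs = llm.generate(prompts, sampling_params)
for out in outputs:
    print(out.prompt, "->", out.outputs[0].text)
```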