this post was submitted on 30 Nov 2023
LocalLLaMA
Community to discuss about Llama, the family of large language models created by Meta AI.
Depends entirely on what model you want. The Llama-2 13B serverless endpoint would only cost $0.001 for that request on RunPod.
If you rent a cloud pod, it costs the same per hour no matter how much or little you send to it, so the economics depend entirely on how many requests you can keep it busy with.
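To make the comparison concrete, here's a minimal break-even sketch. The $0.001/request figure is the one quoted above; the hourly pod rate is a hypothetical placeholder, not a real RunPod price:

```python
# Break-even sketch: serverless per-request pricing vs. renting a pod by the hour.
# SERVERLESS_COST_PER_REQUEST comes from the comment above; POD_COST_PER_HOUR is
# a hypothetical placeholder -- check current RunPod pricing for real numbers.

SERVERLESS_COST_PER_REQUEST = 0.001  # USD per request, Llama-2 13B serverless
POD_COST_PER_HOUR = 0.50             # USD per hour, hypothetical pod rate

# A rented pod costs the same per hour regardless of traffic, so it only beats
# serverless once you exceed this sustained request rate:
break_even = POD_COST_PER_HOUR / SERVERLESS_COST_PER_REQUEST
print(break_even)  # 500.0 requests per hour
```

Below that rate, serverless is cheaper; above it, the flat hourly pod wins.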