this post was submitted on 12 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.


My use case: I want to create a service based on Mistral 7B that will serve an internal office of 8-10 users.

I’ve been looking at modal.com and RunPod. Are there any other recommendations?

Ok-Goal@alien.top · 1 point · 1 year ago

In our internal lab office, we're using https://ollama.ai/ with https://github.com/ollama-webui/ollama-webui to host LLMs locally; the Docker Compose setup provided by the ollama-webui team worked like a charm for us.
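
Once an Ollama server is running (e.g. via that Docker Compose setup), other machines on the office network can hit its REST API directly. A minimal sketch, assuming a Mistral model has already been pulled (`ollama pull mistral`) and the server is reachable at Ollama's default address `http://localhost:11434`:

```python
import json
from urllib import request

# Ollama's default generate endpoint; adjust the host if the
# container is exposed on another machine or port.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt, model="mistral"):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask(prompt, model="mistral"):
    """Send a prompt to a locally running Ollama server and return its reply."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires a running Ollama server with the mistral model pulled.
    print(ask("Why is the sky blue?"))
```

For a small office, pointing the web UI (or a thin wrapper like this) at one shared Ollama container is usually enough; requests are queued on the server side.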