this post was submitted on 30 Nov 2023
LocalLLaMA
Community to discuss Llama, the family of large language models created by Meta AI.
There are a bunch of examples in the repo: various Python scripts for doing inference and such, and even a Colab notebook now.
As for the "usual" Python/HF setup, ExLlama is partly an attempt to get away from Hugging Face: it reads HF models but doesn't rely on the framework. I've been meaning to write more documentation and maybe even a tutorial, but in the meantime there are those examples, the project itself, and a lot of other projects using it. TabbyAPI is coming along as a stand-alone OpenAI-compatible server for use with SillyTavern, or in your own projects where you just want to generate completions from text-based requests. ExUI is a standalone web UI for ExLlamaV2.
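Since TabbyAPI speaks the OpenAI completions protocol, talking to it from your own project is just an HTTP POST. Here's a minimal sketch of what such a request payload looks like; the endpoint path, port, and model name are assumptions for illustration, so check the server's own docs for the exact values:

```python
import json

def build_completion_request(prompt, model="local-model",
                             max_tokens=128, temperature=0.7):
    """Build the JSON payload for an OpenAI-style /v1/completions request.

    The field names follow the OpenAI completions schema; the default
    model name here is a placeholder, not a real TabbyAPI model id.
    """
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

payload = build_completion_request("Once upon a time,")
print(json.dumps(payload))

# To actually send it (needs the `requests` package and a running server;
# host/port below are assumptions):
#
#   import requests
#   resp = requests.post("http://127.0.0.1:5000/v1/completions",
#                        json=payload, timeout=60)
#   print(resp.json()["choices"][0]["text"])
```

Because the request and response shapes match OpenAI's, most existing OpenAI client libraries can be pointed at the local server just by overriding the base URL.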
Got it to work! Thank you!!